Robots.txt is a way to tell search engines whether they are allowed to crawl a page and show it in search results. Search engine bots are automated, and before they access your site they check the robots.txt file to see which pages they are permitted to crawl. Often, people do not want to block search engines from their entire website; instead, they want to specify a few pages that should not be indexed in the search results. Therefore, in this article we will show you how to enable a custom robots.txt file in Blogger.
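Before we begin, here is a minimal sketch of what a custom robots.txt for a Blogger blog typically looks like. The blog address below is a placeholder, and the rules (allowing everything except the /search pages) are only an example of the kind of rules you might set; adjust them to match which pages you want kept out of search results.

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml

In this example, "Disallow: /search" asks crawlers to skip Blogger's label and search result pages, "Allow: /" permits everything else, and the Sitemap line points crawlers to the list of your posts.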