  • What is robots.txt and what is it used for in blogs?


    Want to know the role of robots.txt in blogs, and whether there is any relation between a sitemap and robots.txt? Read more about it here.

    Hi,

    I want to know what a robots.txt file is in blogs, and the use and necessity of having one.

    I recently read an article about robots.txt and I didn't understand exactly what it is or the use of it. Is it SEO friendly?

    Are a sitemap and robots.txt the same thing? Also, please tell me how to create a robots.txt for a new blog.
  • Answers

    4 Answers found.
  • Robots.txt, also known as the Robots Exclusion Protocol, is a set of instructions given by webmasters to crawlers. When the owner of a site does not want a page of his site to be crawled and shown on the web, he can list that section in a text file called robots.txt. Once the rule is added, well-behaved crawlers skip that section and it will not be displayed in search results. It is not an SEO booster in itself; its purpose is to block crawlers from crawling parts of your site.
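
    To make this concrete, here is a minimal sketch of a robots.txt file. It lives at the root of the site (for example, example.com/robots.txt); the directory names here are hypothetical, purely for illustration:

    ```
    # Apply these rules to all crawlers
    User-agent: *
    # Block crawling of private sections (hypothetical paths)
    Disallow: /admin/
    Disallow: /login/
    ```

    Note that robots.txt is only a request: reputable crawlers such as Googlebot honor it, but it is not a security mechanism and does not actually protect confidential content.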

  • Hi Gowthamare

    But for blogging, crawling is very important, right?

    Then why should we have a robots.txt file that denies Google permission to crawl?

    If so, in what places would we use robots.txt to deny crawling?

    Live Your Life Your Way

  • Hi Blogger,
    Robots.txt is a plain text file used in website creation. Search engines continually analyze sites for crawling and indexing. Robots.txt is how the site owner asks a search engine not to visit or index particular pages or sections of the website. It is good practice to create a robots.txt for a website, because search engines regularly re-crawl sites and index whatever they find; with a robots.txt in place, certain pages or contents can be kept out of those updates.

  • The robots.txt file is used by webmasters to prevent search engine spiders from indexing particular pages of your blog which you may not want displayed in the search results, like your login page, for privacy.
    robots.txt is the first place on a website or blog that crawlers visit. This way they learn which pages to exclude from the index; but if no robots.txt file is provided by the webmaster, the crawler indexes all the pages, including pages which may contain confidential information.
    A sitemap is different. It is an XML document which contains the links to the various sections of the blog or website. A sitemap is helpful for increased visibility in the search results.
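
    For illustration, here is a minimal sketch of a sitemap, assuming a hypothetical blog at example.com (the URLs are made up for the example):

    ```
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://example.com/</loc>
      </url>
      <url>
        <loc>http://example.com/first-post.html</loc>
      </url>
    </urlset>
    ```

    The two files can work together: robots.txt can point crawlers to the sitemap with a single line, such as:

    ```
    Sitemap: http://example.com/sitemap.xml
    ```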

