  • Do you make use of Robots.txt and has it helped in SEO?


    Want to make use of robots.txt to improve your SEO? Our technical experts are here to show you the best ways to do it.

    The robots.txt file is a special text file that tells search engine bots what to index and what to leave out. It should open up only the public sections of your website and keep the private sections out of search results.
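
    For instance, a minimal robots.txt (the /private/ folder is only a placeholder for whatever private section a site actually has) could look like:

        User-agent: *
        Disallow: /private/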

    Do you make use of Robots.txt and has it helped in SEO? Please share your experience.
  • Answers

    1 Answer found.
  • Let's get the concept of robots.txt clear first, before we explain its usage for bloggers or anyone else who manages a website.

    Robots.txt is basically a text file meant for web crawlers. It tells a crawler which parts of the website it is allowed to access, and spells out the permissions for fetching, reading, and indexing files.
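
    Crawlers look for this file at a fixed location, the root of the host, e.g.:

        https://example.com/robots.txt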

    It has three basic fields:

    1. User-agent
    2. Disallow
    3. Allow

    The User-agent field specifies which bot the rules that follow apply to. The Disallow and Allow fields take a path argument and can use the forward slash / and the wildcard *. Here you can specify which folders of the website crawlers may access and which are blocked. Usually user dashboards are disallowed, while folders with public content are left open for indexing.
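
    A sketch of how the three fields combine (the /dashboard/ paths and the sessionid parameter are hypothetical; also note that Allow and the * wildcard inside paths are extensions honored by major crawlers such as Googlebot rather than part of the original standard):

        # Rules for Google's crawler only
        User-agent: Googlebot
        Disallow: /dashboard/
        Allow: /dashboard/public/

        # Rules for every other crawler
        User-agent: *
        Disallow: /dashboard/
        Disallow: /*?sessionid=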

    This file is very important, because if you write a Disallow rule for content or a directory that is supposed to be indexed, it can hurt your ranking. Most modern CMSs create this file automatically; if yours does not, you can create it on your own and, going by the suggestions of an SEO expert, decide which areas of your CMS should be open to crawlers and which should be blocked.
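
    A common pitfall worth illustrating: the difference between blocking one folder and accidentally blocking the whole site is a single character.

        Disallow: /private/   # blocks only the /private/ section
        Disallow: /           # blocks the entire site from crawling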

