Let's cover the concept behind robots.txt first, before we explain its usage for bloggers or anyone else who manages a website.
Robots.txt is a plain text file meant for web crawlers. It sits at the root of the website and tells crawlers which parts of the site they are permitted to access, i.e. which files and directories a bot may crawl, read, and index.
It has three basic fields:

1. User-agent — specifies which bot the rules that follow apply to; the value * matches all bots.

2. Disallow — paths the bot must not crawl. It takes a path argument such as / or /dashboard/, and most major crawlers also honor the * wildcard within paths. Here you can specify which folders of the website are blocked; user dashboards are usually disallowed.

3. Allow — paths the bot may crawl. Folders with public content are typically allowed so they can be indexed.
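Putting the three fields together, a minimal robots.txt might look like the sketch below. The /dashboard/ path is just an illustrative example of a private area; substitute the actual directories of your own site.

```
User-agent: *
Disallow: /dashboard/
Allow: /
```

This tells every crawler (User-agent: *) to stay out of /dashboard/ while leaving the rest of the site open for indexing.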
This file is very important because if you write a Disallow rule for content or a directory that is supposed to be indexed, it could hurt your ranking. Most modern CMSs create this file for you; if it isn't created, you can write it yourself and, following the advice of an SEO expert, decide which areas of your CMS should be open to crawlers and which should be blocked.
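Before publishing a robots.txt, it is worth verifying that your rules do what you intend. Python's standard-library `urllib.robotparser` can parse the file and answer "may this bot fetch this path?" questions. The rules and paths below are hypothetical examples, not part of any real site:

```python
from urllib import robotparser

# Hypothetical robots.txt rules: block the private dashboard, allow the rest.
rules = """
User-agent: *
Disallow: /dashboard/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Check how a generic crawler ("*") would treat two example paths.
print(rp.can_fetch("*", "/dashboard/settings"))  # dashboard is blocked
print(rp.can_fetch("*", "/blog/my-first-post"))  # public content is crawlable
```

A quick check like this catches the exact mistake described above: if `can_fetch` returns False for a page you want indexed, your Disallow rule is too broad.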