Robots.txt is a plain text file that tells search engines not to crawl private pages, such as a personal images folder, a site administration folder, a web developer's customer test folder, etc.
@milina788 — Oct 12, 2012 — # Robots.txt is a way to instruct search engine crawlers what to crawl and what not to, because there are many files that are useless to search engines and that we don't want indexed. If you want to prevent spiders from indexing a specific file or folder on the website, you can easily disallow them. The syntax of robots.txt is as follows:
User-agent: *
Disallow: /
Here ‘User-agent: *’ means the rule applies to all search engine bots, and ‘Disallow: /’ tells those bots not to visit any page on the website.
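You can check how crawlers will interpret a robots.txt file with Python's standard-library parser. This is a minimal sketch using a hypothetical robots.txt that blocks `/admin/` and `/test/` folders for all bots (the folder names here are made-up examples, not from the original post):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking two private folders for every bot
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /test/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Paths under a disallowed folder are blocked; everything else is crawlable
print(parser.can_fetch("*", "/admin/login.html"))  # False
print(parser.can_fetch("*", "/blog/post.html"))    # True
```

Using `Disallow: /` instead would make `can_fetch` return `False` for every path, which matches the "block the whole site" rule described above.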