Old 03-09-2012, 04:04 AM   #13
harryjohn99
Registered User
 
Join Date: Nov 2011
Posts: 51
robots.txt is a way to ask search engines not to crawl certain pages of a website. This is useful when some pages contain content the site owner does not want indexed, such as admin areas or duplicate pages. In those cases you list the paths you don't want crawled in a file named robots.txt and place that file in the root folder of the website. Keep in mind it is only a request to well-behaved crawlers, not a security measure: truly private data should be protected with authentication, since anyone can still fetch the pages directly.
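As a rough sketch, a minimal robots.txt could look something like this (the /private/ and /admin/ paths are just made-up examples for illustration):

```
# Applies to all crawlers
User-agent: *
# Ask crawlers to skip these directories
Disallow: /private/
Disallow: /admin/

# Allow everything else (implicit, but can be stated)
Allow: /
```

The file must be reachable at the site root, e.g. http://www.example.com/robots.txt, or crawlers won't find it.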

Last edited by harryjohn99; 01-08-2014 at 12:44 AM..