Old 03-25-2016, 06:06 AM   #8
Devin Mataka
Registered User
 
Join Date: Mar 2016
Location: New York
Posts: 2
The robots.txt file is a plain text file (no HTML) placed in your website’s root directory to tell search engine crawlers which pages they may crawl and which to skip. Many webmasters use this file to guide how the search engines crawl and index the content of their websites.
If webmasters tell the search engine spiders to skip pages they do not consider important enough to be crawled (e.g. printable versions of pages, .pdf files, etc.), they have a better chance of having their most valuable pages featured in the search engine results pages. The robots.txt file is a simple way of easing the spiders' job of returning the most relevant search results.
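As a rough sketch, a robots.txt along those lines might look like the following. The directory names are only illustrative assumptions (not paths from this post), and the `Disallow` rules work by URL-path prefix:

```text
# Rules for all crawlers
User-agent: *

# Illustrative paths: skip printable page versions and a PDF directory
Disallow: /print/
Disallow: /downloads/

# Everything else remains crawlable
Allow: /
```

A crawler that honors the file will skip any URL beginning with the disallowed prefixes and crawl the rest of the site normally.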
Another benefit of having a robots.txt file is that you can specify the location of your XML sitemap (the sitemap format used by Google, Yahoo and other engines). This also makes the site easier for the search engines to crawl.
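The sitemap location is declared with a `Sitemap` line in the same file; the URL below is a hypothetical placeholder, and the directive takes a full absolute URL:

```text
# Point crawlers at the XML sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```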