Old 03-24-2017, 07:29 AM   #8
robinmym
Registered User
 
Join Date: Feb 2017
Location: Dhaka
Posts: 486
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard websites use to communicate with web crawlers and other web robots. It specifies how to tell a robot which areas of the site should not be crawled.
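For example, a minimal robots.txt (placed at the site root, so a crawler can fetch it at a URL like https://example.com/robots.txt; the domain here is just illustrative) might look like this:

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
Allow: /
```

The `User-agent: *` line means the rules apply to all robots; each `Disallow` line names a path prefix that compliant crawlers should skip.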

In practice, robots.txt is a plain text file that webmasters create at the root of their site to instruct robots (typically search engine crawlers) which pages and directories they may crawl. Note that it controls crawling, not indexing: a disallowed URL can still show up in search results if other pages link to it, and compliance is voluntary, so it is not an access-control mechanism.
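A well-behaved crawler checks these rules before fetching a page. As a minimal sketch of how that works, Python's standard library ships a parser for this format, `urllib.robotparser` (the rules and bot name below are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; normally fetched from the site root.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler asks before each request whether the URL is allowed.
print(parser.can_fetch("MyBot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("MyBot", "https://example.com/public/page.html"))   # True
```

In a real crawler you would call `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()` to load the live file instead of parsing a string.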