Old 07-14-2016, 10:09 AM   #7
gracielahuff
Registered User
 
Join Date: Jun 2016
Posts: 79
A spider (also called a crawler) is a program a search engine runs to build its content index, a summary of a website's content. For each webpage it visits, the spider records a text-based summary of the content together with the page's address (URL), and follows the page's links to discover further pages. A minimal sketch of the idea is below.
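Here is a rough sketch of what a spider does for a single page, using only the Python standard library; the start URL and the 500-character summary cutoff are just placeholders for illustration, not how any particular search engine actually does it.

[code]
# Minimal sketch of one spider step: fetch a page, keep a text
# summary plus its URL, and collect the links to visit next.
from html.parser import HTMLParser
from urllib.request import urlopen

class PageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text_parts = []   # visible text fragments
        self.links = []        # href values found on the page

    def handle_data(self, data):
        if data.strip():
            self.text_parts.append(data.strip())

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def index_page(url):
    """Return a content-index entry: the URL, a text summary, and outgoing links."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = PageParser()
    parser.feed(html)
    summary = " ".join(parser.text_parts)[:500]  # keep the first 500 characters of text
    return {"url": url, "summary": summary, "links": parser.links}

# Example (placeholder URL):
# entry = index_page("https://example.com/")
# print(entry["summary"])
[/code]

A real spider would repeat this over a queue of discovered links, skipping URLs it has already seen.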
robots.txt is a text file webmasters create and place at the root of their site to tell robots (typically search engine crawlers) which parts of the website they may crawl and which they should stay out of. See the sketch below for how a polite crawler checks it.
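A typical robots.txt uses rules like "User-agent: *" followed by "Disallow: /private/". The standard library ships a parser for it; the sketch below checks a page against robots.txt before fetching it. The URL, path, and user-agent string are placeholders.

[code]
# Checking robots.txt before crawling, using Python's standard library.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # placeholder site
rp.read()  # fetch and parse the robots.txt file

# A well-behaved spider asks before fetching each page.
if rp.can_fetch("MyCrawler", "https://example.com/private/page.html"):
    print("allowed to crawl")
else:
    print("disallowed by robots.txt")
[/code]

Note that robots.txt is only a request to crawlers; it does not enforce anything on its own, so it should not be used to hide sensitive pages.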