A web crawler (also called a web spider or web robot) is a program or automated script that browses the World Wide Web in a methodical, automated manner.
This process is called web crawling or spidering.
Many legitimate sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches.
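To make the idea concrete, here is a minimal sketch of that crawl-and-copy loop using only the Python standard library. It does a breadth-first walk from a seed URL, saves the raw HTML of each visited page (the "copy" a search engine would later index), and follows any links it finds. The seed URL, the max_pages cap, and the function names are illustrative assumptions, not any particular crawler's API; a real crawler would also respect robots.txt, rate-limit itself, and handle many more edge cases.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=20):
    """Breadth-first crawl starting from seed_url (illustrative sketch).

    Returns a dict mapping each visited URL to its raw HTML --
    the copy of visited pages that a search engine would index.
    """
    visited = {}
    queue = deque([seed_url])
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue  # methodical: never fetch the same page twice
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable or broken pages
        visited[url] = html
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).scheme in ("http", "https"):
                queue.append(absolute)
    return visited


if __name__ == "__main__":
    # Hypothetical seed URL for demonstration only.
    pages = crawl("https://example.com")
    print(f"Fetched {len(pages)} pages")
```

The queue-plus-visited-set structure is what makes the browsing "methodical": every discovered link is scheduled exactly once, so the crawler systematically expands outward from the seed instead of looping or revisiting pages.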