Old 01-23-2018, 06:09 AM   #7
sonyrobin
Registered User
 
Join Date: Jan 2018
Location: Dubai
Posts: 14
Crawling is the process by which search engines discover new and updated content on the web, such as new sites or pages, changes to existing sites, and dead links.
To do this, a search engine uses a program referred to as a 'crawler', 'bot' or 'spider' (each search engine has its own), which follows links from page to page and uses an algorithmic process to decide which sites to crawl, how often, and how many pages to fetch from each.
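To make the idea concrete, here is a minimal sketch of how a crawler works in principle: start from a seed URL, fetch the page, extract its links, and queue any links it hasn't seen yet. This is only an illustration, not any search engine's actual code; it assumes the Python 'requests' and 'beautifulsoup4' packages, and the function name crawl and the example.com URL are made up for the example.

# Minimal illustrative crawler sketch (not a real search engine's crawler).
# Assumes the 'requests' and 'beautifulsoup4' packages are installed.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(seed_url, max_pages=50):
    """Breadth-first crawl starting from seed_url, staying on the same host."""
    seen = {seed_url}
    queue = deque([seed_url])
    host = urlparse(seed_url).netloc

    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # dead link: a real crawler would record this

        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            # Follow only same-host links we have not queued yet
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

if __name__ == "__main__":
    print(crawl("https://example.com"))

A real crawler adds far more on top of this: respecting robots.txt, rate limiting, scheduling how often to revisit pages, and prioritising which sites to crawl first.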