12-15-2016, 03:39 AM   #3
Markco92
Registered User
 
Join Date: Dec 2016
Location: New York
Posts: 113
Googlebot is Google's web crawling bot (sometimes also called a "spider"). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

Google uses a huge set of computers to fetch (or "crawl") billions of pages on the web. Googlebot uses an algorithmic process: computer programs determine which sites to crawl, how often, and how many pages to fetch from each site.
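
To make that scheduling idea concrete, here's a rough Python sketch of a per-host crawl scheduler. Everything in it is made up for illustration (the 10-second politeness delay, the 100-page-per-host budget, the class and method names); Google has not published how its real scheduler decides these things.

[CODE]
import heapq
import time
from urllib.parse import urlsplit

# Hypothetical per-host crawl scheduler: each host gets a fixed
# politeness delay between fetches and a total page budget. The
# numbers and names here are illustrative only.
class CrawlScheduler:
    def __init__(self, delay=10.0, pages_per_host=100):
        self.delay = delay
        self.pages_per_host = pages_per_host
        self.pages_left = {}    # host -> remaining fetch budget
        self.next_slot = {}     # host -> next allowed fetch time
        self.heap = []          # (earliest_fetch_time, url)

    def add(self, url):
        host = urlsplit(url).netloc
        self.pages_left.setdefault(host, self.pages_per_host)
        # Reserve the host's next time slot so its URLs stay spaced out.
        slot = self.next_slot.get(host, time.monotonic())
        self.next_slot[host] = slot + self.delay
        heapq.heappush(self.heap, (slot, url))

    def pop(self):
        """Return the next URL that may be fetched, or None when done."""
        while self.heap:
            slot, url = heapq.heappop(self.heap)
            host = urlsplit(url).netloc
            if self.pages_left.get(host, 0) <= 0:
                continue        # this host's budget is exhausted
            wait = slot - time.monotonic()
            if wait > 0:
                time.sleep(wait)
            self.pages_left[host] -= 1
            return url
        return None
[/CODE]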

Googlebot's crawl process begins with a list of webpage URLs, generated from previous crawl processes and augmented with Sitemap data provided by webmasters. As Googlebot visits each of these websites, it detects links (SRC and HREF attributes) on each page and adds them to its list of pages to crawl. New sites, changes to existing sites, and dead links are noted and used to update the Google index.
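
Here's a toy Python version of that discovery loop, using only the standard library: fetch a page, pull out every HREF and SRC attribute, and queue unseen URLs. The names and limits are my own, and a real crawler would also honor robots.txt, deduplicate far more aggressively, and record dead links for the index; this only shows the discovery step.

[CODE]
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

# Collects every HREF and SRC attribute from a page, resolved
# against the page's own URL.
class LinkCollector(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(urljoin(self.base_url, value))

def crawl(seed_urls, max_pages=50):
    frontier = list(seed_urls)   # URLs waiting to be fetched
    seen = set(seed_urls)        # URLs already discovered
    while frontier and len(seen) <= max_pages:
        url = frontier.pop(0)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue             # unreachable: a candidate dead link
        parser = LinkCollector(url)
        parser.feed(html)
        for link in parser.links:
            if link.startswith("http") and link not in seen:
                seen.add(link)
                frontier.append(link)
    return seen
[/CODE]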