Old 05-10-2017, 11:23 PM   #1
Shane Bentick
Registered User
 
Join Date: Apr 2017
Posts: 355
Crawlers are programs used by search engines to explore the Internet and automatically download web content available on web sites. They capture the text of the pages and the links they find, and thus enable search engine users to discover new pages.

Robots perform three basic actions:
1. First they crawl the pages of a site and build a list of the words and phrases found on every page.
2. With this list they build an index (a database), so the engine knows exactly which pages contain each word or phrase.
3. When an end user types a word or phrase, the engine looks up the index and returns the matching pages. This step is called the query processor.
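The three steps above can be sketched in a few lines of Python. This is only an illustrative toy, not a real crawler: the "web" here is a hypothetical in-memory dictionary of pages (the URLs and page texts are made up), and the index is a simple inverted index mapping words to pages.

```python
import re
from collections import defaultdict

# Hypothetical in-memory "web" for illustration: URL -> (page text, outgoing links)
PAGES = {
    "http://example.com/": ("Crawlers explore the web and download pages",
                            ["http://example.com/about"]),
    "http://example.com/about": ("Search engines index pages so users can find them",
                                 []),
}

def crawl(start_url):
    """Step 1: follow links from start_url and collect the text of every page."""
    seen, queue, texts = set(), [start_url], {}
    while queue:
        url = queue.pop(0)
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        text, links = PAGES[url]
        texts[url] = text
        queue.extend(links)          # enqueue the links found on this page
    return texts

def build_index(texts):
    """Step 2: build an inverted index mapping each word to the pages containing it."""
    index = defaultdict(set)
    for url, text in texts.items():
        for word in re.findall(r"\w+", text.lower()):
            index[word].add(url)
    return index

def query(index, phrase):
    """Step 3: the query processor returns pages containing every word of the phrase."""
    words = phrase.lower().split()
    results = [index.get(w, set()) for w in words]
    return set.intersection(*results) if results else set()

index = build_index(crawl("http://example.com/"))
print(sorted(query(index, "pages")))
```

A real crawler would fetch pages over HTTP, parse HTML to extract links, respect robots.txt, and store the index on disk, but the flow — crawl, index, answer queries — is the same.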