robots.txt
Hey Friends,
Can anyone please tell me what the main purpose of robots.txt is and why we use robots.txt in SEO? I have been struggling with this question until now, so please share asap. :confused: |
The robots exclusion standard, also called the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform a robot about which areas of the website should not be processed or scanned.
|
Thanks for sharing
|
Hi
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. The line "User-agent: *" means the section applies to all robots, and "Disallow: /" tells the robot that it should not visit any pages on the site. |
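To make that concrete, here is a minimal sketch of what a robots.txt file using those two directives could look like (the /private/ path is just a placeholder, not from any real site):

```
# Applies to all crawlers
User-agent: *
# Block this directory from crawling
Disallow: /private/
# "Disallow: /" instead would block the entire site;
# an empty "Disallow:" would allow everything
```

The file lives at the root of the domain (e.g. example.com/robots.txt), and crawlers that honor the protocol fetch it before crawling. |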
It is a text file that gives robots instructions about how to crawl the pages of a website. |
Robots.txt is a text file containing instructions for search engine robots. It lists which pages on the site crawlers are allowed or disallowed to visit.
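If you want to check programmatically which URLs a given robots.txt allows or disallows, Python's standard library ships a parser for exactly this. A small sketch (the rules and URLs below are made-up examples, not from a real site):

```python
from urllib.robotparser import RobotFileParser

# Parse robots.txt rules directly instead of fetching over the network
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# can_fetch(user_agent, url) applies the parsed rules
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

Well-behaved crawlers do something equivalent to this check before requesting each page.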
|
Powered by vBulletin Copyright © 2020 vBulletin Solutions, Inc.