Site Owners Forums - Webmaster Forums


sahithya 05-06-2014 11:02 PM

What is the purpose of robots.txt file?

Sarahbriggs 05-06-2014 11:24 PM

Robots.txt is a text file that you put on your website to tell crawlers which pages you would prefer them not to crawl.

You need to create the file and list each URL path that you don't want crawled. Have a look below.

User-agent: *
Disallow: /admin/
Disallow: /t141187-
Disallow: /file.html
Disallow: /purpose

vickyjohn 05-06-2014 11:32 PM

Robots.txt is a file in which you list the pages that you do not want crawled by spiders or crawlers,
e.g.
User-agent: *
Disallow: /officeadmin/
Type it in a Notepad file and add it to the website's root folder.
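
For context, crawlers only look for this file at the root of the host, so with a placeholder domain such as example.com it has to be reachable at:

http://www.example.com/robots.txt

A copy placed in a subdirectory (for example /pages/robots.txt) is simply ignored.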

employmentpakis 05-06-2014 11:48 PM

The robots.txt file is a text file that gives search engine robots instructions about which pages you would like them not to visit.

spyindia01 05-07-2014 12:43 AM

A simple text file that tells Google (and other search engines that recognize the file and its directives) not to crawl the site, selected pages within it, or selected file types.
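
For illustration, a sketch of a single rule group covering those cases (the paths here are hypothetical, and the *.pdf$ pattern relies on the wildcard matching that Google and other major engines support rather than on the original standard):

User-agent: *
# Disallow: /             (this single rule, uncommented, would block the whole site)
Disallow: /private/       # a selected section of the site
Disallow: /old-page.html  # a selected page
Disallow: /*.pdf$         # a selected file type (needs wildcard support)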

ChristinaCa 05-07-2014 02:38 AM

The robots.txt file is a file that lists the web pages one doesn't want crawled by crawlers.

LydiaAaron 11-27-2017 04:57 AM

The purpose of robots.txt is to keep particular web pages on our websites from being crawled (and thus typically from being indexed).

Fozia khadim 11-27-2017 05:07 AM

Website owners use the robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. A "Disallow: /" rule tells a robot that it should not visit any pages on the site.

infiniumtech 11-27-2017 08:00 AM

Robots.txt is a text file webmasters create to instruct web robots how to crawl pages on their website.

AmandaCherry 11-27-2017 08:16 PM

Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol. The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
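
Put together, a minimal file built from exactly those two directives blocks every page for every robot:

User-agent: *
Disallow: /

and the opposite, an empty Disallow value, lets all robots crawl everything:

User-agent: *
Disallow: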

ROYCPO 02-13-2018 08:30 AM

Robots.txt is the common name of a text file that is uploaded to a website's root directory; crawlers fetch it directly from that location rather than finding it linked in the site's HTML. The robots.txt file is used to provide instructions about the website to web robots and spiders, and web authors can use it to tell them which parts of the site to leave uncrawled.

harshithaasin 02-13-2018 10:25 PM

The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site they may or may not crawl.

deepakrajput 02-24-2018 08:51 AM

The robots.txt file is a set of instructions that guides how search engines crawl the website and, in turn, what they index.

Kentowin 02-25-2018 09:53 PM

The robots.txt file is the file that instructs search engines which parts of your website you don't want crawled by their spiders.

harshithaasin 02-25-2018 10:56 PM

Robots.txt is a text file webmasters create to instruct web robots (typically search engine robots) how to crawl pages on their website. The robots.txt file is part of the robots exclusion protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users.
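
As a sketch of how those rules combine in one file (the paths are illustrative, and the Sitemap line is a widely supported extension rather than part of the original REP):

# Rules for one named crawler
User-agent: Googlebot
Disallow: /search/

# Rules for every other crawler
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Where crawlers can find the sitemap
Sitemap: http://www.example.com/sitemap.xml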

