How to prevent particular pages from being crawled?
Hi,
Can anyone please share the code that prevents specific pages from being crawled? Thanks, Mollie
Add this code to the webpage header, before the closing head tag.
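A minimal sketch of such a snippet, assuming the standard robots meta tag is what is meant here; it goes inside the page's <head> section:

```html
<!-- Tell crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Strictly speaking, this tag stops the page from being indexed (and its links from being followed) rather than stopping the crawler from fetching it at all.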
There are many ways to prevent pages from getting crawled by bots. For example, there is the meta tag mentioned above, we can use a robots.txt file, or we can send server-level directives (such as the X-Robots-Tag response header) that tell bots not to crawl or index a page.
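For the robots.txt route, a short sketch (the paths below are placeholders for whatever pages you want to block): place a file named robots.txt at the root of the site, e.g. https://example.com/robots.txt, containing:

```
# Block all crawlers from these specific pages/directories
User-agent: *
Disallow: /private-page.html
Disallow: /private-directory/
```

Well-behaved crawlers check this file before fetching a URL, so disallowed pages are not crawled at all. The X-Robots-Tag header mentioned above has the same effect as the meta tag but is sent in the HTTP response instead of the page markup.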
Use the robots.txt file for that.

Use the robots.txt file to inform the crawler.

By using a robots.txt file one can prevent pages from being crawled.

Use robots.txt to prevent your pages from being crawled.