Which of the following robots.txt directives prevents search spiders from crawling the URL?
A: Allow: /
B: Disallow:
C: Disallow: /page1
D: Both A & C
Hi,
The right answer is C (Disallow: /page1). Allow: / explicitly permits crawling of the whole site, and an empty Disallow: directive blocks nothing, so neither of those prevents spiders from crawling a URL. Thanks, sonvi.belani
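You can check the directives yourself with Python's standard-library `urllib.robotparser` (a quick sketch; the example.com URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt using option C from the question.
rules = """\
User-agent: *
Disallow: /page1
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Only Disallow: /page1 blocks a URL; Allow: / and an empty
# Disallow: would permit everything.
print(parser.can_fetch("*", "https://example.com/page1"))  # False
print(parser.can_fetch("*", "https://example.com/other"))  # True
```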