Slurp is the Yahoo search bot that crawls and indexes your web pages.
It obeys the Robots Exclusion Standard. If you do not want Slurp to crawl parts of your site, create a robots.txt file in the root directory (the home folder of the site) and add a rule for "User-agent: Slurp".
Example rule in a robots.txt file:

User-agent: Slurp
Disallow: /cgi-bin/
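To slow Slurp's crawl rate rather than block it entirely, you can use the Crawl-delay directive, a non-standard extension that Yahoo's crawler honored; the value is the number of seconds to wait between successive requests (10 here is just an illustrative choice):

```
User-agent: Slurp
Crawl-delay: 10
```

Note that Crawl-delay is not part of the original Robots Exclusion Standard, so other crawlers may ignore it.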