Search Engine Spiders

Can I stop a search engine spider from crawling all or part of my site?


Yes. You can use a text editor to create a robots.txt file in the root folder of your site that tells search engine spiders which parts of the site they may and may not crawl. Well-behaved spiders request this file before crawling anything else, though the file is advisory and cannot physically block access.

Example:

User-agent: *
Disallow: /

Would block all spiders from crawling any part of the site. The asterisk is a wildcard meaning "all user agents".

User-agent: googlebot
Disallow: /mypics.htm
Disallow: /images/

Would tell Google's spider (Googlebot) specifically not to crawl the page /mypics.htm or anything in the /images folder, while leaving other spiders unrestricted. Note that Disallow paths should begin with a forward slash.
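You can check how a spider would interpret such rules yourself. As a sketch, Python's standard library includes urllib.robotparser, which applies robots.txt rules the same way a compliant crawler would (the file contents below mirror the example above):

```python
from urllib import robotparser

# robots.txt rules from the example above, as a list of lines
rules = """\
User-agent: googlebot
Disallow: /mypics.htm
Disallow: /images/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot is blocked from the listed page and folder...
print(rp.can_fetch("googlebot", "/mypics.htm"))      # False
print(rp.can_fetch("googlebot", "/images/pic.jpg"))  # False

# ...but may still crawl everything else
print(rp.can_fetch("googlebot", "/index.html"))      # True
```

In a real crawler you would call rp.set_url(...) and rp.read() to fetch the live robots.txt from a site instead of parsing a string.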

Search the Internet for more information on using robots.txt files.

