Vintage Lover Posted February 8, 2006
Hi. I have a robot named LinkWalker visiting me. Apparently it belongs to a company that checks for broken links and then offers its services. How do I stop it from spidering my site? Sandra
travisc Posted February 8, 2006
If it follows standards, the bot will check a file in your root called "robots.txt". Add the following lines to that file:

User-agent: LinkWalker
Disallow: /

If that fails, shout, and we'll work up an .htaccess-based trap routine for it.
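For reference, if the bot ignores robots.txt, a minimal .htaccess block is one option. This is a sketch only, assuming an Apache server with mod_rewrite enabled and that the bot really does send "LinkWalker" in its User-Agent header:

```apache
# Refuse (403 Forbidden) any request whose User-Agent contains "LinkWalker"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} LinkWalker [NC]
RewriteRule .* - [F,L]
```

The [NC] flag makes the match case-insensitive, so "linkwalker" is caught too. Unlike robots.txt, this doesn't rely on the bot's good behavior, only on it reporting an honest User-Agent string.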
Vintage Lover Posted February 11, 2006
Do I create this file, or is it something I need to look for? Because I can't find it. Thanks. Sandra
TCH-Rob Posted February 11, 2006
If you do not have a robots.txt file in your root directory, you will need to make one. If you are using Windows, just open Notepad and type:

User-agent: LinkWalker
Disallow: /

Save it as robots.txt and upload it to your public_html directory.
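Once the file is uploaded, you can sanity-check the rules locally with Python's standard urllib.robotparser module. A quick sketch; the two lines below mirror the robots.txt contents from this thread:

```python
from urllib.robotparser import RobotFileParser

# The same two lines saved in robots.txt
rules = [
    "User-agent: LinkWalker",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# LinkWalker is shut out of everything...
print(parser.can_fetch("LinkWalker", "/index.html"))  # False
# ...while other crawlers (no matching rule) are unaffected
print(parser.can_fetch("Googlebot", "/index.html"))   # True
```

This only tells you the rules say what you intend; whether LinkWalker actually honors them is up to the bot.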