Vintage Lover Posted February 8, 2006
Hi. I have a robot named LinkWalker visiting me. Apparently it belongs to a company that checks for broken links and then offers its services. How do I stop it from spidering my site? Sandra
travisc Posted February 8, 2006
If they are following standards, their bot will check a file in your root called "robots.txt". Add the following lines to that file:

User-agent: LinkWalker
Disallow: /

If that fails, shout, and we'll work up an .htaccess-based trap routine for it.
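For reference, an .htaccess-based block along those lines might look like this. This is only a sketch, assuming an Apache server with mod_rewrite enabled, and it assumes the bot identifies itself with "LinkWalker" in its User-Agent header (check your access logs to confirm the exact string):

```apache
# Return 403 Forbidden to any request whose User-Agent contains "LinkWalker"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} LinkWalker [NC]
RewriteRule .* - [F,L]
```

Unlike robots.txt, this does not rely on the bot cooperating; the server refuses the request outright.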
TCH-JimE Posted February 8, 2006
Or block the IP. Jimuni
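An IP block can also go in .htaccess. A minimal sketch, again assuming Apache; the address below is a documentation placeholder, so substitute the bot's actual IP from your access logs:

```apache
# Deny one crawler IP while allowing everyone else (example address only)
Order Allow,Deny
Allow from all
Deny from 192.0.2.10
```

Keep in mind bots sometimes rotate IPs, so the User-Agent approach may hold up better over time.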
Vintage Lover Posted February 11, 2006 (Author)
Do I create this file, or is it something I need to look for? Because I can't find it. Thanks. Sandra
TCH-Rob Posted February 11, 2006
If you do not have a robots.txt file in your root directory, you will need to make one. If you are using Windows, just open Notepad and type:

User-agent: LinkWalker
Disallow: /

Save it as robots.txt and upload it to your public_html directory.