
Robots.txt Question

Hi,

 

I've googled for the answer but couldn't find it.

 

Is it possible to have the following text in my robots.txt file?

 

User-agent: *

Disallow: *.shtml

Disallow: *.cgi

Disallow: *.html

Disallow: *.htm

 

I've had quite a few different versions of my site that used different file extensions, and now that I've settled on WordPress I want to stop bots wasting resources on requests for pages that no longer exist.

 

 

/EDIT/

It's always the same: just after you post a question, you find the answer!

Just in case other readers are interested: Googlebot supports wildcards in robots.txt rules, but they're a Google extension rather than part of the original robots exclusion standard, so other crawlers may ignore them.
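For anyone who wants to try it anyway, a Googlebot-only block might look like this (just a sketch; the * wildcard and the $ end-of-URL anchor are Google extensions, and crawlers that only implement the original standard won't understand them):

>User-agent: Googlebot
>Disallow: /*.shtml$
>Disallow: /*.cgi$
>Disallow: /*.html$
>Disallow: /*.htm$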

 

Looks like I'll need to find another way for the rest of the crawlers, maybe .htaccess.
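Something like this might do it (a rough sketch, assuming Apache with mod_rewrite enabled and placed before any WordPress rewrite rules; it answers requests for the retired extensions with "410 Gone" so spiders drop them faster):

># send 410 Gone for the old file extensions
>RewriteEngine On
>RewriteRule \.(shtml|cgi|html?)$ - [G]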

TIA

Phillip


Several ways exist to deny bots access to files and folders:

  • The best, yet quite tedious, way is to use
    ><meta name="robots" content="noindex,nofollow">

    in the <head> of every page you don't want spidered. I use this method myself, and it doesn't take long to implement since I use PHP templates to do most of the dirty work for me.

  • You can, of course, use the robots.txt file to control robot access to your site's contents. However, if you are at all security conscious I would recommend against this method: any divulgence of your site's directory structure, file names, naming conventions, or file extensions makes it that much easier for knowledgeable folks to gain access to it and use it improperly.
  • The only 100% (well, almost) guaranteed method is to password-protect a directory or file. This will almost surely hinder usability, but it works well for folders containing only support files (e.g. cgi-bin, JavaScript, includes, stylesheets, images). A minimal setup is sketched after this list.
  • You can use
    >Deny from <ip>

    in .htaccess to deny certain bots, but you would have to know the IP/domain of every bot to be denied. This isn't feasible for large-scale denial, but it is good for stopping those "rogue" bots that ignore robots.txt and robots meta tags. A Google search for bad or rogue robots will turn up the names and IPs of many of the worst offenders. (A fuller sketch follows this list.)

  • After the fact, if you find any of your pages indexed on a search engine, you can usually submit a request for removal of any and all pages from your domain.
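For the password-protection and Deny-from options above, here are two minimal .htaccess sketches (the specifics are placeholders, so adjust to taste). First, basic authentication for a support-file folder; the AuthUserFile path is a placeholder, and the .htpasswd file itself is created with Apache's htpasswd utility:

>AuthType Basic
>AuthName "Restricted"
>AuthUserFile /full/path/to/.htpasswd
>Require valid-user

Second, denying a specific bot by IP address; 203.0.113.42 is a placeholder, and you would repeat the Deny line for each bot:

># allow everyone except the listed addresses
>Order allow,deny
>Allow from all
>Deny from 203.0.113.42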


Thanks for the reply. I was really just trying to stop them from spidering pages that no longer exist. I'm happy for them to spider the current site, but I might follow up on the idea of having the no-longer-existing pages removed from their index.

 

Thanks again.

 

Phillip

