Curl, Wget, .htaccess, Linking To /


surfdean

Recommended Posts

Hey all, a few questions...

 

In AWStats I've noticed wget and curl visits to my site, and I'd like to prevent them from grabbing my pages. Via Googling I've seen an enormous .htaccess file written to block *most* bots, but is all of that necessary?
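
For example, would a few lines like this in .htaccess be enough to target just wget and curl? (An untested guess on my part; it assumes mod_rewrite is available, and I realise either tool can fake its user agent.)

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (wget|curl|libwww-perl) [NC]
RewriteRule .* - [F,L]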

 

I have directories containing only images, and people are typing the paths straight into their browser (e.g. /mydomain/images/) and getting a listing of everything in them. Can I stop this while still keeping the images accessible via links from my internal pages? I saw slightly similar posts in this forum, but nothing exactly like it. A modified index.htm, perhaps?

 

When I use relative links, can I link to / instead of /index.htm?

 

 

TIA,

dean


Hi Dean,

 

I use a bit of PHP code; there is a copy on this "stop spiders stealing" thread, about halfway down. It works simply by checking whether someone is grabbing the pages too fast - if so, they are blocked. If they are surfing at a normal rate, they get the files without any problem.
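
Roughly, the idea is something like this (just a sketch of the principle, not the actual script from that thread - the real one logs hits to a database rather than to files):

<?php
// Sketch only: count recent requests per IP and refuse the page if the
// visitor is pulling pages faster than a normal surfer would.
// Assumes a writable "hits/" directory next to this file.
$ip      = $_SERVER['REMOTE_ADDR'];
$logfile = dirname(__FILE__) . '/hits/' . md5($ip);
$window  = 10;   // look-back period in seconds
$maxHits = 20;   // page requests allowed inside that period

$recent = array();
if (is_readable($logfile)) {
    foreach (file($logfile) as $line) {
        $t = (int) trim($line);
        if ($t > time() - $window) {
            $recent[] = $t;          // keep only hits inside the window
        }
    }
}
$recent[] = time();                  // record this request
file_put_contents($logfile, implode("\n", $recent) . "\n");

if (count($recent) > $maxHits) {
    header('HTTP/1.0 403 Forbidden');
    exit('You are requesting pages too quickly - please slow down.');
}
?>

Whatever form it takes, it has to run before the page sends any output, since the 403 header can't be sent once output has started.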

 

 

You could also block the specific wget and curl user agents by checking for them in the same script and refusing to serve the page in that case.
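
For example, a check like this near the top of the same script (again just a sketch - wget and curl can be told to send a different User-Agent, so treat it as a first line of defence only):

<?php
// Sketch only: refuse the request if the User-Agent looks like wget or curl.
if (isset($_SERVER['HTTP_USER_AGENT'])
        && preg_match('/wget|curl|libwww/i', $_SERVER['HTTP_USER_AGENT'])) {
    header('HTTP/1.0 403 Forbidden');
    exit('Automated downloads are not allowed.');
}
?>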

 

Using relative links will not help, if I understand correctly what you are suggesting.


Hi Dean,

 

I use a bit of PHP code; there is a copy on this "stop spiders stealing" thread.

<snip>

 

Using relative links will not help, if I understand correctly what you are suggesting.

 

Hi Andy,

 

Nice codework. I'm not well versed in PHP, but what I understand is that I need to save the code as security.php, create a database with your variables, and add one line to each page (within the body?).

 

I'm guessing you haven't updated the script much?

 

To clarify about the relative links:

can I use:

<a href="/">Home</a>

instead of

<a href="index.htm">Home</a>

 

I was trying to avoid using my 1K/sec connection to test it on a dummy page...

 

Anyway, I tested it, and it works... but I don't think I'll use it because of the local testing SNAFU.

 

Thanks,

dean


Hi Dean,

what I understand is that I need to save the code as security.php, create a database with your variables, and add one line to each page (within the body?)

Correct, but I add the one line at the top of the file (before the doctype and head).
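
So, if the pages are served as PHP, the top of each file looks something like this (using the security.php name from above; the exact include line depends on the script):

<?php require 'security.php'; // run the check before any output is sent ?>
<!DOCTYPE html>
<html>
<head><title>Home</title></head>
<body>
  ...page content as normal...
</body>
</html>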

 

To clarify about the relative links:

can I use:

<a href="/">Home</a>

instead of

<a href="index.htm">Home</a>

Yes, this would work, but I don't see any real benefit. I was assuming you were doing it to try to reduce or stop the linking - and it wouldn't help with that.

 

You can turn directory indexing off, or add an index.htm file to your images directories, so that people can't see the list of files.
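
For example, a single line in an .htaccess file in (or above) the images directory turns the listings off, assuming your host lets you override the Indexes option:

Options -Indexes

An empty index.htm dropped into each image directory does much the same job without touching .htaccess, and direct links to individual images from your own pages keep working either way.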

