
IsaacSchlueter
Everything posted by IsaacSchlueter
-
I've also got a problem that is hitting a roadblock with Apache 1.3's greedy regular expression engine, and it would be much more easily solved if we were using Apache 2. Any plans to upgrade any time soon?
-
Thanks, Rick!
-
So, anyone know anything about using PGP in Horde?
-
Thanks, bruce. Ace, I'm not sure. Click on Help, and then on About. Mine says this:
-
Congrats on getting the new version of Horde installed. It's very pretty, and much easier to use. (I had a bit of a hassle with some permission things in the changeover, but the nice people in support fixed it all for me.)

What I'm wondering now is how I can use the PGP options in Horde. It keeps telling me that I need a secure connection to do it. I assume that it means SSL. What would it take to use TCH's shared SSL for webmail?

Speaking of email, boards [at] totalchoicehosting [d0t] com seems to not be a valid email address. How do we get a password for the family forums? I'm a TCH customer on server77.

Thanks -- Isaac Z. Schlueter
http://isaacschlueter.com
-
.htaccess RewriteMap
IsaacSchlueter replied to IsaacSchlueter's topic in CPanel and Site Maintenance
Ah, well that answers that. I don't suppose there's any way that I can access my httpd.conf file? I would expect that's doubtful, since I believe I'm on a shared server and that would affect everyone, wouldn't it? What about if I had dedicated server hosting? Would I be able to access the httpd.conf file then?
-
Protecting Yourself From Spam
IsaacSchlueter replied to Deverill's topic in CPanel and Site Maintenance
Munging the email address with HTML entities or JavaScript won't work. Why not? Because it is extremely easy for a semi-competent programmer to create a spider that uses a VB Internet Explorer control. Here's a rough outline of how the program works, with a webbrowser control (WB), a list of URLs (LIST), and a list of emails (EMAILS):*

>0. Put some good starting point in LIST, such as http://www.google.com/search?q=the
1. For each ITEM in LIST:
   a. WB.Navigate2 ITEM, and wait for WB to finish loading.
   b. For each LINK in WB.Document.Links:
      i. If LINK.href, then LIST.Add LINK.href
   c. Regex WB.Document.body.innerHTML for this pattern: /[a-zA-Z0-9\-_.]+@[a-zA-Z0-9\-_.]+\.[a-zA-Z0-9\-_.]+/
   d. For each MATCH in Regex.Matches:
      i. EMAILS.Add MATCH

Since WB.Document.body.innerHTML is the HTML of the document after processing, it will automatically resolve anything that Internet Explorer can resolve - that includes JavaScript, HTML entities, and all the rest. Munging is a waste of time. If a web browser can see your email address, so can a spider.

* Of course, this is really simplified. You'd also have to handle framesets if necessary, and you'd probably run into other bugs that I haven't thought of.

Personally, I have a policy of using a different email address for each site. When I sign up with www.somesite.com, I use somesite @ MyTCHDomain.com as the email address. That way, if I start getting spam, I usually know who leaked it, and just set that address to :fail:.
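To make the HTML-entities point concrete, here's a quick PHP sketch (just an illustration - the munged address below is made up) showing that a single decoding call undoes entity munging before the harvesting regex even runs:

><?php
// "isaac@example.com" with every character replaced by an HTML entity,
// the way an entity-munging obfuscator would write it into the page.
$munged = '&#105;&#115;&#97;&#97;&#99;&#64;&#101;&#120;&#97;&#109;'
        . '&#112;&#108;&#101;&#46;&#99;&#111;&#109;';

// One call undoes the munging - roughly what the browser (and a spider
// reading the rendered innerHTML) does automatically.
$decoded = html_entity_decode($munged);

// The same sort of pattern the spider uses to harvest addresses.
preg_match_all('/[a-zA-Z0-9\-_.]+@[a-zA-Z0-9\-_.]+\.[a-zA-Z0-9\-_.]+/',
               $decoded, $matches);

print_r($matches[0]); // Array ( [0] => isaac@example.com )
?>
-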
I would like to use a RewriteMap to bounce known referrer spammers off of my site before they have a chance to waste my bandwidth. I found a tutorial at http://httpd.apache.org/docs/misc/rewriteguide.html (Referrer-Based Deflector) that I'd like to use, and I am going to write a php script to generate the map out of a database as needed (whenever a new item is added to the db, that is).

However, when I set up a test at http://www.isaacschlueter.com/tests/rewrite_map_tests/ I'm just getting an HTTP 500 error. My errorlog shows:

>[Fri Jun 17 14:06:46 2005] [alert] [client 64.0.158.34] /home/My_User_Name/public_html/tests/rewrite_map_tests/.htaccess: RewriteMap not allowed here

First question: Are we allowed to use RewriteMap directives at all, or am I wasting my time?

Second question: Is there some part of my site where it'd be allowed, but perhaps it's just not allowed in /tests/rewrite_map_tests?

For reference, here's what it says in my .htaccess at the moment:

>RewriteEngine On
RewriteMap spamdeflector txt:/home/My_User_Name/public_html/tests/rewrite_map_tests/spammerlist.map
RewriteCond %{HTTP_REFERER} !=""
RewriteCond ${spamdeflector:%{HTTP_REFERER}} ^-$
RewriteRule ^.* %{HTTP_REFERER} [R,L]
RewriteCond %{HTTP_REFERER} !=""
RewriteCond ${spamdeflector:%{HTTP_REFERER}|NOT-FOUND} !=NOT-FOUND
RewriteRule ^.* ${spamdeflector:%{HTTP_REFERER}} [R,L]

I've tried accessing it with the relative path (just spammerlist.map or /spammerlist.map instead of the full path), but that didn't work any better. Here's what's in spammerlist.map:

>http://isaac/ -
http://localhost/ -

(Keep in mind, this is just a test. This should bounce all requests from my local machine, and I have an index.html which links to it, running on my local box.) Any ideas?
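For reference, here's the rough shape of the php script I have in mind for generating the map (the database, table, and column names below are just placeholders, since the real schema doesn't exist yet):

><?php
// Sketch: regenerate spammerlist.map from a database table whenever a
// new referrer spammer is added. Connection details, table name, and
// column name are placeholders.
$db = mysql_connect('localhost', 'db_user', 'db_pass');
mysql_select_db('my_database', $db);

$result = mysql_query('SELECT referer FROM referer_spammers', $db);

$lines = '';
while ($row = mysql_fetch_assoc($result)) {
    // The deflector map wants one "referrer -" pair per line.
    $lines .= $row['referer'] . " -\n";
}

// Overwrite the map file that the RewriteMap directive points at.
file_put_contents(
    '/home/My_User_Name/public_html/tests/rewrite_map_tests/spammerlist.map',
    $lines
);
?>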
-
I just noticed that there's a "www" folder on my site, that seems to have the exact same stuff in it as my public_html folder. Is there any difference between these two? What purpose does the www folder serve?
-
Cool, that seems to work! For anyone else in this position, you need to go to http://cvs.php.net/pear/Archive_Zip/Zip.php and download Zip.php. Then, include Zip.php in your script, and you can use the class. Thanks, Raul!
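In case it helps, here's roughly what using the class looks like, based on the PEAR documentation (double-check the method names and options against the manual; this is a sketch, not tested code):

><?php
// Sketch: unpack an uploaded zip with the PEAR Archive_Zip class.
// Assumes Zip.php was downloaded next to this script, per the URL above.
require_once 'Zip.php';

$zip = new Archive_Zip('upload.zip');

// List the archive's contents (an array of per-file property arrays).
print_r($zip->listContent());

// Extract everything under ./unpacked/ - 'add_path' prefixes the
// extraction target, according to the PEAR manual.
$result = $zip->extract(array('add_path' => 'unpacked/'));
?>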
-
I'd like to accept and read zip file uploads at my site. Preferably, it would be great to be able to use the functions in the ZZipLib php library, described at http://us4.php.net/manual/en/ref.zip.php However, I tested it out, and they're definitely not installed with php here at TCH. Any chance of getting this set up, or does anyone know of a way that I can configure the compile-time directives for php running on my site? (I'd be really surprised if you give your customers access to compile-time directives, as some of them can be highly unsafe, but maybe there's some other workaround?)

Alternatively, M-Zip could provide the same sort of functionality. However, then I'd need to know what command-line function could be used in order to unpack zip files. More info available at http://esurfers.com/m-zip/

The other option is really not an option at all, and that's to study the PKWare Zip whitepapers at http://www.pkware.com/products/enterprise/...rs/appnote.html and then read it as a raw binary file and do all the interpretation in the script file itself. (Talk about re-inventing the wheel!)

Thanks for any help you guys can provide
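For context, here's the sort of thing those functions would make possible if the zip extension were compiled in - a sketch following the patterns on the manual page above (upload.zip is just a placeholder name):

><?php
// Sketch: read an uploaded archive with the zip extension functions
// documented at php.net/ref.zip.php. Only works if the extension is
// compiled in, which it currently isn't here.
$zip = zip_open('upload.zip');

if (is_resource($zip)) {
    while ($entry = zip_read($zip)) {
        echo 'Name: ' . zip_entry_name($entry) . "\n";
        echo 'Size: ' . zip_entry_filesize($entry) . "\n";

        if (zip_entry_open($zip, $entry, 'r')) {
            // Read the whole uncompressed entry into a string.
            $data = zip_entry_read($entry, zip_entry_filesize($entry));
            zip_entry_close($entry);
            // ... do something with $data ...
        }
    }
    zip_close($zip);
}
?>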
-
Mysql Failure (help!)
IsaacSchlueter replied to IsaacSchlueter's topic in CPanel and Site Maintenance
WOW, that was fast! Thumbs Up
-
Mysql Failure (help!)
IsaacSchlueter replied to IsaacSchlueter's topic in CPanel and Site Maintenance
Ok, this must be something system-level then. I 777'ed my tmp folder - still got the error. Tried manually creating the file, and 777ing that, too. Same error. Then I made the folder back to 700, and deleted the file.
-
http://isaacschlueter.com

>SQL-query:
SELECT DISTINCT ID, post_author, post_issue_date, post_mod_date, post_status, post_locale, post_content, post_title, post_urltitle, post_url, post_category, post_autobr, post_flags, post_wordcount, post_comments, post_renderers, post_karma
FROM ( evo_posts INNER JOIN evo_postcats ON ID = postcat_post_ID ) INNER JOIN evo_categories ON postcat_cat_ID = cat_ID
WHERE 1 AND ( ( post_status = 'private' AND post_author = 1 ) OR post_status IN ( 'published', 'protected' ) ) AND post_issue_date <= '2004-06-21 14:18:26'
ORDER BY post_issue_date DESC
LIMIT 8

MySQL said: #1 - Can't create/write to file '/tmp/#sql_62f1_0.MYI' (Errcode: 30)

The query is valid - I get the same results in phpMyAdmin. Looks like anything and everything MySQL just stopped working. What's up?
-
Aha. Mod_rewrite strikes again. They call it the Swiss Army knife of URL manipulation, but it seems more like a blowtorch sometimes. I'm using mod_rewrite to create cleaner URLs on http://isaacschlueter.com. Well, the settings in the public_html folder are propagating to the webreports subfolder, and causing problems. I'll have to review the documentation on that stuff to have the same functionality in the subdomain. I think that I can get around it by declaring a RewriteBase in the .htaccess file in the subfolder. For now, I put this in my public_html/webreports/.htaccess file and it fixed it:
-
Thanks, Rob, but that doesn't seem to work either. I deleted the folder, and removed the subdomain. Then, I created the subdomain, and it did indeed create the folder once again. But, the same things happen when I try to test it out. Hmm...
-
I read through the tutorial, and I'm pretty sure that I followed the steps, but I'm having some problems setting up http://webreports.isaacschlueter.com.

I created a folder at [root]/public_html/webreports. Then, set up the domain in CPanel's "Manage Subdomains" section. I even created a new ftp user, and verified that he could only gain access to that folder. I uploaded a file called "test.html" to [root]/public_html/webreports

When I go to http://webreports.isaacschlueter.com, I see the directory index, and my "test.html" is showing there. But, if I click on the link to the file, I get:

If I go to http://isaacschlueter.com/webreports/test.html then I see

which is correct. Any ideas?
-
Another way to do this, instead of allowing access to the folders, is to only rewrite certain files...

>RewriteEngine On
# Extra Super Duper Clean URLs
# list all of the file names to rewrite separated by pipes.
RewriteCond %{REQUEST_URI} ^/(file1|file2|file3)(.*)$
RewriteRule (^[^\./]+)$ $1\.php
RewriteCond %{REQUEST_URI} ^/(file1|file2|file3)(.*)$
RewriteRule (^[^\./]+)/([^\.]*)(.html?)?$ $1\.php/$2$3
-
I had accidentally dropped a .htaccess file in my htsrv folder, and it was caught in an infinite rewrite! So, everything's working now. Anyhow, this is a much cleaner way to do it.

>RewriteEngine On
# Extra Super Duper Clean URLs
# list all of the valid directory names separated by pipes.
RewriteCond %{REQUEST_URI} !^/(folder1|folder2|folder3)(.*)$
RewriteRule (^[^\./]+)$ $1\.php
RewriteCond %{REQUEST_URI} !^/(folder1|folder2|folder3)(.*)$
RewriteRule (^[^\./]+)/([^\.]*)(.html?)?$ $1\.php/$2$3

I'm looking into using the -d flag in the RewriteCond to check if it's a valid directory, since hard-coding the dir names is a bit ugly. Of course, it's still not nearly as ugly as rewriting, then re-rewriting back to the way that it was! And, in a way, it's more secure, since I specifically grant access this way to certain dirs, and all others get a 404 for the "missing" php file.
-
First of all, I've only been a customer for a few days, and WOW! TCH is amazing! I'm truly in love. Now on to the problem

I have a site that is driven by a bunch of php scripts. I use the extra path info directive to avoid having ?s and &s in the URL. I also prefer to strip the ".php" from my URLs, so that I have something like this: http://www.isaacschlueter.com/blog/2004/05 to show all the posts in the "blog" category, from May, 2004. Of course, the actual file is "blog.php".

I was doing this on my previous webhost by putting the following in my .htaccess file. (It's a little ugly, but it's the only way that I could get it to work.)

>RewriteEngine On
# If it's just /blah, change it to /blah.php
RewriteRule (^[^\./]+)$ $1\.php
# If it's /blah/1/2/3, change to /blah.php/1/2/3
RewriteRule (^[^\./]+)/([^\.]*)(.html?)?$ $1\.php/$2$3
# Cleanup the Unwanted RR Tracks, for cases when it's a valid directory
# If it's /validdirectory.php/1/2/3, change to /validdirectory/1/2/3
RewriteRule ^(xmlsrv)(\.php)/(.*) $1/$3
RewriteRule ^(htsrv)(\.php)/(.*) $1/$3
RewriteRule ^(admin)(\.php)/(.*) $1/$3
RewriteRule ^(skins)(\.php)/(.*) $1/$3
RewriteRule ^(ari)(\.php)/(.*) $1/$3
RewriteRule ^(httest)(\.php)/(.*) $1/$3

When I do this, it works fine for the URLs. And, for some reason, I can go to http://www.isaacschlueter.com/admin/whatever.php without any problem. But, http://www.isaacschlueter.com/htsrv/login.php gives an Error 500: Internal Server Error.

I'm sure that it's some sort of problem with the RewriteRule stuff, because when I remove them, then everything works (but I have to put .php on my urls, of course). For the time being, I've removed them all, but if you'd like to see what happens, just post here, and I'll upload the .htaccess again.

Any ideas? Anyone know of a cleaner way to do what I'm trying to do? If I manage to figure it out, I'll post here to help any lost souls in this predicament in the future

Thanks
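P.S. For anyone unfamiliar with the extra-path-info trick, here's a simplified sketch of how a script like blog.php can pick apart the extra segments (illustrative only, not my actual code):

><?php
// Simplified sketch of reading extra path info (not the real blog.php).
// For a request like /blog/2004/05, the rewrite rules above turn it into
// /blog.php/2004/05, and Apache hands "/2004/05" to the script as PATH_INFO.
$pathInfo = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';

// Split the extra path into segments, e.g. array('2004', '05').
$segments = array_values(array_filter(explode('/', $pathInfo), 'strlen'));

$year  = isset($segments[0]) ? (int) $segments[0] : 0;
$month = isset($segments[1]) ? (int) $segments[1] : 0;

echo "Showing posts from $year-$month\n";
?>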