Everything posted by TheCanadian

  1. You did it! The errors have stopped! Thanks a bunch Balakrishnan!!
  2. Thanks for the reply. I just tried it on one of my sites and it is still happening:

     [07-Oct-2017 12:16:18 America/Detroit] PHP Warning: Module 'imagick' already loaded in Unknown on line 0

     My server is OSTEGO.
  3. This is the problem I've encountered, and I'm wondering if anyone else here is seeing the same thing. I program PHP/MySQL sites, and since my TCH server was migrated to the new cPanel & PHP versions in February, I have been using the cPanel MultiPHP INI Editor (as I was told this was the new way of doing it after all my sites broke) to set my PHP version and configure the specific PHP variables I need for my sites - like file_uploads, upload_max_filesize and a custom error_log so I can have one central file stored outside public_html to collect any scripting errors.

     All was fine until last night around 9:20pm. Suddenly, in every sub-folder containing PHP scripts on every site I have on the server, I'm getting a generated "error_log" file with an error logged on every single script execution:

     [03-Oct-2017 21:21:38 America/Detroit] PHP Warning: Module 'imagick' already loaded in Unknown on line 0

     I contacted support about this and was able to resolve it on one site, but the solution isn't ideal, as it means changing the setup of every site I have. They suggested bypassing the site's cPanel MultiPHP INI Editor settings by modifying my .htaccess file to reference a custom php.ini file using suPHP_ConfigPath, then uploading a full php.ini file with the modifications I need. Yes, this seems to solve the problem, but now I have to convert all my sites back to the method that was in place *before* the server upgrade in February.

     Yet everything was peachy until last night - so I'm thinking this is related to a cPanel software update and not a coding change on my part, since it is occurring even on sites I haven't edited in weeks. Is this happening to anyone else? Does anyone know why, or how to stop the deluge of error_log files popping up in every sub-folder? I'm just trying to figure out the best approach before I spend the time converting all my sites back to custom php.ini files.
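The suPHP workaround that support described can be sketched in .htaccess like this; the directory path is a hypothetical placeholder, not taken from the post:

```apacheconf
# Point suPHP at a per-site php.ini instead of the MultiPHP-managed one.
# /home/myuser/php is a hypothetical directory holding a full php.ini.
<IfModule mod_suphp.c>
    suPHP_ConfigPath /home/myuser/php
</IfModule>
```

Note that suPHP reads the whole php.ini from that directory, so the file must contain every setting the site needs, not just the overrides.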
  4. When the switch occurs, what PHP version will existing accounts be defaulted to - 5.4, 5.5, 5.6 or 7.0? If 7.0, is there something I can do to set things before the switch to ensure a prior version is used? I ask because I know for a fact many of the scripts I use are not PHP 7.0 compatible. They are 5.6 compatible though.
  5. I just subscribed to the service on a couple of accounts I manage. It is catching spam (and some legitimate email), but there is still a lot of junk mail coming through. In http://nospam.totalchoicehosting.com/ I can see how to Whitelist senders or move an incorrectly identified message back to the inbox, but how would I tell it that certain messages it let through to my inbox are actually spam?
  6. Relating to the same issue, but from the perspective of the server's root certificate store, I would just like to verify if the following (taken from PayPal's recent letters of compliance checks) would be true for all TCH servers:
  7. I have just noticed something really odd start happening on pretty much all my sites. The folder modified dates are being changed whenever a visitor merely accesses the folder - yet nothing in the folder is being changed! I have several scripts that rely on file/folder modified dates to determine when portions of the site were last updated, and now they think everything is in a constant state of update because the dates change whenever a web visitor simply accesses the index file in the folder! This seems to be happening exclusively with folders that have a PHP index file.

     I tried creating the two folders and files detailed below as a test:

     /test/index.phtml

     <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
     <html xmlns="http://www.w3.org/1999/xhtml" >
     <?PHP
     print "<head><title>TEST</title>\n";
     print "</head><body>\n";
     print "<div id=\"Content\">\n";
     print "TEST";
     print "</div>\n";
     print "</body>\n";
     ?>
     </html>

     /test2/index.html

     <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
     <html xmlns="http://www.w3.org/1999/xhtml" >
     <head><title>TEST</title></head>
     <body>
     <div id="Content">TEST</div>
     </body>
     </html>

     Both files output essentially the same thing, yet when I access the /test/ folder in a web browser, the modified date of the /test/ folder is changed. When I access the /test2/ folder, the folder date is not changed. Why is PHP modifying the folder date when nothing is being written into the folder?
  8. Something else I ran into with this upgrade is that the PHP virtual() command no longer works, since PHP is now being run in CGI mode instead of Apache module mode. If you were using it to include text, HTML or PHP files inside your PHP script, try using require() or include() instead. If you were wanting to execute another script (like a Perl script), then you'll have to use passthru().
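A minimal sketch of the two substitutions described above; the fragment file is created on the fly for illustration, and 'echo BODY' stands in for an external script such as a Perl program:

```php
<?php
// virtual('header.php') is unavailable under CGI/suPHP; include()
// pulls the fragment into the current script instead.
$fragment = sys_get_temp_dir() . '/header_demo.php';   // hypothetical file
file_put_contents($fragment, '<?php echo "HEADER"; ?>');

ob_start();
include $fragment;              // replaces virtual() for local fragments
$header = ob_get_clean();

// For an external program, passthru() streams its output to the browser.
ob_start();
passthru('echo BODY');          // replaces virtual() for external scripts
$body = ob_get_clean();

echo $header, ' ', trim($body), "\n";
```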
  9. Question... is it possible that before this change, files uploaded/created by scripts running on the server as user "nobody" were not being counted in the web space quota shown in cPanel and WHM? I ask because one of my sites bloated by 100 megs overnight, and I can't find anything on it that has significantly changed since the other day.
  10. As far as I know, this is how to do it... Look in your Apache folder for httpd.conf (it might be in the "conf" subdirectory). Open the file in a text editor and search for the line:

     Listen 80

     Change the port to whatever you like, then restart Apache. Or, if you want to make the server accessible from the Internet on a non-standard port and you have a router, just port-map whatever external port you like to the internal port 80 on the server. That way you can access it normally on the local network, and only when accessing from outside do you have to use the non-standard port.

     Oh - and regarding the initial error you were getting, it looks like just a warning and not an actual error. Chances are your PHP install is showing verbose errors to the browser, whereas the install here wouldn't do that by default (it would get messy really quickly!). I think PHP is warning you that, because you don't check whether the variable exists before assigning it, the variables are empty when you run the script directly. If you actually posted to the script, it wouldn't say that. Try it. It's good to have that enabled when testing scripts, to catch all possible glitches and warnings before putting them online.
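The "check if the variable exists" advice can be sketched like this; the form field name is a hypothetical example:

```php
<?php
// Without the isset() guard, reading $_POST['name'] on a plain GET
// request raises exactly the kind of undefined-variable notice
// described above. With the guard, the script is quiet either way.
$name = isset($_POST['name']) ? $_POST['name'] : '';
echo $name === '' ? 'empty' : $name;
```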
  11. Oh!! I'm getting somewhere! I checked, and the server is using ProFTPD. HELP says it supports SITE UTIME, and the ProFTPD docs say this is used to modify file dates. So I looked up how to use SITE UTIME and found: http://www.proftpd.org/docs/contrib/mod_si...html#SITE_UTIME

     So I tried:

     SITE UTIME 200711010000 "November 2007.pdf"

     And I got a response of "500 UTC Only". So I'm getting close - just missing something, I think.

     UPDATE: Got it!!! Man, this took some heavy Googling... I found that some implementations of SITE UTIME work totally differently, letting you change the creation date, modification date and access date. So I tried changing the command:

     SITE UTIME November 2007.pdf 20071101000000 20080211000000 20080211000000 UTC

     And holy crap, it worked! I also had to remove the quotes, because it didn't work with them (they weren't needed).
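For anyone scripting this later, the YmdHis timestamp format that worked above can be generated with PHP's date(); a small sketch building the same three-timestamp command shape (using the filename from the post):

```php
<?php
// Build the SITE UTIME form that worked:
//   SITE UTIME <file> <atime> <mtime> <ctime> UTC
$stamp = date('YmdHis', mktime(0, 0, 0, 11, 1, 2007));   // 1 Nov 2007
$cmd   = "SITE UTIME November 2007.pdf $stamp $stamp $stamp UTC";
echo $cmd, "\n";
```

The string could then be sent as a raw command by an FTP client, or via PHP's ftp_raw() if scripting the whole transfer.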
  12. I tried it by using PHP to run the command:

     exec("touch -t 200711010000.00 'November 2007.pdf'")

     but it didn't seem to do anything. I think the issue of the PHP script running as "nobody" is the problem again. My account doesn't have SSH access, so I can't log in to run the command directly. I'd pretty much need a way to run a command under my user instead of the "nobody" user.

     OJB: SmartFTP must be using an FTP command to do that. Do you know which command it is? I can run custom commands with my FTP client. I tried:

     MDTM 20071101000000 "November 2007.pdf"

     but it gave me an error: "Can't check for file existence", so I assume the TCH FTP server I'm on doesn't support using MDTM to change the file date. I can't find another FTP command that is designed to do that.
  13. The documents are named by date, but not a numerical date (eg. "November 2007 Gazette.pdf" or "Summer 2008 Meeting.pdf"), so there is no way of showing them in the correct order. Changing all the filenames to something like "2007-11 Gazette.pdf" is not an option. Is there any way to change the date of an uploaded file? Through FTP maybe? Perhaps some way to run a Unix touch command under my user ID instead of the "nobody" user through cPanel somewhere?

     UPDATE: Found a simple solution! Since the problem was that the file was owned by my user and the PHP touch command executed from the web runs as user "nobody", I tried uploading the file through a PHP script instead of FTP. So now my "nobody" script has permission to modify my "nobody" file. Sweet! That will work!
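Since the script and the uploaded file now share an owner, PHP's built-in touch() can backdate the file directly, with no exec() needed; a small sketch using a temporary file as a stand-in for the PDF:

```php
<?php
// touch($file, $mtime) sets the modification time when the script's
// user owns the file - the situation described above.
$file = tempnam(sys_get_temp_dir(), 'gazette');   // stand-in for the PDF
$when = mktime(0, 0, 0, 11, 1, 2007);             // 1 Nov 2007
touch($file, $when);
clearstatcache();                                  // drop the cached stat
echo date('Y-m-d', filemtime($file)), "\n";
```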
  14. I have a script which loads a list of PDF newsletters in a directory and displays them according to the date they were originally uploaded. Unfortunately, on occasion I need to fix a goofed-up file and re-upload it, which updates the modified date and goofs up the order of the newsletters. I need to be able to change the date on the file to put it back in the right order. The files are uploaded by me via FTP, so they are owned by my user and not "nobody". I have tried setting the chmod to 666 and using the PHP touch command, but it tells me I'm not allowed to perform that function. How can I do this?
  15. Because of the double-slash problem that showed up, I'm guessing it's just piggy-backing on the default rule instead of overriding it, so it's redirecting to non-www, then redirecting back to www. That causes the password-protected area to spit out a 401 error. I *think* it's because the switch to non-www happens first, then the authentication is processed, then the switch back to www happens. I'm not sure though - that's just a guess based on what I've found about the order of events in the htaccess files. So far this is the only code I've been able to get to work reliably:

     RewriteEngine On
     RewriteCond %{REQUEST_FILENAME} -d
     RewriteCond %{REQUEST_URI} !/$
     RewriteRule .* http://%{HTTP_HOST}%{REQUEST_URI}/ [L,R=301]

     It only works if I put it in the htaccess within the password-protected folders, and after the authentication section of the htaccess file. I had to change the !-f to -d because it was trying to put a / on the end of requests for missing files too. So it seems to be working, even though I'm still confused as to why the rewrite rule was initially spitting out the server path instead of the URL.
  16. Didn't work. Now in my password protected folder, I get a 401 error unless I explicitly enter the trailing / in my URL, so the initial redirect to the non-www then back again must be clobbering my credentials somehow. Is there a way of simply disabling the default redirect to the non-www by way of some coding in my htaccess?
  17. I just tried that, but it does not quite work. I get redirected to the URL with 2 slashes now. So I'm guessing it's combining the server's default redirect to the non-www domain with a slash to the domain with www's and another slash? I'm going to try removing the trailing / after the $1 and see how that works...
  18. Had a thought and tried this, and so far it seems to be working:

     RewriteEngine On
     RewriteCond %{REQUEST_FILENAME} !-f
     RewriteCond %{REQUEST_URI} !/$
     RewriteRule .* http://%{HTTP_HOST}%{REQUEST_URI}/ [L,R=301]

     But is there anything I'm missing, or am I not doing this the proper way? Any opinions from anyone who knows htaccess better than I do would be appreciated. Thanks!
  19. Hi all! I've run into a problem, and the solutions I found online are not working as expected. I have several password-protected folders (using htaccess), and numerous people have complained that they can't log in. The reason, I have found, is that they are entering the URL of the directory without a trailing slash, and the server is automatically redirecting them to a URL with a slash - but the URL lacks the "www", so it's treated as a different URL, and thus they get the password prompt again and assume their login is wrong.

     So I figured I should just be able to write my own redirect that rewrites directory requests without slashes to a URL with a slash, but using the same domain they entered (with or without www's). Seems simple... so I thought. Here's what I have for htaccess code:

     RewriteEngine On
     RewriteCond %{REQUEST_FILENAME} !-f
     RewriteCond %{REQUEST_URI} !/$
     RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1/ [L,R=301]

     Basically that seems to be standard code I've found multiple people using, except that I'm not hard-coding the host name with or without www, but just passing back the %{HTTP_HOST} they entered. What I get is totally unexpected: in place of $1, I actually get back the entire local server path to the folder they entered. Example:

     Entering: ht tp://www.mydomain. com/members
     Spits back: ht tp://www.mydomain. com//home/myuser/public_html/members/

     Any idea what I'm doing wrong? I basically want a way of allowing people to type in a URL like this:

     ht tp://www.mydomain. com/members

     and be able to enter their login, then silently be redirected to the same URL they entered with the trailing slash:

     ht tp://www.mydomain. com/members/

     and NOT be redirected to:

     ht tp://mydomain. com/members/

     PS - The spaces are intentional, so I could represent a fictional URL without it being blotted out by the forum.
  20. Hold the phone, I just got it to work! My mysqldump command had flaws in it: I had spaces where there shouldn't have been spaces (eg. I had -u[space]$dbusername instead of -u$dbusername). So it works! (My rewrite using FTP, that is - not SFTP.) And the Zip password works too. Not the greatest security, but better than nothing. I'm still open to anyone's suggestions for encrypting this file (either directly or just during transit) while still making it easy to extract on a Windows machine. Thanks for the help Bruce! I used dbsender.php as a reference, and that revealed my flaw.
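For anyone hitting the same empty-dump symptom: with a space after -p, mysqldump treats the password as the database name and prompts interactively (which fails under system()). The fixed command shape, with placeholder credentials, can be built like this:

```php
<?php
// The password must be glued to the flag: -p$dbpassword, not -p $dbpassword.
// Values below are placeholders for illustration only.
$dbhost     = 'localhost';
$dbusername = 'user';
$dbpassword = 'pass';
$dbname     = 'mydb';
$backupFile = 'mydb.gz';
$command = "mysqldump --opt -h$dbhost -u$dbusername -p$dbpassword "
         . "$dbname | gzip > $backupFile";
echo $command, "\n";
```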
  21. Okay, I think my issue is that I'm trying to test this script from the web browser, but it mustn't have the permissions to run these commands from there. I don't want to set up a cron job for it until I'm sure it's working properly, but it looks like it might only work if run on the server directly. So how do I test it?
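One generic way to test from the browser before wiring up cron (not TCH-specific): turn on error display and capture each external command's exit status, which system() alone hides. A sketch, with 'echo test-run' standing in for the real command:

```php
<?php
// Surface anything PHP itself complains about while testing.
error_reporting(E_ALL);
ini_set('display_errors', '1');

// exec() fills in the command's output lines and exit status;
// a non-zero status means the command itself failed.
exec('echo test-run 2>&1', $output, $status);
echo "exit status: $status, output: {$output[0]}\n";
```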
  22. Ah, okay. Thought I could try something fancy. For security purposes, any suggestions for encrypting or password-protecting the resulting file before it gets transferred? Would this work:

     system('zip -P $password $zip_file $mysql_file');

     Or is there a better solution? The resulting files would need to be extractable on a Windows machine.
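One caveat with the snippet as quoted: in PHP, single-quoted strings do not interpolate variables, so the literal text $password would be passed to zip. A double-quoted version using escapeshellarg() (placeholder values) would look like:

```php
<?php
// escapeshellarg() keeps passwords containing spaces or quotes intact
// and guards against shell injection. Values are placeholders.
$password   = 'secret';
$zip_file   = 'backup.zip';
$mysql_file = 'dump.sql';
$cmd = 'zip -P ' . escapeshellarg($password) . ' '
     . escapeshellarg($zip_file) . ' ' . escapeshellarg($mysql_file);
echo $cmd, "\n";
```

Note that classic zip -P encryption is weak and mainly deters casual inspection, which matches the "better than nothing" assessment later in the thread.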
  23. I know it's blocked for incoming, but is it also blocked for outgoing? Also, any idea why this is giving me an empty file:

     $command = "mysqldump --opt -h $dbhost -u $dbusername -p $dbpassword $dbname | gzip > $backupFile";
     system($command);
  24. Hello! I'm trying to write a script which will eventually run as a cron job to back up a MySQL database and SFTP it directly to my machine at home. Here's the script so far:

     <?PHP
     $dbhost = "localhost";
     $dbusername = "*********";
     $dbpassword = "*******";
     $dbname = "*******";
     $sftphost = "**********";
     $sftpport = *****;
     $sftpuser = "*******";
     $sftppass = "*******";
     $errorlog = "mysqlbackup-errors.txt";
     $backuplog = "mysqlbackup-last.txt";
     $interval = 7; // days

     $last = file_get_contents($backuplog);
     $today = date("Ymd");
     if ($last > date("Ymd", mktime(0, 0, 0, date("m"), date("d") - $interval, date("Y")))) {
         exit;
     } else {
         $logfile = fopen($backuplog, "w");
         fwrite($logfile, $today);
         fclose($logfile);
     }

     $backupFile = $dbname . date("Y-m-d-H-i-s") . '.gz';
     $command = "mysqldump --opt -h $dbhost -u $dbusername -p $dbpassword $dbname | gzip > $backupFile";
     system($command);

     try {
         $sftp = new SFTPConnection($sftphost, $sftpport);
         $sftp->login($sftpuser, $sftppass);
         $sftp->uploadFile($backupFile, $backupFile);
     } catch (Exception $e) {
         $errfile = fopen($errorlog, "a");
         fwrite($errfile, "[" . date("n/j/y-g:iA") . "] - " . $e->getMessage() . "\n");
         fclose($errfile);
     }

     unlink($backupFile);
     exit;

     class SFTPConnection
     {
         private $connection;
         private $sftp;

         public function __construct($host, $port = 22)
         {
             $this->connection = @ssh2_connect($host, $port);
             if (!$this->connection)
                 throw new Exception("Could not connect to $host on port $port.");
         }

         public function login($username, $password)
         {
             if (!@ssh2_auth_password($this->connection, $username, $password))
                 throw new Exception("Could not authenticate with username $username " .
                                     "and password $password.");
             $this->sftp = @ssh2_sftp($this->connection);
             if (!$this->sftp)
                 throw new Exception("Could not initialize SFTP subsystem.");
         }

         public function uploadFile($local_file, $remote_file)
         {
             $sftp = $this->sftp;
             $stream = @fopen("ssh2.sftp://$sftp$remote_file", 'w');
             if (!$stream)
                 throw new Exception("Could not open file: $remote_file");
             $data_to_send = @file_get_contents($local_file);
             if ($data_to_send === false)
                 throw new Exception("Could not open local file: $local_file.");
             if (@fwrite($stream, $data_to_send) === false)
                 throw new Exception("Could not send data from file: $local_file.");
             @fclose($stream);
         }
     }
     ?>

     Nothing seems to be working right, and I'm not getting any errors in my log file to tell me what's happening either (and it is chmod 666). The .gz file is being created, but it's empty. No connection to my SFTP server appears to be happening on my computer, and I have the port open and routed properly on my firewall. Also, the .gz file does not delete itself after it's done like it should. I have the database login info correct (the same info I use in my PHP scripts to interact with the database), and I have my server configured using Core FTP's free Mini SFTP server (http://www.coreftp.com/server/) with a dynamic DNS name and the appropriate port routed (I'm using a non-standard port for security purposes).

     Does anyone see anything wrong with the code? Does 'mysqldump' not work from a system() command the way I'm using it? Is my SFTP outgoing connection not allowed from the server? Are there upper limitations to SFTP ports (I'm using a port over 20000)? I'd like to get this script working because I prefer to use SFTP instead of FTP or e-mail (neither is very secure, plus it's annoying to have tons of backup files clutter my inbox). Thanks for any help!
  25. Hmm... that time idea is an excellent one! I think I'm going to implement that too. In the last couple of months I've noticed that spambots seem to be posting from a cached form (like you mentioned), so I added a server-side cookie to my form that is created when the form page is loaded, contains the IP and browser type, and is verified by the CGI before it does anything. If the cookie has timed out or the data doesn't match, then it's obviously a cached form and not loaded directly from my page, so it isn't processed. That has worked wonders so far by eliminating 99% of the random garbage messages, and it even allowed me to remove many of the "banwords" from my script.
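The timestamp idea can be sketched as a signed render-time token embedded in the form; on submit, posts that arrive faster than a human could type are rejected. The secret and age thresholds are hypothetical:

```php
<?php
// Embed make_token() output in a hidden form field at render time;
// on submit, token_ok() rejects forged tokens and too-fast posts.
function make_token($secret) {
    $t = time();
    return $t . ':' . hash_hmac('sha256', (string)$t, $secret);
}

function token_ok($token, $secret, $minAge = 5, $maxAge = 3600) {
    list($t, $sig) = explode(':', $token, 2);
    if (!hash_equals(hash_hmac('sha256', $t, $secret), $sig)) {
        return false;                       // forged or tampered token
    }
    $age = time() - (int)$t;
    return $age >= $minAge && $age <= $maxAge;  // too fast or too stale
}

$token = make_token('demo-secret');
var_dump(token_ok($token, 'demo-secret'));  // instant submit: bool(false)
```

The HMAC keeps the timestamp from being forged without also storing it server-side, which suits a CGI that wants to stay stateless.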