wkg

Everything posted by wkg

  1. Thanks, Billy. I have a couple of clients with really old scripts that even had to be patched when we moved to 5.4. Setting the version per domain will be a nice feature. I don't think I'll need the per-directory option.
  2. I'm currently at 5.4 and request, in the strongest possible terms, that this not change after the switch. I am willing to explore working my way to higher versions, but at my own step-by-step pace to test my scripts at each level. I assume this is what the MultiPHP Manager will allow?
  3. Was MySQL upgraded to 5.1 at the same time as PHP was upgraded? Many, but not all, of my tables have been renamed with #mysql50# prepended. Can you please point me to the what, why, how, etc. My scripts are unbroken, but sorting of table names is messed up.
  4. That worked very well, thanks.
  5. I changed to these ports and checked the SSL box for each. I got a warning message: "The server you are connected to is using a security certificate that could not be verified. The certificate's CN name does not match the passed value." Is this because we are using a TCH (generic) SSL cert for all domains? I seem to remember this from a discussion some time ago. If this is cool, hopefully I'll only have to click OK once (or once per session). Otherwise it seems to work fine. Thanks for your help.
  6. Is it possible to set up standard email (POP/SMTP) using SSL? My concern is using public WiFi while on the road. Gmail now uses this method. I tried to find something about this in various forums, but had no luck. Am I barking up the wrong tree?
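For reference, the standard IANA port assignments (not specific to any one host) are POP3 over SSL on 995, SMTP over SSL on 465, and IMAP over SSL on 993. A quick way to check whether a server actually answers SSL on a mail port is openssl's s_client; the hostname below is a placeholder:

```shell
# Standard SSL mail ports (IANA assignments, not host-specific):
printf 'POP3S\t995\nSMTPS\t465\nIMAPS\t993\n'

# Hypothetical probe: confirm an SSL handshake completes on the POP3S port.
# Replace mail.example.com with the real mail server before running.
# echo QUIT | openssl s_client -connect mail.example.com:995 -quiet
```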
  7. Yesterday, I did try adding the ampersand and also combinations of redirecting to /dev/null, but that wasn't the problem. The problem is that the script wasn't running at all. I wasn't concerned at that moment about backgrounding; I just wanted to test exec() or shell_exec() to fork a process. I began to think there might have been something weird going on with my backup script - it runs fine as cron, but maybe there is something in there that is not compatible as a command script (I didn't write this script). So I wrote the simplest, tiniest script to send a test email to me. And it works both with and without the ampersand and output redirection. (The ampersand and output redirection will be important in the actual application, where the script will take some time to run.) In case someone is following along, all of these worked:
>shell_exec("/usr/bin/php /home/myaccount/public_html/cgi-bin/send-mail-fork.php 2> /dev/null &");
>shell_exec("/usr/bin/php /home/wscp/public_html/cgi-bin/send-mail-fork.php");
>exec("/usr/bin/php /home/myaccount/public_html/cgi-bin/send-mail-fork.php 2> /dev/null &");
Thanks for your time. Now I'll have to see if I can do something a little more useful.
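The shape of those working commands is the general fire-and-forget pattern: redirect output, then background with `&` so the caller returns immediately. A minimal shell sketch, using `sleep` as a stand-in for the long-running PHP mailer:

```shell
# Background a slow job and show the caller wasn't blocked.
# 'sleep 2' stands in for the PHP script; the redirection plus trailing '&'
# is exactly what lets a shell_exec()/exec() call return at once.
start=$(date +%s)
sleep 2 > /dev/null 2>&1 &
end=$(date +%s)
echo "launched in $((end - start))s"   # well under the 2s the job takes
wait   # demo only; a web script would NOT wait for the child
```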
  8. I'm using that script simply to test if I can get a script to run in the background (since I know the backup script runs.) My goal is to modify a mailing list script I have. TCH has limits on the number of emails per time period. It is easy enough to put the script to sleep for, say, five minutes between two batches, but then the page appears to hang. If I could fork the send routine off to run in the background, then the page could finish loading.
  9. Anyone successfully run a php script in the background? I've tried various configurations using exec() and shell_exec() without success. I can run a command, such as 'ls', but cannot seem to get the trick of running a php script. I've tried to mimic a cron job that runs fine. This is the cron statement:
>php -q /home/myaccount/public_html/cgi-bin/backup_dbs.php
I've tried things such as:
>shell_exec("php -q /home/myaccount/public_html/cgi-bin/backup_dbs.php");
>shell_exec("/usr/bin/php /home/myaccount/public_html/cgi-bin/backup_dbs.php");
>shell_exec("/usr/bin/php ./home/myaccount/public_html/cgi-bin/backup_dbs.php");
or variations such as:
>exec("wget http://myaccount.com/cgi-bin/backup_dbs.php");
Permission is set to 755 and, like I said, the script runs fine as a cron job.
  10. I just tried a couple of my domains and it is still working fine. You might want to submit a help ticket to see if something changed in the configuration of your server.
  11. Brilliant! Now, why didn't I think of that?
  12. Thanks for chipping in btrfld. There were maybe two things going on. One seemed to clear up while tech support was looking at it (or I was doing something wrong, so maybe that was a non-issue.) The problem on the new site I was setting up seemed to be due to my using the dollar sign in the password. It works fine to use '$' for password protected directories, but not for this script. I'm guessing this remote stats access script, which is written in PHP, chokes on the dollar sign. Yeah, I know $ denotes a variable in PHP, but I would have thought the authentication process was being handled by the same mechanism as protected directories, but apparently not. I changed the $ to S and it worked fine.
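One plausible mechanism (an assumption, since the stats script's source isn't shown here): somewhere the password lands in a context where `$` is special, so part of it gets swallowed by variable interpolation. The same hazard exists in shell, which makes it easy to demonstrate:

```shell
# In double quotes, '$' triggers expansion: '$$' becomes the shell's PID,
# mangling the intended literal 'pa$$word'. Single quotes keep it literal.
pw_double="pa$$word"   # NOT the string that was typed
pw_single='pa$$word'   # exactly the string that was typed
echo "$pw_single"
```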
  13. OK, I've changed it. If this is spelled out anywhere, it should be in bigger letters. Thanks.
  14. Hmmm, I guess I thought I needed to use my cPanel password to log in to submit a ticket. If I change my password at the Support Center, that will just be local to tech support and not my account? These things are not always obvious to the users, even experienced users.
  15. Don't know if this is the right forum, but it is a security related question/issue. Whenever I submit a ticket to the Help Desk, the confirmation email I receive in return contains my cPanel password in plain text. I understand entering my password in the help site to assist the tech. I realize the vulnerability is extremely small, but it is unnecessary to email it back to me, so why do it at all?
  16. Has something changed? I was trying to set up AWStats access for a new client and it would not accept the username/password. I checked that I had things set correctly and everything was OK. I then thought to check a client site where I'd previously set this up and it had been working... and it is not working. This is the first time I've checked this since moving to the new server... and maybe the first time since the migration to PHP5; I can't remember that for a fact.
  17. True, but the beauty of a script such as Dagon's is that it automatically searches for all your databases (you can set exceptions, though I haven't tried that) and it won't need reconfiguration if you add a new database. With dbsender you'd need to add new files & directories, configure files, etc. I'd get busy with the db and forget to do the housekeeping
  18. The good folks at TCH keep reminding us we are responsible for our own data, even though they carefully back it up regularly. One of the things I keep meaning to do is set up an automatic backup of my MySQL databases. The manual method via phpMyAdmin and cPanel works great, but that gets tedious quickly. I wanted a method to back up all my databases (I currently have 7). I also realized it was probably smart not to leave the backup files on my TCH site in case of a disc crash or other disaster. This post covers my discoveries and what worked for me.
One caveat: I haven't actually restored from the files generated, but I did unzip them and examine them visually, and in their essentials they contain the same text I'd seen in the files I'd manually created using phpMyAdmin, so I expect them to work fine. (Although there were some lines in the saved files I didn't understand; maybe it's a version compatibility thing that someone can explain, such as:
>/*!40111 SET @OLD_SQL_NOTES=@@SQL_NOTES,SQL_NOTES=0 */;
One of our forum moderators recommended dbsender [by Eric Rosebrock, http://www.phpfreaks.com] to another user. I looked that over. It is a fairly simple php script which will email your backup to you (TCH doesn't support the FTP transfer it can also use), but it will only back up one database. I googled for scripts and visited the php script sites for a suitable program. MySQL provides powerful commands to generate backups, but though I've done some simple php scripting, I thought it would be better to find one written by a more experienced person. I looked at WipeOut's Automatic MySQL Backup, which was reviewed favorably, but as it is a shell script, I didn't feel up to trying it. I found another php script that had lots of interesting features: backupDB() by James Heinrich at http://www.silisoftware.com. Unfortunately, I was unable to get it working in either the interactive or cron job mode.
The author provided his email for questions or comments, but after several days with no response and no joy in figuring out what was going wrong, I abandoned it. Then I found the script at Dagon Design. It will back up multiple databases, compress them, and either leave them on the server or put them in an archive and email them to you. I did get this to work, and will show you what I needed to do to get it running on TCH.
First I want to pass along a good piece of advice I ran across while searching for a script. I forgot who wrote it, so I can't give credit. He was offering a script, but the link to it was no good. He recommended you create a special user to access your database(s) with the backup script, and give this user only SELECT and LOCK TABLES privileges. This provides both security and protection: the user cannot change the database, and a misbehaving script cannot damage it. Most scripts (and good practice) lock the tables while doing a backup.
Dagon Design's utility has only two files: a configuration file you change, and the script itself, which you don't edit. The basic instructions are provided on the site, and the config script is well documented. HOWEVER, you do need to change the defaults for the backup and temporary directory destinations! Leaving the defaults will generate errors as the script tries to create the directories at the system root. Change these to the full path to where you want them within your space. For example, this is what I used:
>// Backup destination (will be created if not already existing)
$BACKUP_DEST = '/home/cpanelname/public_html/cgi-bin/db_backups';
// Temporary location (will be created if not already existing)
$BACKUP_TEMP = '/home/cpanelname/public_html/cgi-bin/db_backups/backup_temp';
I don't know whether the script will actually create those directories itself if they don't exist, because by the time I figured out I needed to change the path, I'd already created them manually.
My only other changes to the defaults were the user and password, naturally; I changed the compression to gzip, email to true, and delete-after-email to true, and of course the address to send the archive file and error messages to. The error email only tells you that there was an error, not what it was; you will have to consult the error log for that. While I was getting it up and running, I provided my email address when filling out the cron job page in cPanel (I used the standard, non-advanced version). Cron emailed a comprehensive log of the job with all the errors and/or successes. This provided enough explanation to get the script working, so then I removed my email address from the cron job page (though I expect more experienced users can tell us how to append something to the command to keep cron from emailing its job log). The cron command I entered was:
>php -q /home/cpanelname/public_html/cgi-bin/backup_dbs.php
As I understand it, you need the "php" to tell it what kind of script this is, and "-q" means quiet, though I'm unsure of the specifics. I hope this will help you and encourage you to back up your databases.
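On keeping cron quiet: the usual convention is to append an output redirection to the cron command so there is nothing left for cron to mail. A sketch (the script path is the one from the post; the schedule is an example):

```shell
# Example crontab line (runs daily at 3:00 and discards all output):
#   0 3 * * * php -q /home/cpanelname/public_html/cgi-bin/backup_dbs.php > /dev/null 2>&1
# '> /dev/null' discards stdout, and '2>&1' sends stderr to the same place.
# Demonstration of the redirection itself:
echo "this line is discarded" > /dev/null 2>&1
echo "only this line survives"
```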
  19. Thanks, Dick, that gets me one more step along the way. Is there somewhere I can find docs or syntax for how to set up commands using cron? I gather the first parameter specifies a php script, but I don't know what the -q does... and there might be other important things I should know, lest I get myself into big trouble. I tried searching for this, but didn't have any luck. Thanks, Bill
  20. I found a php script I want to try out to back up MySQL databases (this one does all the databases and tables at once). Using cPanel to set up the job (standard method) seems pretty straightforward, but when it ran I got a "No such file or directory" error emailed to me. I used this as the command:
>/home/my-accountname/public_html/cgi-bin/backupDB/backupDB.php?StartBackup=complete&nohtml=1
The script is located at /home/my-accountname/public_html/cgi-bin/backupDB/ and all the directories are set to 755. Does it need to be in the bin directory, not cgi-bin?
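A likely cause of that error (a guess from the command shown): cron hands the command to a shell, not to a web server, so the `?StartBackup=...` query string is treated as part of the filename, and no file by that name exists; the unquoted `&` would also split the command in two. A small sketch of the distinction:

```shell
# A query string is part of a URL, not of a filesystem path: no file with
# this exact name exists, which matches the "No such file" cron error.
[ ! -e '/tmp/backupDB.php?StartBackup=complete&nohtml=1' ] && echo "no such file"
# From the command line, options are passed as arguments instead
# (hypothetical form; the script would read $argv rather than $_GET):
#   php /home/my-accountname/public_html/cgi-bin/backupDB/backupDB.php StartBackup=complete nohtml=1
```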
  21. Since you are creating your own functions, you can do what I've done... simply subtract (or add) a constant to the server time. In my case I simply subtract 2 from the hours to get my local time. Not very elegant, but it works.
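A fixed offset breaks twice a year under daylight saving; an alternative sketch is to let the system convert by overriding the TZ environment variable (the zone name below is just an example):

```shell
# Print the current time in a named zone rather than shifting by a constant:
TZ='America/Chicago' date
# Sanity check: with TZ=UTC the reported zone abbreviation is UTC.
TZ='UTC' date +%Z
```

In PHP the analogue would be `date_default_timezone_set()` (available since PHP 5.1), which applies the same named-zone conversion inside the script.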
  22. Cheryl, You may want to look at this page on MySQL for incompatibilities in the upgrade from 4.1 to 5. I didn't see anything that I can anticipate causing me problems, but if you are "heavily database driven" you should probably look it over: http://dev.mysql.com/doc/refman/5.0/en/upg...g-from-4-1.html Bill
  23. I always log into my bank from work... but then I work out of my spare bedroom The boss can be a bit much some days, but then he lets me watch afternoon baseball if I've been good and done my chores.
  24. In this case, it is not financial or high-risk, so I'm not going to sweat it. I suppose one could write their own password security using php sessions, then close the session. But that's way more than I need at this point. Thanks for your input, all.
  25. I've used cPanel to create a password protected directory for some user admin php scripts. That works fine. Is there a way to "log out" the user after they finish? This is important if the user is at a public computer, like at the library. It seems so inelegant to tell the user to "close the browser." Would be nice to have a link or button marked Log Out which would remove the permissions for access to that directory until the user logged in again with their username and password. I think that would be easier for them to remember, too. TIA