wakewatcher

  1. Did you get this sorted out? I've not had that problem. On another matter, I'm looking for the inverse of this thread's topic: using the same basic approach, how would one programmatically take a database archive (like the one produced by the script above) and update/replace the database data with it? The idea is that the primary host FTPs its database to a second host that serves as a backup of the primary. So once I've got the gzipped database on the backup host, I want to replace the existing backup database with it. (Better yet, update just the changes, but I suppose that's asking a lot.)
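     Something along these lines would presumably do it on the backup host, assuming shell access and the gunzip and mysql command-line clients are available (an untested sketch; the credentials and archive path are placeholders):

     <?php
     // Untested sketch: restore a gzipped mysqldump archive over the backup copy.
     // Assumes the gunzip and mysql command-line clients are on the PATH and
     // that these placeholder credentials/paths are filled in for your host.
     $db_user = 'xxxxxxxx';
     $db_pass = 'xxxxxxxx';
     $db_name = 'usrname_Backthisup';
     $archive = './dbbkuptmp/usrname_Backthisup.sql.gz';  // hypothetical path

     // mysqldump output normally includes DROP TABLE/CREATE TABLE statements,
     // so piping it back through mysql replaces the existing data wholesale.
     $cmd = 'gunzip -c ' . escapeshellarg($archive)
          . ' | mysql -u ' . escapeshellarg($db_user)
          . ' -p' . escapeshellarg($db_pass)
          . ' ' . escapeshellarg($db_name);
     system($cmd, $status);
     echo $status === 0 ? 'Restore finished' : "Restore failed ($status)";
     ?>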
  2. I also joined this site solely to say thanks for this script (2.5 years after this post :-) ). It really was helpful after I'd spent a while messing with system() and wget. The change I made was to send the backups to an Amazon S3 account instead; for my host I also needed to include the username_ prefix when specifying the databases. I now run a variant of this as a daily cron job (see the sample crontab entry below). You need the cURL extension installed, and the script uses the S3.php library. I made a few other code changes (e.g. switched from time() to date() for better readability, and moved it inside the loop so each backup is stamped with the date/time to the second). Anyway, so far it works for me, but as they say, "your mileage may vary".

     <?php
     // Download cPanel database backups and push them to Amazon S3.
     // Requires the S3.php library and the cURL extension.
     require_once './lib/S3.php';

     if (!extension_loaded('curl') && !@dl(PHP_SHLIB_SUFFIX == 'so' ? 'curl.so' : 'php_curl.dll'))
         die("\nERROR: cURL extension not loaded\n\n");

     $awsAccessKey = 'xxxxxxxxxxxxxxx';
     $awsSecretKey = 'xxxxxxxxxxxxxxxxxxxxxxxx';
     $bucketName   = 'xxxxxxxxxxx';
     $user_name    = 'xxxxxxxx';
     $user_pass    = 'xxxxxxxx';
     $user_website = 'www.yoursite.org';
     $db_list      = array('usrname_Backthisup.sql', 'usrname_backthisup2.sql');

     // No need to edit below this line unless you want to keep a copy on your local host.
     $s3 = new S3($awsAccessKey, $awsSecretKey);

     foreach ($db_list as $db_name) {
         // Timestamp each backup to the second so successive runs never collide.
         $dc = date('Y-m-d_G-i-s');
         $get_this = 'http://' . $user_name . ':' . $user_pass . '@' . $user_website
                   . ':2082/getsqlbackup/' . $db_name . '.gz';
         $save_as  = './dbbkuptmp/' . $dc . '_' . $db_name . '.gz';

         echo "<br>Attempting to download $db_name";
         $handle = fopen($get_this, 'r');
         if ($handle === false) {
             echo "<br>Download of $db_name failed, skipping";
             continue;
         }

         echo "<br>Saving $db_name as $save_as <br>";
         $local_handle = fopen($save_as, 'x');  // 'x' fails rather than overwrite an existing file
         if ($local_handle === false) {
             echo "<br>Could not create $save_as, skipping";
             fclose($handle);
             continue;
         }
         while (!feof($handle))
             fwrite($local_handle, fread($handle, 8192));  // fread, not fgets: the stream is binary
         fclose($handle);
         fclose($local_handle);

         echo "<br>Transferring $save_as to S3 $bucketName bucket";
         $tries = 1;
         while (!$s3->putObjectFile($save_as, $bucketName, basename($save_as), S3::ACL_PRIVATE) && $tries <= 3)
             $tries++;  // Not sure how robust this is, so trying up to three times.

         unlink($save_as);  // Comment this out if you want to keep a copy on your host.
     }
     ?>
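     The daily cron job mentioned above is just a crontab entry along these lines (the PHP binary location and script path here are hypothetical; adjust for your host):

     0 3 * * * /usr/bin/php -q /home/usrname/s3_db_backup.php >/dev/null 2>&1

     This runs the script at 3:00 every morning and discards the HTML progress output.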