papillon Posted March 4, 2011
Hi, I would like to be able to create a backup of all the contents of a directory except one subdirectory. (That subdirectory is more than 1 GB and doesn't change, so I don't want it downloaded with every backup.) Is there any way to do it? Maybe by modifying the script from the backup FAQ, or something similar? Thanks.
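[Editor's note: one common way to do this, independent of the FAQ script, is GNU tar's --exclude option. The sketch below uses a small demo tree with placeholder names ("bigfolder" stands in for the large subdirectory) so it runs anywhere; on a real account you would point tar at your actual paths.]

```shell
#!/bin/sh
set -e
# Demo tree with placeholder names: "bigfolder" stands in for the >1 GB subdirectory
mkdir -p demo/public_html/bigfolder demo/public_html/content
echo 'large, rarely-changing data' > demo/public_html/bigfolder/huge.bin
echo '<html>page</html>' > demo/public_html/content/index.html

# --exclude skips the named subdirectory; everything else under public_html is archived
tar -czf site-backup.tar.gz --exclude='public_html/bigfolder' -C demo public_html

# List the archive contents: content/ is present, bigfolder/ is not
tar -tzf site-backup.tar.gz
```

The same tar command dropped into a cron job gives a periodic backup that never includes the excluded folder.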
TCH-Bala Posted March 5, 2011
Is there any particular reason for this backup setup? I am asking because all our shared/reseller servers already have multiple backups by default:
1) A weekly backup to a secondary disk, which can be accessed via the "Backups" option in your cPanel.
2) Offline backups to our backup server, once every 12 hours with 7 iterations = 14 backup snapshots, which can be accessed via the "R1soft" link in cPanel.
papillon Posted June 16, 2011 Author
Oops, somehow I forgot about this post, sorry. The reason is that I would like to have periodic local backups on my PC. The problem is, 90% of the size of my site resides in one specific sub-subfolder, but that 90% rarely changes over time. The other 10%, though, may change frequently. So I hoped to be able to make local backups on my PC every week, for example, for that 10%, but only every six months for the 90%, reducing download times. Also, I have had problems when trying to download the weekly full backup from cPanel (it takes a long time and sometimes I end up with a broken compressed file). The option in cPanel to restore last week's backup would be great for what I want, except that I cannot download it, only restore it (as far as I know). Thanks.
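[Editor's note: for the "long download ends in a broken file" problem, one generic workaround is to split the archive into chunks on the server, download them individually, and reassemble locally; a failed transfer then only costs one chunk. A minimal local sketch with a small stand-in file (a real run would split the ~1 GB backup archive):]

```shell
#!/bin/sh
set -e
# Stand-in file (a real run would use the ~1 GB cPanel backup archive)
dd if=/dev/zero of=backup.tar.gz bs=1024 count=100 2>/dev/null

# Split into chunks so a failed download only costs one chunk, not the whole file
split -b 40k backup.tar.gz backup.tar.gz.part-

# Reassemble on the local PC and confirm the copy is byte-identical
cat backup.tar.gz.part-* > restored.tar.gz
cmp backup.tar.gz restored.tar.gz && echo 'reassembled OK'
```

The `cat` glob relies on split's lexically ordered suffixes (part-aa, part-ab, ...), so the pieces concatenate back in the right order.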
TCH-Bruce Posted June 17, 2011
Take a look at the second post in our Backup FAQ for a script to back up a specific subdirectory.
papillon Posted June 17, 2011 Author
Take a look at the second post in our Backup FAQ for a script to back up a specific subdirectory.
Thanks! That's perfect for backing up the big folder. One last question: any idea how to modify a copy of the script to do a backup of more than one folder in the same gzip? So if my site consists of:
- public_html
  - folder a
  - folder b
  - big folder
  - folder c
I would like to back up folders a, b and c into the same gzip. Thanks.
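[Editor's note: at the shell level this is straightforward, because tar accepts several paths in one invocation. The sketch below rebuilds the folder layout from the post with placeholder names and archives only folders a, b and c into a single gzip:]

```shell
#!/bin/sh
set -e
# Demo layout from the post (names are placeholders): a, b, c plus the big folder
mkdir -p site/public_html/folder_a site/public_html/folder_b \
         site/public_html/big_folder site/public_html/folder_c
echo a > site/public_html/folder_a/a.txt
echo b > site/public_html/folder_b/b.txt
echo big > site/public_html/big_folder/huge.bin
echo c > site/public_html/folder_c/c.txt

# tar takes several paths at once, so one gzip can hold a, b and c together
tar -czf small-folders.tar.gz -C site/public_html folder_a folder_b folder_c
tar -tzf small-folders.tar.gz
```

The big folder is simply never named, so it never enters the archive.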
TCH-Thomas Posted June 17, 2011
I haven't used the script that Bruce and Andy made, only the dbsender script that Bruce talks about in the first post. However, while it won't back up everything in one zip file, I think you could copy and paste that script into multiple PHP files (calling them folder_a.php, folder_b.php and so on) and modify these lines in each PHP file:

$path = '/home/your-account/public_html/'; // Full path to backup
$zippath = "/home/your-account/backups"; // path to save backup file

While I see it can be handy to have everything in one big file, I also see it being handy to have separate backups; if only one or two folders need to be restored, it sounds to me like that will be faster.
papillon Posted June 17, 2011 Author
OK, I will do it that way. Thanks, all.
TCH-Bruce Posted June 17, 2011
I agree with Thomas here. Create multiple scripts, one for each folder.
papillon Posted June 17, 2011 Author
It's me again... I'm having problems when trying to use the per-folder script. I have configured the script and created a cron job to run it, and it works perfectly when configured to back up a small folder (the compressed file is generated, then sent to the remote FTP server, then deleted from the origin). But when I try it with a bigger folder, it apparently generates the compressed file OK (it takes about 5 minutes for a 1 GB compressed file), but there is an error when trying to send it to the remote server. I receive a cron job error email, and in the same folder I get an error_log file with the same info. The temporary compressed file is apparently created OK, and is neither deleted nor sent to the remote FTP server. Any ideas? Should I create a ticket for this?
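[Editor's note: the root cause here is one for the help desk (possibly a timeout or size limit during the FTP step), but a generic diagnostic that helps in cases like this is comparing checksums on both ends, which distinguishes a truncated or corrupted upload from a bad archive. A minimal local sketch, with a tiny stand-in file and cp simulating the transfer:]

```shell
#!/bin/sh
set -e
# Tiny stand-in for the 1 GB temporary archive the script leaves behind
printf 'backup payload' > tmp-backup.tar.gz

# Record a checksum before transfer; comparing it on the other side
# tells a truncated/corrupted upload apart from a bad archive
md5sum tmp-backup.tar.gz

# Simulate the transfer (the real script uses FTP) and verify integrity
cp tmp-backup.tar.gz uploaded.tar.gz
cmp tmp-backup.tar.gz uploaded.tar.gz && echo 'upload intact'
```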
TCH-Bruce Posted June 17, 2011
Yes, check with the help desk.