
Ask HN: What backup strategy do you use for your websites and databases?

15 points | pankratiev | 14 years ago | reply

9 comments

[+] agj|14 years ago|reply
For backups, I use rdiff-backup across 60G+ of user data, nightly. rdiff-backup uses librsync to transfer files, but it also handles incremental backups and seems fairly efficient at storing increments. Each night, at most maybe 10% of the user data changes, and backups complete in less than 2 hours. The load is low enough that I could run it several times a day if I needed to.
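
A minimal sketch of what such a nightly job can look like; the source and destination paths and the 8-week retention are assumptions, not the commenter's actual setup:

```shell
#!/bin/sh
# Nightly rdiff-backup job (sketch). SRC, DEST, and RETENTION
# are hypothetical -- adjust for your own layout.
SRC="/srv/userdata"
DEST="/backups/userdata"
RETENTION="8W"

nightly_backup() {
    # Mirror SRC to DEST; only changed files are transferred, and
    # previous versions are kept as reverse increments.
    rdiff-backup "$SRC" "$DEST" || return 1
    # Bound disk usage by expiring increments older than RETENTION.
    rdiff-backup --remove-older-than "$RETENTION" --force "$DEST"
}

# Typically invoked from cron, e.g.:
#   15 3 * * * /usr/local/bin/nightly_backup.sh
```

Because unchanged files cost almost nothing to re-examine, the run time scales with the ~10% nightly churn rather than the full 60G.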

Databases should be properly dumped to a file before being backed up.
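
For MySQL, a pre-backup dump step might look like the following sketch; the database name and output directory are assumptions:

```shell
#!/bin/sh
# Dump a database to a dated, compressed file so the file-level
# backup picks up a consistent snapshot rather than live table
# files. DB and OUTDIR are hypothetical.
DB="myapp"
OUTDIR="${TMPDIR:-/tmp}/db-dumps"
STAMP=$(date +%Y-%m-%d)
OUTFILE="$OUTDIR/$DB-$STAMP.sql.gz"

mkdir -p "$OUTDIR"

if command -v mysqldump >/dev/null 2>&1; then
    # --single-transaction: a consistent InnoDB snapshot without
    # holding table locks for the duration of the dump.
    mysqldump --single-transaction "$DB" | gzip > "$OUTFILE"
fi
```

Running this just before the file-level backup means the incremental tool sees one clean `.sql.gz` per day instead of constantly-changing raw table files.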

I've also tried BackupPC, which is a great project but probably not the best fit for this case. I was running it in a virtualized container and ran into a lot of memory issues backing up large servers. The problem was likely the container's memory limits rather than BackupPC itself, but I dropped it because backups commonly took around 6-8 hours, when they didn't silently hang on me.

[+] ScottWhigham|14 years ago|reply
I make local backups and then have a routine that downloads them. I've relied on web hosts for backups one too many times...
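
One way to sketch that download routine is an rsync pull run from the independent machine's cron; the host and paths here are hypothetical:

```shell
#!/bin/sh
# Copy the server's local backups to a machine the web host can't
# touch. REMOTE and LOCAL are assumptions.
REMOTE="deploy@myserver.example:/var/backups/"
LOCAL="$HOME/server-backups"

pull_backups() {
    mkdir -p "$LOCAL"
    # -a preserves permissions and timestamps, -z compresses in
    # transit, --delete makes LOCAL an exact mirror of REMOTE.
    rsync -az --delete "$REMOTE" "$LOCAL/"
}

# e.g. in the local machine's crontab:
#   30 4 * * * /usr/local/bin/pull_backups.sh
```

Pulling from the independent machine (rather than pushing from the server) also means a compromised server has no credentials to reach the backup copy.
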
[+] adyus|14 years ago|reply
Hmm, given that the question was asked on HN, there could be a good business idea there... File and DB backups offered as SaaS.
[+] latch|14 years ago|reply
I actually check that I can restore from my backups once a week (it's the first thing I do when I wake up Saturday morning).
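
A restore drill like that can be automated to the point where it fails loudly; here is a sketch for a MySQL dump, with a hypothetical dump directory, scratch database, and table name:

```shell
#!/bin/sh
# Weekly restore check: load the newest dump into a scratch
# database and verify it actually contains data. All names
# below are hypothetical.
DUMP_DIR="/var/backups/mysql"
SCRATCH_DB="restore_test"

restore_check() {
    latest=$(ls -t "$DUMP_DIR"/*.sql.gz 2>/dev/null | head -n 1)
    [ -n "$latest" ] || { echo "no dumps found" >&2; return 1; }

    mysql -e "DROP DATABASE IF EXISTS $SCRATCH_DB; CREATE DATABASE $SCRATCH_DB"
    gunzip -c "$latest" | mysql "$SCRATCH_DB" || return 1

    # A restore that "succeeds" but loads nothing is still a failure.
    rows=$(mysql -N -e "SELECT COUNT(*) FROM users" "$SCRATCH_DB")
    [ "$rows" -gt 0 ]
}
```

The point of the drill is the same either way: a backup you haven't restored is only a hope, not a backup.
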
[+] sander|14 years ago|reply
There actually is a service that backs up your website over FTP regularly, just trying to think of the name...
[+] timdev|14 years ago|reply
rsnapshot, with some scripts that prepare database dumps beforehand to ensure the MySQL data is included. Works very well.
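
The relevant rsnapshot configuration is roughly this excerpt (the paths and the dump script are hypothetical; note that rsnapshot requires fields to be separated by actual TAB characters):

```
# /etc/rsnapshot.conf excerpt -- fields must be TAB-separated.
snapshot_root	/backups/snapshots/

retain	daily	7
retain	weekly	4

# backup_script runs its command in a temporary directory and copies
# the result into the snapshot, so the dump lands in the backed-up
# tree alongside the plain file copies. Script path is hypothetical.
backup_script	/usr/local/bin/dump-mysql.sh	mysql/
backup	/var/www/	localhost/www/
backup	/etc/	localhost/etc/
```

Since rsnapshot hard-links unchanged files between snapshots, the seven daily and four weekly copies cost little more disk than one full copy plus the churn.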
[+] Zakuzaa|14 years ago|reply
I rsync my backups to bqbackup.