top | item 29534150

galonk | 4 years ago

A lot of people sing BB's praises, but I never had a good experience with them. The client was always slow, buggy, and resource-hungry, and its UI is terrible. They got shirty with me for reporting bugs while I was running a macOS beta. And finally, even though nothing about my computer had changed (it was a Mac Mini; what was going to change?), I got a message saying some security/copy-protection system had detected that my computer was "different", and I had to uninstall and reinstall the entire app to fix it (there apparently being no easier way to unset a flag). I uninstalled and skipped the second part.

Instead of using BB, get a Synology/QNAP/FreeNAS box to back up all your stuff locally, and back that up to another service (e.g. Glacier or Synology's own C2).

willis936|4 years ago

I caution the casual reader against Glacier. It's not what it appears at a glance: your files should be bundled into a single archive before upload; otherwise you'll spend weeks waiting for AWS scripts to manage old files.

B2/S3 is what most people want.
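
The bundling step is simple enough; here is a minimal sketch using Python's standard tarfile module (the directory and archive paths are hypothetical, not from the comment):

```python
import pathlib
import tarfile

def bundle(src_dir: str, archive_path: str) -> int:
    """Pack every file under src_dir into one compressed tar archive.

    Uploading a single archive to Glacier avoids per-object metadata
    overhead and per-request restore churn. Returns the file count."""
    count = 0
    with tarfile.open(archive_path, "w:gz") as tar:
        for path in sorted(pathlib.Path(src_dir).rglob("*")):
            if path.is_file():
                tar.add(path, arcname=str(path.relative_to(src_dir)))
                count += 1
    return count
```

The trade-off is that restoring any one file means retrieving (and paying for) the whole archive, which is why this fits cold, last-resort backups rather than frequently accessed data.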

linsomniac|4 years ago

We have 23TB of images stored in S3 and I was recently looking at moving them to Backblaze to save hundreds of dollars per month. These are all individual image files, because reasons.

Then I realized that S3 Glacier and Deep Archive were even less expensive than B2. I took a closer look and found that Glacier/DA objects carry some fairly chonky metadata that must be stored in normal S3, and for a lot of our images the metadata was larger than the image itself. So Glacier/DA would have increased our storage costs; overall it probably wasn't a money-saving situation.

The ideal approach is to bundle those up into a tar file or something, store those large files, and manage the metadata and indexing/access ourselves.

So, using rclone to copy 11TB of data to B2.
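
The overhead described above is documented by AWS: each object in Glacier or Deep Archive carries roughly 32 KB of index data billed at the archive rate plus 8 KB of metadata billed at the S3 Standard rate. A rough per-object cost comparison (the $/GB-month rates below are illustrative circa-2021 numbers, not a quote; check current pricing):

```python
KB_PER_GB = 1024 ** 2

# Per-object overhead AWS documents for Glacier-class storage:
GLACIER_META_KB = 32   # billed at the archive tier's rate
STANDARD_META_KB = 8   # billed at the S3 Standard rate

def deep_archive_usd(image_kb, archive_rate=0.00099, standard_rate=0.023):
    """Monthly cost of one image in S3 Deep Archive, overhead included."""
    return ((image_kb + GLACIER_META_KB) * archive_rate
            + STANDARD_META_KB * standard_rate) / KB_PER_GB

def b2_usd(image_kb, rate=0.005):
    """Monthly cost of the same image stored in Backblaze B2."""
    return image_kb * rate / KB_PER_GB
```

Running the numbers shows why small objects flip the result: for a 10 KB image the fixed 40 KB of metadata dominates and Deep Archive costs several times what B2 does, while for objects in the hundreds of megabytes the cheap archive rate wins easily.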

iJohnDoe|4 years ago

> otherwise you'll spend weeks waiting for AWS scripts to manage old files.

Can you elaborate on this part?

spacedcowboy|4 years ago

I just bought a Quantum Superloader3 last Xmas. It takes 16 tapes in 2 magazines (I use 15 plus 1 cleaning tape); each LTO-8 tape holds 12 TB without compression, 30 TB with, and 7 of them can back up the 100 TB used on the 128 TB of disk that is the house RAID array.

It takes about 2 days to make a full backup, and I can fit incrementals for the next 5 days on the batch-of-7. Then I switch to the second magazine, and do the same thing. I actually have 3 magazines, one of which I swap in and out every week, and during the before-times, I'd take that off-site to work.

I have ~30 years of data, from back when I was in college and writing CD-ROMs for backup, all on the one system. Admittedly, the major space-taking thing is the Plex library, but I wouldn't want to lose that either. It takes about 5 minutes to walk into the garage (where the server-rack is), swap magazines and I'm done - the rest is automatic.

I have vague ideas for writing a more-efficient tar designed for this specific type of setup (big disk with attached tape). The best way to do it I think is to have multiple threads reading and bzip2-compressing data, piping blobs through to a singleton tape-writer thread. Every now and then (50GB, 500GB, 1TB ?) close the device and reopen the non-rewindable device to get a record-marker on the tape, and then store the tape/record-marker/byte-offset etc. into a SQLite database on the disk. That way I'd get:

- High levels of compression without making the tape head wait for the data, which ruins the tape head. Multiple threads pooling highly-compressed data into a "record"

- fast lookup of what is where, I'm thinking a SQL LIKE query syntax for search, against the disk-based DB. No more waiting for the record to page in from the end of the tape.

- fast find on-tape, since you'd know to just do the equivalent of 'mt fsf N' before you actually have to start reading data

Right now, tar is good enough. One of these days when I get time, I'll write 'bar' (Backup And Restore :)
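
The design sketched above (parallel compressors feeding a singleton writer, plus a disk-resident SQLite index) can be prototyped in a few dozen lines. This is only a sketch under stated assumptions: an ordinary file stands in for the tape device, and the record markers / `mt fsf` positioning are omitted:

```python
import bz2
import sqlite3
from concurrent.futures import ThreadPoolExecutor

def backup(paths, device, index_db, workers=4):
    """Worker threads bzip2-compress files while a single writer
    appends the blobs to the device in order, recording each blob's
    offset and length in an on-disk SQLite index."""
    db = sqlite3.connect(index_db)
    db.execute("CREATE TABLE IF NOT EXISTS idx "
               "(path TEXT PRIMARY KEY, offset INTEGER, length INTEGER)")

    def compress(path):
        with open(path, "rb") as f:
            return bz2.compress(f.read())

    with ThreadPoolExecutor(max_workers=workers) as pool, \
            open(device, "wb") as out:
        # pool.map yields results in submission order, so this loop is
        # the singleton writer and the drive never waits for data.
        for path, blob in zip(paths, pool.map(compress, paths)):
            db.execute("INSERT INTO idx VALUES (?, ?, ?)",
                       (path, out.tell(), len(blob)))
            out.write(blob)
    db.commit()
    db.close()

def restore(path, device, index_db):
    """Fast find: one SQL LIKE lookup, one seek, one read."""
    db = sqlite3.connect(index_db)
    offset, length = db.execute(
        "SELECT offset, length FROM idx WHERE path LIKE ?",
        (path,)).fetchone()
    db.close()
    with open(device, "rb") as dev:
        dev.seek(offset)
        return bz2.decompress(dev.read(length))
```

One detail that matters for throughput: CPython's bz2 releases the GIL during compression, so the thread pool genuinely keeps multiple cores busy while the writer streams sequentially.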

simplyaccont|4 years ago

Small question about the Superloader3: how noisy is it?

k8sToGo|4 years ago

You need to differentiate between BB Personal Backup and the BB B2 service, which is more like what you suggested. But these days I just use rsync.net + Wasabi + Kopia + rclone.

ericcholis|4 years ago

I really want to use rsync.net, but the price per GB scares me off.

Osiris|4 years ago

I have 3.5TB backed up to Backblaze for $5 a month. I haven't found any other online backup option that provides anywhere near that cost.

I use RAID 1 to handle drive failure and also keep local backups on a NAS. BB is my third layer of backup. I've never run into issues with BB backups, so I'm happy with what I get for the price.

markdown|4 years ago

> I have 3.5TB backed up to Backblaze

The question this thread seems to be raising is... do you? Do you really? Are you sure?

janitha|4 years ago

S3 Glacier Deep Archive, $0.00099 per GB per month.

I have a ZFS-based NAS, and periodically do an incremental backup (zfs send) of the entire dataset, encrypt it with gpg, and pipe it straight up to S3 Deep Archive. Works like a charm.

The catch with S3 Deep Archive is if you want to get the data back: it's reliable, but you'll pay quite a bit more. So as a last-resort backup, it's perfect.
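
That flow is just three processes joined by pipes; a hedged sketch of driving it from Python (the dataset, key, and bucket names in the comment are placeholders I made up, not from the thread):

```python
import subprocess

def run_pipeline(*cmds):
    """Chain commands with pipes, like `a | b | c` in the shell.
    Returns the final process's stdout bytes; raises on failure."""
    procs = []
    prev_out = None
    for cmd in cmds:
        p = subprocess.Popen(cmd, stdin=prev_out,
                             stdout=subprocess.PIPE)
        if prev_out is not None:
            prev_out.close()  # let the upstream process see SIGPIPE
        prev_out = p.stdout
        procs.append(p)
    out = procs[-1].communicate()[0]
    for p in procs:
        if p.wait() != 0:
            raise RuntimeError(f"{p.args} exited {p.returncode}")
    return out

# The commenter's flow, roughly (names are hypothetical):
# run_pipeline(
#     ["zfs", "send", "-i", "tank/data@prev", "tank/data@now"],
#     ["gpg", "--encrypt", "-r", "backup-key", "-o", "-"],
#     ["aws", "s3", "cp", "-", "s3://my-bucket/data-now.zfs.gpg",
#      "--storage-class", "DEEP_ARCHIVE"],
# )
```

The advantage of streaming the whole thing is that the snapshot is never staged on disk unencrypted; each stage only ever sees a rolling buffer.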

kingcharles|4 years ago

If you want to try another sketchy service: I use iDrive, which was $9.95 for 10TB for the first year.

mlangenberg|4 years ago

> get a Synology/Qnap/FreeNAS box to backup all your stuff locally, and back that up to another service (e.g. Glacier or Synology's own C2).

I'm not a big fan of backing up a back-up, so I opted for a Time Machine backup to a local NAS and, in parallel, an off-site backup to B2 with Arq on my Macs.

doublepg23|4 years ago

I have a Synology NAS currently and use their HyperBackup hosted tier. Maxes out at 1TB sadly, not even enough for my laptop backup. C2 on the other hand is well priced for bulk storage but does not have the “free” deduplication.

hedora|4 years ago

I use HyperBackup to backup multiple TB to Backblaze B2. Works great, FWIW.