
End-to-End Encrypted Cloud Storage in the Wild: A Broken Ecosystem

136 points | dogtype | 1 year ago | brokencloudstorage.info

101 comments

[+] megous|1 year ago|reply
That was a good skim for me as someone who implemented one of the first independent mega.nz clients. Useful to know, especially about structure authentication and the ability to swap metadata on files and move files/chunks of files around when the server is compromised and there's no e2e authentication for this. Lots of traps all around. :)

Looks like the safest bet is still to just tar everything and encrypt/sign the result in one go.
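A minimal sketch of that tar-then-authenticate idea in Python's stdlib, with a single HMAC-SHA256 tag standing in for a real encrypt+sign step (in practice you'd pipe the tar through e.g. age or gpg; the names here are illustrative, not from the paper):

```python
import hashlib
import hmac
import io
import secrets
import tarfile

key = secrets.token_bytes(32)

def pack_and_tag(files: dict[str, bytes]) -> tuple[bytes, bytes]:
    """Pack files into one tar stream and authenticate the WHOLE archive
    with a single tag, so the server can't swap or reorder pieces."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name in sorted(files):
            info = tarfile.TarInfo(name)
            info.size = len(files[name])
            tar.addfile(info, io.BytesIO(files[name]))
    blob = buf.getvalue()
    return blob, hmac.new(key, blob, hashlib.sha256).digest()

blob, tag = pack_and_tag({"a.txt": b"hello", "b.txt": b"world"})

# Any server-side tampering (swapped chunks, edited metadata) breaks the tag:
tampered = blob.replace(b"hello", b"jello")
assert hmac.compare_digest(tag, hmac.new(key, blob, hashlib.sha256).digest())
assert not hmac.compare_digest(tag, hmac.new(key, tampered, hashlib.sha256).digest())
```

The point of the one-shot approach: a per-file scheme can't express "this set of files, in this arrangement", while one tag over one archive can.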

I wonder how vulnerable e.g. Linux filesystem-level encryption is to these kinds of attacks...

[+] Sat_P|1 year ago|reply
I was using Boxcryptor with OneDrive for over 5 years, and once they shut it down, I moved everything back to my local SSD. This had a number of advantages, the biggest one being that I could now use macOS search to find files at lightning speed. I'll never go back to cloud storage for files again due to latency. As a precaution, I now back up all of my data to an external HDD daily, then to a separate one on the 1st of each month. Critical financial data is archived to a Blu-ray on the first day of each quarter.
[+] triyambakam|1 year ago|reply
Hmm, I wish the author had reviewed Proton. I think it's kind of seen as a meme here? But I heavily rely on it and generally the Proton ecosystem is getting better and better from a UX perspective
[+] canadiantim|1 year ago|reply
I think Proton is more viewed as a honeypot
[+] xarope|1 year ago|reply
I like the way you can use the tabs to check the results for each reviewed cloud storage service, and the exposition on each. Does anybody know what the authors used to create this website? Custom-built, or a template?
[+] ThePhysicist|1 year ago|reply
Nice to see that Tresorit didn't have any serious issues in this analysis. I've been using it for a long time and it works really well; it's also one of the few players with a really good Linux client.

The two vulnerabilities they found seem pretty far-fetched to me. The first is basically that a compromised CA server would be able to create fake public keys, which I honestly don't know how one could defend against. Transparency logs, maybe, but even that wouldn't solve the issue entirely when sharing keys for the first time. The second one, around unencrypted metadata, is hard to assess without knowing which metadata is affected; it seems it's nothing too problematic.

[+] tptacek|1 year ago|reply
Tresorit had a game-over vulnerability: public keys aren't meaningfully authenticated (the server can forge keys; the CA the paper discusses is operated by the service) and any attempt to share a directory allows the server to share that directory with itself.
[+] fguerraz|1 year ago|reply
It's too bad they focused on commercial closed-source providers. The ecosystem would have really benefited if they had put their effort into, for example, doing the same work on NextCloud.
[+] xarope|1 year ago|reply
seafile is open source (https://github.com/haiwen/seafile), or at least it was when I looked at it years ago. It's definitely a concern that the paper mentions an acknowledgement of the protocol downgrade as of 29th April 2024, yet the latest version on the seafile GitHub is dated Feb 27.
[+] iknowstuff|1 year ago|reply
curious about iCloud with Advanced Data Protection enabled
[+] MichaelZuo|1 year ago|reply
Considering iCloud does have some documented cases of silent corruption, such as of original resolution media stored in Photos, it might not be the best choice.
[+] java-man|1 year ago|reply
I want to see the response from sync.com on this, especially about

  Unauthenticated Key Material

  Unauthenticated Public Keys
attacks.
[+] V__|1 year ago|reply
Since ente.io's server is just an object storage, I feel at some point either ente or someone else is going to make a drive app for it.
[+] cobbzilla|1 year ago|reply
The sad state of E2E encryption for cloud storage is a big part of why I wrote mobiletto [1]. It supports transparent client-side encryption for S3, B2, local storage and more. Rekeying is easy- set up a new volume, mirror to it, then remove old volume.

[1] https://github.com/cobbzilla/mobiletto

[+] gertop|1 year ago|reply
> Rekeying is easy- set up a new volume, mirror to it, then remove old volume.

Right, just have to transfer those 10TB every time a key needs to be rotated, no biggie!

I think that's the reason why most systems use two levels of keys: user keys encrypt a master key, so rotating means ditching the user keys, not re-encrypting everything under a new master.
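A sketch of that two-level hierarchy in Python's stdlib. A toy SHA-256-counter XOR stands in for a real cipher like AES-GCM, purely so the structure runs end to end; none of these names come from any real product:

```python
import hashlib
import secrets

def toy_xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (SHA-256 in counter mode). NOT real crypto --
    it only exists to make the key-wrapping structure executable."""
    out = bytearray()
    for i in range(0, len(data), 32):
        ks = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], ks))
    return bytes(out)

# Bulk data is encrypted exactly once, under the master key.
master_key = secrets.token_bytes(32)
ciphertext = toy_xor_cipher(master_key, b"imagine 10 TB of file data here")

# The user key only wraps the 32-byte master key.
user_key_v1 = secrets.token_bytes(32)
wrapped = toy_xor_cipher(user_key_v1, master_key)

# Rotation: unwrap with the old user key, re-wrap with the new one.
# The bulk ciphertext is untouched -- no 10 TB transfer.
user_key_v2 = secrets.token_bytes(32)
wrapped = toy_xor_cipher(user_key_v2, toy_xor_cipher(user_key_v1, wrapped))

assert toy_xor_cipher(user_key_v2, wrapped) == master_key
assert toy_xor_cipher(master_key, ciphertext) == b"imagine 10 TB of file data here"
```

The rotation step touches only 32 bytes of wrapped key material, which is the whole appeal of the scheme.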

[+] CPAhem|1 year ago|reply
We use Syncdocs (https://syncdocs.com) to do end-to-end Google Drive encryption.

The keys stay on the client. It is secure, but means the files are only decryptable on the client, so keys need to be shared manually. I guess security means extra hassle.

[+] nonamepcbrand1|1 year ago|reply
https://dropbox.tech/security/end-to-end-encryption-for-drop...

Dropbox has been mentioned in the article, and I think the author is drinking the kool-aid and throwing out random facts.

[+] tptacek|1 year ago|reply
It's not an article, it's an academic paper, and Dropbox isn't one of the targets.
[+] mr_toad|1 year ago|reply
If you don’t trust your cloud provider to not look at your data, why would you trust them with encryption?

It’s not hard to encrypt it before you upload it.

[+] tptacek|1 year ago|reply
Because not having to trust the provider is the entire premise of these services, and without that premise, you might as well just store things in GDrive.
[+] eemil|1 year ago|reply
One downside to encryption is that it prevents the server operator from doing any deduplication (file- or block-level) on their end.

Maybe that's one reason why cloud providers aren't pushing it that heavily. Especially the big players, since more data = more duplication = more efficient deduplication.

[+] tjpnz|1 year ago|reply
Double-edged sword. Megaupload was doing it, and it was argued (successfully) in court that they therefore had knowledge of what they were hosting.
[+] willis936|1 year ago|reply
That's fine. We pay for storage. I'll pay extra to not have the host spy on, sell, etc. my data.

Deduplication only really shines if most data is pirated copies. In reality, the vast majority of data is in the fine details of high-resolution photos and videos of completely uncorrelated scenes.

[+] idle_zealot|1 year ago|reply
Is that true? Couldn't you run dedupe on blocks of encrypted files? I assume there would be fewer duplicate blocks compared to the cleartext, but if you have a bunch of blocks full of random bits there are bound to be repeats with a large enough number of blocks.
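A toy sketch of why that doesn't work out (Python stdlib, SHA-256 keystream as a stand-in cipher, illustrative names): under randomized encryption, identical plaintext blocks encrypt to unrelated ciphertexts, so the server sees nothing to match up.

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, nonce: bytes, block: bytes) -> bytes:
    """Toy randomized encryption: XOR with a SHA-256 keystream derived
    from (key, nonce). Illustration only, NOT a real cipher."""
    out = bytearray()
    for i in range(0, len(block), 32):
        ks = hashlib.sha256(key + nonce + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(block[i:i + 32], ks))
    return bytes(out)

key = secrets.token_bytes(32)
plain = b"\x00" * 4096  # two byte-identical 4 KiB blocks

n1, n2 = secrets.token_bytes(16), secrets.token_bytes(16)
ct1 = toy_encrypt(key, n1, plain)
ct2 = toy_encrypt(key, n2, plain)

# Same plaintext, fresh nonces -> unrelated ciphertexts, nothing to dedupe.
assert ct1 != ct2
# The owner can still decrypt (the XOR keystream is its own inverse):
assert toy_encrypt(key, n1, ct1) == plain
```

As for accidental repeats among random-looking blocks: a 4 KiB block has 2^32768 possible values, so by the birthday bound collisions are effectively impossible at any realistic scale. Convergent encryption deliberately derives the key from the content to restore dedup, at the cost of confirmation attacks.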
[+] slac|1 year ago|reply
Google Drive has allowed client-side encryption since 2022... This paper's first paragraph is false.
[+] ranger_danger|1 year ago|reply
Only for enterprise workspace customers. rclone/cryptomator/etc. have always been possible with practically any provider, though.
[+] thinkingofthing|1 year ago|reply
Can you provide sources? I haven't been able to find the option for true E2EE
[+] swijck|1 year ago|reply
The world changes once you realize why usually encryption is capped at AES256...
[+] oconnore|1 year ago|reply
256-bit symmetric cryptography keys are a bit like picking one atom in the universe (~10^80 atoms). Your opponent would have to test half of the atoms in the universe to have a reasonable chance of getting the right key.

That's generally understood to be not feasible.
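The arithmetic behind the analogy, assuming the commonly cited ~10^80 atoms figure:

```python
# 2^256 is about 1.2 * 10^77 -- a few orders of magnitude below the
# ~10^80 atoms in the observable universe, but the same "not feasible"
# ballpark.
keyspace = 2**256
assert 10**77 < keyspace < 10**78

# A brute-force attacker expects to test half the keyspace on average.
expected_tries = keyspace // 2
assert expected_tries > 10**76
```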

[+] ziddoap|1 year ago|reply
Care to enlighten us? What did you realize?