item 33227043


xenoscopic | 3 years ago

The general philosophy with Mutagen is to (a) delegate encryption to other tools and (b) use secure defaults (especially for permissions).

So, for example, Mutagen doesn't implement any encryption itself, instead relying on transports like OpenSSH to provide the underlying transport encryption. In the Docker case, Mutagen does rely on the user securing the Docker transport if using TCP, though it works to make this clear in the docs, and in practice Mutagen generally uses the Docker Unix domain socket transport anyway. When communicating with itself, Mutagen uses only secure Unix domain sockets and Windows named pipes.

When it comes to permissions, Mutagen doesn't do a blanket transfer of file ownership and permissions. Ownership defaults to the user under which the mutagen-agent binary is operating and permissions default to 0700/0600. The only permission bits that Mutagen transfers are executability bits, and only to entities with a corresponding read bit set. The idea is that synchronizing files to a remote, multi-user system shouldn't automatically expose your files to everyone on that system. These settings can be tweaked, of course, and in certain cases (specifically the Docker Desktop extension), broader permissions are used by default to emulate the behavior of the existing virtual filesystems that Mutagen is replacing.
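The executability rule above can be sketched in Go as follows (a hypothetical illustration, not Mutagen's actual implementation): starting from a restrictive default like 0600, an executable bit from the source is applied only where the target mode already grants the corresponding read bit.

```go
package main

import (
	"fmt"
	"io/fs"
)

// propagateExecutability is a hypothetical sketch of the rule described
// above: copy an executability bit from the source mode only where the
// corresponding read bit is already set in the default target mode.
func propagateExecutability(defaultMode, sourceMode fs.FileMode) fs.FileMode {
	result := defaultMode
	// Check the user (0400/0100), group (0040/0010), and other (0004/0001)
	// read/execute bit pairs independently.
	for _, bits := range []struct{ read, exec fs.FileMode }{
		{0o400, 0o100},
		{0o040, 0o010},
		{0o004, 0o001},
	} {
		if defaultMode&bits.read != 0 && sourceMode&bits.exec != 0 {
			result |= bits.exec
		}
	}
	return result
}

func main() {
	// With the 0600 default, only the owner's executability bit propagates,
	// even though the source file was world-executable (0755).
	fmt.Printf("%#o\n", propagateExecutability(0o600, 0o755)) // 0700
}
```

This is why synchronizing a 0755 script to a remote with the 0600 default yields 0700 there: group and other users never gained read access, so they don't gain execute access either.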


cassianoleal | 3 years ago

So, transport-wise they're the same.

For files at rest on the remote, I guess I assumed files would be encrypted on the remote with a local key, since GP said "one can develop on untrusted remote machine" and "VSCode remote always assumes that the remote part is trusted".

On an actually untrusted remote, removing group read permissions doesn't do much to secure my code.

The only scenario where it's helpful is a system with multiple non-admin users, perhaps like a university lab computer but who's doing sensitive work on those anyway?

xenoscopic | 3 years ago

In many ways Mutagen and VSCode's remote extensions are the same idea, with trade-offs in terms of flexibility vs. integration.

Shared systems with multiple non-admin users were one of the original motivating use cases for tighter default permissions.

I don't think there's any scenario where one can perform truly secure development work on an untrusted system. You could certainly store encrypted code in an untrusted location, but there's not much you could do with it on that system (short of a hypothetical compiler or tool that supported some sort of homomorphic-encryption compilation). Even decrypting files on the fly for processing by regular tools wouldn't be secure on an untrusted system, and running any code there would be equally insecure.

I'd imagine that for any seriously sensitive work, one would only want to work in highly controlled, trusted, and firewalled environments. If there's a scenario I'm missing though, definitely let me know.