jauer | 6 months ago
LFS does break disconnected/offline/sneakernet operation, which the article doesn't mention and which is not awesome, but those are niche workflows. It sounds like that would also be broken with promisors.
The `git partial clone` examples are cool!
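For reference, a partial clone can be demoed entirely locally. This is a sketch, not taken from the article: the repo layout, file sizes, and the 1 MiB threshold are all made up for illustration.

```shell
set -e
tmp=$(mktemp -d)

# A throwaway "server" repo containing one large and one small file.
git init -q "$tmp/server"
cd "$tmp/server"
git config user.email you@example.com
git config user.name you
git config uploadpack.allowFilter true   # allow clients to request object filters
dd if=/dev/zero of=big.bin bs=1M count=2 2>/dev/null
echo hello > small.txt
git add .
git commit -qm "add files"

# Partial clone: omit blobs over 1 MiB. --no-checkout avoids the lazy
# fetch that checkout would otherwise trigger for the missing blob.
git clone -q --no-checkout --filter=blob:limit=1m "file://$tmp/server" "$tmp/client"
cd "$tmp/client"

# List objects reachable from HEAD that are not present locally ("?<oid>").
git rev-list --objects --missing=print HEAD | grep -c '^?'
```

After the clone, git has also set `remote.origin.promisor=true`, which is what lets it fetch the omitted blob on demand later.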
The description of Large Object Promisors makes it sound like they take the client-side complexity of LFS, move it server-side, and then add to it? Instead of the client uploading to both a git server and an LFS server, it uploads to a git server, which in turn uploads to an object store, but the client downloads directly from the object store? Obviously different tradeoffs there. I'm curious how often people will get bitten by uploading to public git servers that upload to hidden promisor remotes.
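For what it's worth, the client-side plumbing for a second promisor remote already exists in git today. This is a sketch using made-up names: the `large-objects` remote and its URL are invented, and git never contacts them here.

```shell
set -e
cd "$(mktemp -d)"
git init -q .

# Register a second remote and mark it as a promisor: git may lazily
# fetch objects missing from the local object store from it.
git remote add large-objects https://objects.example.com/repo.git
git config remote.large-objects.promisor true
git config remote.large-objects.partialclonefilter blob:limit=1m

git config remote.large-objects.promisor
```

`remote.<name>.promisor` and `remote.<name>.partialclonefilter` are existing git config keys; the open design question is the server-to-object-store upload path, not this part.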
IshKebab | 6 months ago
I dunno if their solution is any better but it's fairly unarguable that LFS is bad.
ozim | 6 months ago
Mostly I have not run into such a use case, but in general I don't see any upside to trying to shove big files in alongside code within a repository.
AceJohnny2 | 6 months ago
In other words, if you migrate a repo with commits A->B->C, where C adds the large files, then commits A & B will gain a `.gitattributes` referring to large files that do not exist in A & B.
This is because the migration function carries its `.gitattributes` structure backwards as it walks the history, for caching purposes, and does not cross-reference it against the current commit.
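You can see what the resulting history looks like without running `git lfs migrate` at all. This sketch constructs the symptom by hand (an early commit whose `.gitattributes` names a file that doesn't exist yet) and then detects the mismatch; all file names are made up.

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email you@example.com
git config user.name you

# Commit A: .gitattributes already names big.bin, the way a back-propagated
# migration would leave it, even though big.bin doesn't exist yet.
echo 'big.bin filter=lfs diff=lfs merge=lfs -text' > .gitattributes
echo one > a.txt
git add . && git commit -qm A

# Commit C: big.bin finally appears.
dd if=/dev/zero of=big.bin bs=1k count=1 2>/dev/null
git add . && git commit -qm C

# Detect the mismatch: commit A's tree has no big.bin, yet its
# .gitattributes refers to it.
A=$(git rev-list --max-parents=0 HEAD)
git ls-tree -r --name-only "$A" | grep -qx big.bin \
  || echo "A's .gitattributes names big.bin, but A has no big.bin"
```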
gradientsrneat | 6 months ago
Yea, I had the same thought. And TBD on large object promisors.
Git annex is somewhat more decentralized as it can track the presence of large files across different remotes. And it can pull large files from filesystem repos such as USB drives. The downside is that it's much more complicated and difficult to use. Some code forges used to support it, but support has since been dropped.
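A rough sketch of the basic flow, for anyone who hasn't seen it (names are made up, and it needs git-annex installed, so the script bails out gracefully if it isn't):

```shell
# Skip cleanly when git-annex isn't available.
command -v git-annex >/dev/null 2>&1 || { echo "git-annex not installed; skipping"; exit 0; }

set -e
cd "$(mktemp -d)"
git init -q .
git config user.email you@example.com
git config user.name you
git annex init "laptop"

dd if=/dev/zero of=big.iso bs=1k count=4 2>/dev/null
git annex add big.iso     # content moves under .git/annex; a symlink is committed
git commit -qm "add big.iso"
git annex whereis big.iso # lists which repositories have the content
```

The `whereis` bookkeeping is the decentralized part: each clone (including one on a USB drive) tracks where the actual content lives, and only the symlinks and location log go through normal git history.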