henrydark|2 years ago
Forget that you know git, github, git-lfs, even software engineering for a moment. All you know is that you're developing a general project on a computer, you are using files, and you want version history on everything. What's wrong with that?
The major issue with big files is resources: storage and network bandwidth. But for both of these it is the sum of all object sizes in a repo that matters, not any particular file, so it's weird to harp on big files being bad design or evil.
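That total is easy to measure, too. A minimal sketch (hypothetical script; assumes Python 3 and a reasonably recent git on PATH, run from inside a repository) that sums the size of every object:

    # Sum the size of every object in the repo -- the number that
    # storage and clone bandwidth actually track.
    import subprocess

    out = subprocess.run(
        ["git", "cat-file", "--batch-all-objects",
         "--batch-check=%(objecttype) %(objectsize)"],
        capture_output=True, text=True, check=True,
    ).stdout

    total = sum(int(line.split()[1]) for line in out.splitlines())
    print(f"{total / 1e6:.1f} MB of objects (logical, pre-compression)")

(%(objectsize) reports logical sizes; %(objectsize:disk) would give the compressed on-disk figure instead.)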
iaresee|2 years ago
Perforce handled it all like a champ.
People who think large files don't belong in SCC are...wrong.
ChrisMarshallNY|2 years ago
I don't know if they still do it, but Unreal used to ship a Perforce license with their SDK.
rewmie|2 years ago
I don't think that is true. You do see people warn that having large files in Git repositories, or in any repository that wasn't designed with large files in mind, is "wrong", in the sense that there are drawbacks to using a system that was not designed to handle them.
Here's a historical doc of Linus Torvalds commenting on Git's support for large files (or lack thereof):
https://marc.info/?l=git&m=124121401124923&w=2
otp209|2 years ago
THANK YOU. Fucking prescriptivists ruin everything.
nirvdrum|2 years ago
I've tried many different SCMs over the years and I was happy when git took root, but its poor handling of large files was problematic from the beginning. Git being bad at large files turned into the best practice of not storing large files in git, which was shortened to "don't store large files in SCM." I think that's a huge source of our availability and/or supply chain headaches.
I have projects from 20 years ago that I can still build because all of the dependencies (minus the compiler -- I'm counting on it being backwards compatible) are stored right in the source tree. Meanwhile, I can't do that with Ruby projects from several years ago because the gems have been removed. I've seen deployments come to a halt because no startup runs its own package server mirror, and those servers go offline or a package gets deleted mid-deploy. The infamous left-pad incident broke a good chunk of the web, and it wouldn't have happened if that package had been fetched once and then added to an appropriate SCM. Every time we re-fetch the same package from a package server, we're counting on it not having changed, because no one does any sort of verification any longer.
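The "fetch once, pin a digest, verify ever after" step being described is tiny, too. A sketch in Python -- the vendored path and the pinned digest here are made up for illustration:

    # Verify a vendored package against a digest pinned at fetch time.
    import hashlib

    PINNED = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if sha256_of("vendor/left-pad-1.3.0.tgz") != PINNED:
        raise SystemExit("vendored package changed out from under us")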
iaresee|2 years ago
git has its place, but it has really warped how the world thinks about SCC. There are other ways to approach source control than the way git does it.
tom_|2 years ago
If the files are particularly large, they can be excluded from the clone, depending on discipline and/or department. There are various options here. Most projects I've worked on recently have per-discipline streams, but in the past a custom workspace mapping was common.
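Roughly what that looks like in a Perforce client spec: an exclusionary view mapping (the leading "-" carves that subtree out of the sync). A sketch with made-up depot and workspace paths:

    //depot/Game/...            //my-ws/Game/...
    -//depot/Game/RawAssets/... //my-ws/Game/RawAssets/...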
incrudible|2 years ago
Not just the artifacts, but their entire history. That is a problem Git has out of the box, but there is no reason it needs to work that way by default. LFS should be a first-class citizen of a VCS, not an afterthought.
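To be fair, git's partial clone gets part of the way there: big blobs can be skipped at clone time and fetched lazily on checkout. A sketch, assuming a reasonably recent git and a server that permits object filtering; the URL is a placeholder:

    # Clone without downloading the full history of large blobs.
    import subprocess

    subprocess.run(
        ["git", "clone",
         "--filter=blob:limit=1m",  # skip blobs over 1 MB up front
         "https://example.com/big-assets.git"],
        check=True,
    )
    # Skipped blobs are fetched lazily, only when a checkout needs them.

But the grandparent's point stands: it's opt-in and bolted on, not the default behavior.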
ghosty141|2 years ago
Some projects need the ability to version big files; there is a good reason Perforce exists and is widely used in the gaming industry.