top | item 13560034

anon987 | 9 years ago

It's because the 'problem' it solves is a corner case that's rarely encountered. I love their absurd examples of repos that take 12 hours to download. How many people have that problem, really?

All they did was create a caching layer.

ska|9 years ago

   How many people have that problem, really?
An easy lower bound is tens of thousands of engineers: developers at several large tech companies (e.g. MS, Facebook, Google, ?)

nine_k|9 years ago

If you deal with code, the case is marginal for you.

If you deal with graphics, audio assets, and other binary-blob data, the case is central.

aanm1988|9 years ago

This is about code, and code history. Just insane volumes.

rplnt|9 years ago

Well, it's a problem for thousands of employees of Microsoft, isn't it? We had a much smaller repository (10 GB IIRC) and it really was annoying how long everything took, even with various caches and whatnot enabled.

sebastos|9 years ago

"I don't have this problem, so nobody does."

Lacking support for large binary blobs is, like, THE #1 reason an engineer might have to use an alternative to git.

daxelrod|9 years ago

Ok, but you'll encounter similar git limitations with repos several orders of magnitude smaller than that too.

All you need is several hundred engineers and your monorepo becomes unwieldy for git to handle.

aanm1988|9 years ago

It's not a caching layer, it's lazy evaluation.
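The distinction can be sketched in a few lines. This is a hypothetical toy (the names `FakeRemote` and `LazyRepo` are made up, not GVFS's actual design): a cache stores the results of work already done, while lazy evaluation defers the work until something actually asks for it.

```python
class FakeRemote:
    """Stands in for a remote object store; counts network fetches."""
    def __init__(self, objects):
        self.objects = objects
        self.fetches = 0

    def fetch(self, path):
        self.fetches += 1
        return self.objects[path]


class LazyRepo:
    """Lists every path up front, but only fetches content on first access."""
    def __init__(self, remote):
        self.remote = remote
        self.local = {}  # objects materialized so far

    def read(self, path):
        if path not in self.local:  # not yet materialized: do the work now
            self.local[path] = self.remote.fetch(path)
        return self.local[path]


remote = FakeRemote({"a.txt": b"A", "b.bin": b"B" * 10})
repo = LazyRepo(remote)

repo.read("a.txt")  # first access triggers exactly one fetch
repo.read("a.txt")  # second access is served locally
assert remote.fetches == 1
# "b.bin" is never read, so its bytes are never transferred at all;
# a plain cache in front of an eager clone cannot skip that work.
```

The second access being local is the caching part; the point of the lazy scheme is that untouched objects are never fetched in the first place.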

testUser69|9 years ago

[deleted]

sctb|9 years ago

We've already asked you to please stop this, so we've banned the account.

SippinLean|9 years ago

The recent Windows 10 thread was full of criticism for MS