top | item 23181077

agounaris | 5 years ago

The article says that monorepos are more efficient but also that "As the monorepo grew, the build target list increased to a point where it became too long to pass it through Bazel's command line interface."

Monorepos are not efficient. They are easier to manage when a team is small, but as the team grows and you have more and more deliverables with separate versioning, you end up introducing control structures into your automation. Complexity explodes!

Anyway, none of this matters if you don't make any profit :)

marcinzm|5 years ago

>Monorepos are not efficient. They are easier to manage when a team is small but as the team grows and you have more and more deliverables with separate versioning you are introducing control structures in your automation. Complexity explodes!

The complexity exists whether it's a monorepo or many separate repos. A monorepo lets you encode that complexity as versioned code in your build system. Separate repos encode it across people's heads, wikis and who knows what else. Hiding complexity doesn't mean it doesn't exist, just that it will bite you 10 times as hard eventually.

jeffbee|5 years ago

That's a "holding it wrong" kind of affair, isn't it? Who expects to be able to "bazel test `shell thing with more than 4MB of output`"? Wouldn't a judicious application of either `...` or `xargs` fix this?
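The `xargs` fix suggested here works because `xargs` batches a long argument list into several invocations, each small enough to fit under the kernel's command-line limit. A minimal sketch (the `//services/...` query is a made-up example, and the `seq`/`echo` pipeline just stands in for bazel to demonstrate the batching):

```shell
# Hypothetical bazel usage: feed query results through xargs so that no
# single command line exceeds the OS argument-size limit.
#   bazel query 'tests(//services/...)' | xargs bazel test
#
# Stand-in demonstration of the batching itself: 10,000 "targets" are
# split into invocations of at most 2,000 arguments each, so the
# downstream command (echo here) runs 5 times.
printf 'target%s\n' $(seq 1 10000) | xargs -n 2000 echo | wc -l
```

The trade-off is that each `xargs` batch is a separate bazel invocation, so you pay server startup/analysis overhead per batch rather than once.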

elteto|5 years ago

Agreed, although the number of targets seems excessive tbh. But to each their own.

luckydata|5 years ago

Google is a monorepo as far as I know, and they are doing fine.

jeffbee|5 years ago

Indeed, in their paper from 5 years ago Google claimed 300,000 commits per day across 9 million source files, compared to this article claiming 10,000 commits per month on 70,000 source files at Uber. Whatever the differences are between Blaze and Bazel, it must be the case that the former can easily scale to a repo of this size.

I like to look at Google's GitHub commit messages to get an idea of the pace of their revision history. Yesterday they committed something with a Piper revision of 311324901. A month ago it was 306514102, and a year ago it was 248381230. That's about 160k revision numbers per day.
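The back-of-the-envelope arithmetic checks out; using the revision numbers quoted above (and assuming roughly 30 and 365 days between the observations):

```python
# Piper revision numbers quoted in the comment above.
yesterday = 311_324_901
month_ago = 306_514_102
year_ago = 248_381_230

# Revisions per day over the last month and over the last year.
per_day_month = (yesterday - month_ago) / 30
per_day_year = (yesterday - year_ago) / 365

print(round(per_day_month))  # ~160k revisions/day over the month
print(round(per_day_year))   # ~172k revisions/day over the year
```

Note that Piper revisions count every change to the repo, not just human-authored commits, which is why the number dwarfs any ordinary commit rate.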

tehlike|5 years ago

Google's monorepo is a productivity boon.

Disclaimer: ex-googler.

rantwasp|5 years ago

Huh. Last I checked with people who work there, the tooling Google has around managing the repo, builds, tests, etc. is insanely good. If you're thinking you can do monorepos because Google does, think again: you don't have the tooling, and you don't even know what that tooling is.

elteto|5 years ago

But the number of targets is a function of _their_ code, not of using a monorepo, no?

If you need to build N libraries and M executables then you are going to have N+M targets (assuming building for one arch only) whether you use a monorepo or not... Either way you are going to issue N+M build commands.

Also, in bazel you can do

  bazel build //...
to do a full build of all targets in a workspace. If they are not doing this but instead passing each target name individually, that probably means they are only building a subset of everything, and even that subset is too long for a single command-line invocation. I'll grant that having that many targets seems excessive, but again, I don't see the target count as an issue inherent to the monorepo.
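For reference, the limit being hit is the operating system's bound on total `exec()` argument size, which a wildcard pattern like `//...` sidesteps because the entire pattern is one short argument (a sketch; the bazel line is illustrative, not taken from the article):

```shell
# Query the kernel's limit on total exec() argument size, in bytes.
# On typical Linux systems this is on the order of 2 MB, so a
# multi-megabyte list of target names cannot be passed as literal
# command-line arguments.
getconf ARG_MAX

# A wildcard target pattern is a single short argument, e.g.:
#   bazel build //...
```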

jayd16|5 years ago

A monorepo just lets you have commits across projects. It says nothing about how you build things.