I was happy when yarn first came onto the scene and gave npm the kick in the butt it needed to improve.
Now I wish yarn could be deprecated and we could go back to a single package manager. There's unfortunately segmentation in different areas around package managers, e.g. electron seems to prefer yarn. And for package maintainers there's extra overhead to document and test installation with both npm and yarn.
I hear you, but things are not really moving in that direction, because it's not that simple. The closer you look into what they do and how, the clearer it becomes that [npm7 vs yarn1 vs yarn2 vs pnpm] is the current set of legit choices, for various reasons.
Yarn v2 PnP is simply a lifesaver if you have a medium+ sized monorepo.
We have a monorepo with 180 packages here. Without pnp, it takes 1h+ just to npm install any new third party package in any local package, it’s a joke. With pnp it takes 18s.
So yes, from my point of view NPM is completely inadequate for any serious JS codebase.
We have a pretty large monorepo codebase (460 packages and counting) that we're migrating from yarn v1 to yarn v2. I'll say it's definitely not a plug-n-play migration (pardon the pun).
Some issues we ran into:
- it can be difficult to reason about the layout of peer dependencies. Oftentimes libraries that rely on Symbols or referential equality break, and you need to mess with packageExtensions, add resolutions, or unplug. Debugging mostly consists of throwing stuff at the wall and seeing what sticks
- file watching in large enough projects breaks w/ file-descriptor exhaustion errors, forcing you to find and unplug the offending libraries
- there are a number of known incompatible libraries (Flow being one of the most prominent), and the core team's approach to these at this point follows the Pareto principle (20% effort for 80% of the results, e.g. special-casing for TypeScript), meaning I don't believe there will ever be a 100% compatibility milestone w/ the ecosystem
- it's much more black-boxy in terms of day-to-day debugging (e.g. it's much harder to manually edit files in node_modules to trace some root cause)
- we ran into inscrutable errors deep in packages that interface w/ C++ and basically were only able to fix by pinning to an earlier version of a library that did not depend on said package.
- migration cost is heavily proportional to codebase complexity. My understanding is that Facebook gave up on it completely for the foreseeable future and ours has similarly been a significant time investment (and we're not even targeting strict mode yet)
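The Symbol/referential-equality breakage in the first bullet can be sketched in plain Node. This is a minimal illustration, not code from any real library — the "two copies of lib" are simulated with two object literals, since that's effectively what happens when a resolver installs the same package twice:

```javascript
// Sketch of the referential-equality failure mode described above.
// Imagine "lib" got installed twice; each copy evaluates its module
// scope independently, so "the same" Symbol is created twice.
const libCopyA = { TYPE: Symbol('lib.type') }; // the copy the app loads
const libCopyB = { TYPE: Symbol('lib.type') }; // the copy a plugin loads

// The app tags a value using its copy of the Symbol:
const node = { type: libCopyA.TYPE };

// The plugin then checks the tag against *its* copy:
console.log(node.type === libCopyB.TYPE); // false, despite identical names
console.log(node.type === libCopyA.TYPE); // true only against the same copy
```

The same failure shape applies to `instanceof` checks when a class ends up duplicated in the tree, which is why deduplicating via resolutions (or hoisting the package) fixes it.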
The pros:
- install times and project switching times are indeed fast, even in our codebase that contains multiple major versions of various packages
- yarn v2 has many interesting features, such as protocols (though it's debatable if you want to commit to lock-in to benefit from those)
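For readers unfamiliar with the packageExtensions escape hatch mentioned above: it lives in `.yarnrc.yml` and lets you amend a third-party package's manifest without forking it. A minimal sketch, with made-up package names, declaring a peer dependency a plugin forgot to list:

```yaml
# .yarnrc.yml — hypothetical example
packageExtensions:
  "some-plugin@*":
    peerDependencies:
      react: "*"
```

This is the typical fix when Yarn v2's stricter resolution surfaces an undeclared dependency that happened to work under v1's hoisted node_modules.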
I tried yarn 2 on a greenfield project, but discovered:
- pnp is made possible in part by mysterious “patches” to certain dependencies that don’t work well with it. Mysterious as in they’re obfuscated, and there isn’t much detail besides commit history. This is blocking if you want to try out, say, the TypeScript 4.1 beta and the patch isn’t ready yet. But more importantly... I do not want my dependency manager mysteriously patching stuff with obfuscated code?
- it applies these patches even if you disable pnp, so same objections to the entire yarn 2 approach (currently)
So I’m back on yarn 1 and apparently gonna need to look at npm 7 at this point.
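For what it's worth, those builtin compatibility patches do at least surface in the lockfile, so you can see which packages are affected. Roughly what the entry looks like in a Yarn 2 yarn.lock (version number illustrative, exact encoding and hashes vary by Yarn release):

```yaml
# yarn.lock (Yarn 2) — illustrative sketch
"typescript@patch:typescript@npm%3A4.0.5#builtin<compat/typescript>":
  version: 4.0.5
```

The `builtin<compat/...>` marker is the tell that Yarn substituted its bundled patch for the published package.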
Did you try out pnpm, by chance? I’ve read a few good things, but it doesn’t seem to get mentioned all that often. So I’m curious what people with larger projects think about it.
Cannot figure out why you are being downvoted. Yarn v1 and npm are horrible if you have a large dependency tree. Yarn v2 was the first time I actually enjoyed using a package manager.
> The package-lock.json file is not only useful for ensuring deterministically reproducible builds. We also lean on it to track and store package metadata, saving considerably on package.json reads and requests to the registry. Since the yarn.lock file is so limited, it doesn’t have the metadata that we need to load on a regular basis.
So I guess there are some performance benefits with npm 7 compared to Yarn 1?
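To make the quoted metadata point concrete: a yarn.lock v1 entry records little beyond version, resolved URL, integrity, and dependencies, while an npm v7 package-lock (lockfileVersion 2) additionally stores per-package metadata npm would otherwise have to re-read or re-fetch. An illustrative entry (package name, hash, and fields are examples, not taken from a real project):

```json
{
  "lockfileVersion": 2,
  "packages": {
    "node_modules/left-pad": {
      "version": "1.3.0",
      "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
      "integrity": "sha512-...",
      "dev": true,
      "engines": { "node": ">=0.10.0" }
    }
  }
}
```

Having `engines`, `dev`, and similar fields in the lockfile is what lets npm 7 skip reading every nested package.json on install.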
And interestingly, Yarn 2 actually diverges quite a long way from what a lot of Node users originally wanted from it (at least we have no interest in moving to it).
If you just use "yarn" as you'd think of it, you are probably still using Yarn 1, so I guess it's being thought of as a different parallel project
Are there reasons to go back to npm? I switched back when yarn came out and haven't looked back. Been super happy with yarn. Can't say the same about npm.
People are more likely to already have npm installed and to be familiar with it. So there's an argument to be made that all else being equal, picking npm lowers the barrier to entry for new contributors. This consideration could be especially important for open source projects.
I've been considering switching to pnpm for political reasons since using open source projects that are ultimately at the mercy of big corps (npm > Microsoft, yarn > Facebook) makes me slightly uneasy. But I'm hesitant to because pnpm seems so new.
Have you encountered any regularly occurring issues or headaches regarding pnpm?
vosper|5 years ago
https://pnpm.js.org/
untog|5 years ago
Maybe it's just me but a monorepo with 180 packages sounds like a hole you've dug yourself into and you're propping yourself up with yarn.
I certainly don't think that anyone who keeps their packages separate (you could do that even within a monorepo, surely) has a "non-serious" codebase.
benologist|5 years ago
https://guides.sonatype.com/repo3/quick-start-guides/proxyin...
https://hub.docker.com/r/sonatype/nexus/
azangru|5 years ago
Ships with Node.
cwp|5 years ago
OTOH, yarn handles this just fine.
Sheepsteak|5 years ago
https://github.com/npm/cli/issues/1984