(no title)
gnoway | 9 years ago
My major complaint about npm is the choice to allow version range operators in dependency declarations. We know the node.js ecosystem places a high value on composability, so using lots of tiny modules which themselves depend on lots of tiny modules is the norm. The trouble is that range operators get used liberally at every level of that tree, so getting a reproducible build is like winning the lottery.
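For anyone unfamiliar, the range operators live in package.json dependency declarations. A hypothetical manifest mixing the common forms might look like this (package names and versions are purely illustrative):

```json
{
  "dependencies": {
    "express": "^4.13.0",
    "lodash": "~4.11.1",
    "left-pad": "1.0.2"
  }
}
```

Only the exact pin ("1.0.2") yields the same tree on every install; ^ accepts any compatible minor/patch bump and ~ any patch bump, so two installs a week apart can resolve to different versions.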
There are other things I don't like about npm: node_modules/ is big and has a lot of duplication (even with npm v3), it's pretty slow, historically it has been unstable, it's still crap on Windows, etc. - but for someone who has 'ensures reproducible builds' in their job description, the way its modules get versioned is its worst feature.
nilliams|9 years ago
The range operators are important; without them you'd never be able to resolve two packages that want a similarly versioned sub-dependency (e.g. jquery 1.12), because those two packages would have declared slightly different versions (1.12.1 vs 1.12.3) depending on when they were published. You'd always end up with duplicated dependencies.
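A rough sketch of the caret semantics in plain JavaScript may make this concrete (a simplified stand-in for what the real semver package does; it ignores 0.x special cases and prerelease tags):

```javascript
// Minimal sketch of caret-range matching: ^1.12.1 means >= 1.12.1 and < 2.0.0.
// Hypothetical helper, not the actual semver library.
function parse(v) {
  return v.split('.').map(Number); // [major, minor, patch]
}

function satisfiesCaret(version, range) {
  const [maj, min, pat] = parse(version);
  const [rMaj, rMin, rPat] = parse(range.slice(1)); // drop the '^'
  if (maj !== rMaj) return false;        // caret pins the major version
  if (min !== rMin) return min > rMin;   // later minors are fine
  return pat >= rPat;                    // same minor: need at least the patch
}

// Both ^1.12.1 and ^1.12.3 are satisfied by jquery 1.12.4,
// so the resolver can install a single shared copy.
console.log(satisfiesCaret('1.12.4', '^1.12.1')); // true
console.log(satisfiesCaret('1.12.4', '^1.12.3')); // true
console.log(satisfiesCaret('2.0.0',  '^1.12.1')); // false
```

With exact pins (1.12.1 and 1.12.3) there is no version both declarations accept, so you'd get two copies of jquery.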
I'd argue 'node_modules is big' is not a fault of npm. If the package or app you're trying to install generates a large node_modules dir, that is something you should take up with the package maintainer. See buble vs babel - buble has a way smaller dep tree.
npm is only slow in the ways all other package managers are: when installing large dependency trees or native dependencies (like libSass). It's way faster than, say, pip or rubygems in this regard. When I 'pip install -r requirements.txt' at work, I literally go and make a coffee.
Also, I've never experienced any instability, though I may have been lucky. It has certainly been very stable for the last year or so, during which I've been working with it a lot. Could you elaborate on why it's crap on Windows? I thought all the major issues (e.g. the deep nesting problem) were now fixed ...
mordocai|9 years ago
'npm shrinkwrap' pins everything in your current node_modules directory.
That includes platform-specific dependencies that may not work on other platforms, which now cause npm install to fail outright instead of just printing a message about it.
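To illustrate (a hypothetical npm-shrinkwrap.json fragment; the app name and versions are made up, but fsevents is a real macOS-only native module): a shrinkwrap taken on a Mac bakes the platform-specific package into the lockfile, and a later npm install on Linux or Windows fails trying to fetch and build it:

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "chokidar": {
      "version": "1.4.3",
      "dependencies": {
        "fsevents": {
          "version": "1.0.8"
        }
      }
    }
  }
}
```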
So our current workflow has to be:
1. Update package.json
2. rm -rf node_modules/
3. npm install --production  # This doesn't include any of those pesky platform specific packages
4. npm shrinkwrap
5. npm install  # Get the dev dependencies
As for the other comments about npm, I just generally have more problems with it than with rubygems/bundler and the usual OS package managers.
gnoway|9 years ago
Deep nesting isn't 'solved', it just doesn't happen 100% of the time anymore. If you have version conflicts, you still get deep trees. I suppose range operators help with this a little, but looking at what actually gets installed, they don't seem to help that much; I still end up with duplicated dependencies.
I was mentally comparing npm to tools like Maven, Ivy and NuGet, all of which are faster - but they're also not interpreted, so not a fair comparison, I guess.