item 22884586

GitHub has completed its acquisition of NPM

312 points | 0xedb | 5 years ago | github.blog

199 comments

[+] throwaway894345|5 years ago|reply
For those who were having deja vu, this is a notification that GitHub completed its acquisition of NPM.
[+] VonGuard|5 years ago|reply
This is a good thing. When they were independent, NPM was a disaster area. The company spent 100% of its time chasing down social issues and insanity in the community and never figured out how to make money, or at least, it took them FOREVER to figure that out.

Years ago, they introduced "orgs" which they sat there and explained to me with slides and pictures and concepts and business bullshit for an hour. I did not understand a thing they'd said. Finally, they were like "We're selling private namespace in the npm registry for blessed packages for groups or businesses." I understood that. If they'd just said that up front....

They had some great people, some very smart folks like CJ, but they completely biffed every business decision they ever made, and when you'd go in and talk to the leadership, they were always acting as if they had some sort of PTSD from the community. I mean, people were putting spam packages in NPM just to get SEO on some outside webpage through the default NPM package webpages. People were squatting and stealing package names. Leftpad... the community management here is nightmarishly hard, and I was never convinced they'd ever make money on it. MS doesn't NEED to make money on it. They can just pump in cash and have a brilliant tool for reaching UX developers around the world, regardless of whether they use Windows or not.

I feel like the GitHub group at Microsoft is now some sort of orphanage for mistreated developer tool startups. GitHub had similar management issues: they refused to build enterprise features at all for years unless they were useful to regular GitHub.com. And there were other people issues at the top for years. Chris seemed more interested in working with the Obama administration on digital learning initiatives than with running GitHub, for example.

[+] zozbot234|5 years ago|reply
> I mean, people were putting spam packages in NPM just to get SEO on some outside webpage through the default NPM package webpages. People were squatting and stealing package names. Leftpad... the community management here is nightmarishly hard

It's not NPM's fault (well, other than wrt. the leftpad thing), it's all about the "community". The JavaScript open source community is a dumpster fire.

[+] Cthulhu_|5 years ago|reply
They could have made a LOT of money (I think) by offering a "secure" registry - only packages that were reviewed, verified and signed would end up in there, and they would be sealed and made available forever. Companies could have their developers use only that because there is still a huge risk of a malicious actor pushing a patch version of a library with a security vulnerability in it, and at the moment security is still a reactive action in the Node ecosystem.
[+] pjc50|5 years ago|reply
> I feel like the GitHub group at Microsoft is now some sort of orphanage for mistreated developer tool startups.

That seems .. not so bad really? Microsoft gets to buy them cheaply, and they don't get obliterated or acquihired, and they don't seem to have Yahoo'd them into slow death either.

[+] redthrowaway|5 years ago|reply
npm is still a disaster, but for other reasons:

    $ time rm -rf node_modules/

    real    1m2.969s
    user    0m0.409s
    sys     0m15.853s
[+] gowld|5 years ago|reply
It's not a bad idea for MS to sponsor this stuff as a PR play, but it needs a strong commitment to ombudsing and community oversight, or it becomes just another Embrace/Extend/Extinguish.
[+] sytse|5 years ago|reply
Someone asked "Would this have made sense for a company like GitLab if they didn't have the corporate backing of something like MS?" and deleted their comment while I was writing the answer below:

Being the canonical registry for a language (Rubygems) or technology (DockerHub) tends to be a huge expense.

The main expenses are cloud costs (bandwidth and storage) and security (defense and curation).

I've not seen examples of organizations turning this into a great business by itself. For example Rubygems is sponsored by RubyCentral http://rubycentral.org/ who organize the annual RubyConf and RailsConf software conferences.

Please note that running a non-canonical registry is a good business. JFrog does well with Artifactory https://jfrog.com/artifactory/ and we have the GitLab Package Registry https://docs.gitlab.com/ee/user/packages/ that includes a dependency proxy and we're working on a dependency firewall.

[+] montroser|5 years ago|reply
I never quite got a warm-fuzzy feeling from npm -- the tool, the service, the company. This announcement does nothing to help, from my perspective. Is my dependency on this or that JavaScript library something that really needs to be owned by a for-profit company?

I also kind of wonder what is the real value of a centralized repository versus just directly referencing git repos. I haven't used this gpk[0] project yet, but it looks like an interesting alternative, on paper.

[0]: https://github.com/braydonf/gpk

[+] coderzach|5 years ago|reply
You'd be surprised how often git repos disappear when you have 100s or 1000s of deps.
[+] Waterluvian|5 years ago|reply
You can still reference repos directly with npm.
[+] KenanSulayman|5 years ago|reply
Much better: mandatory vendoring of packages. It can't break, and being forced to push the packages to the repo makes you appreciate the lack of transitive dependencies.
[+] Touche|5 years ago|reply
Immutability and semver are the reasons
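To make the parent's point concrete: a registry resolves a range like `^1.2.3` against an immutable list of published versions, which a raw git URL (one fixed ref) can't do. A toy resolver, handling only plain `x.y.z` versions and caret ranges with a nonzero major (real npm implements the full semver spec):

```javascript
// "1.4.2" -> [1, 4, 2]
function parse(v) {
  return v.split('.').map(Number);
}

// Does `version` satisfy a caret range like "^1.2.3"?
// Caret means: same major, and at least the range's minor.patch.
function satisfiesCaret(version, range) {
  const [ma, mi, pa] = parse(version);
  const [rma, rmi, rpa] = parse(range.slice(1)); // strip "^"
  if (ma !== rma) return false;
  return mi > rmi || (mi === rmi && pa >= rpa);
}

// Pick the highest published version matching the range.
function resolve(published, range) {
  const ok = published.filter(v => satisfiesCaret(v, range));
  ok.sort((a, b) => {
    const [x, y] = [parse(a), parse(b)];
    return (x[0] - y[0]) || (x[1] - y[1]) || (x[2] - y[2]);
  });
  return ok[ok.length - 1]; // undefined if nothing matches
}

const published = ['1.2.3', '1.3.0', '1.4.1', '2.0.0'];
console.log(resolve(published, '^1.2.3')); // -> "1.4.1"
```

Immutability is what makes this safe: because `1.4.1` can never be republished with different bytes, floating within a range stays reproducible via the lockfile.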
[+] animalCrax0rz|5 years ago|reply
This brought to mind that while Deno is still a WIP (for example, packaging of Rust plugins is not yet resolved) and the ecosystem around it barely exists, it was designed to have no dependency on third-party tools like npm and yarn.
[+] jakear|5 years ago|reply
It also provides none of the benefits of npm/yarn. In my understanding it’s as if every package you used pinned all of their deps.
[+] tracker1|5 years ago|reply
I'm a bit mixed... that said, it's only a small step apart from what rust does for packages...

I would like to see a package repo for deno, if only to ease publishing/finding modules.

[+] mtm7|5 years ago|reply
Out of curiosity, what benefits does Microsoft/GitHub get from owning a package registry? I'd be fascinated to learn more about their long-term strategy here.
[+] rl3|5 years ago|reply
Curious world we live in, where the infrastructure behind so many OSS projects can simply be acquired.

What's preventing the dream of decentralization from taking off? We have the technology.

[+] mjibson|5 years ago|reply
Money prevents it. It takes money to host things and pay people to work on infrastructure. While people often volunteer to contribute to OSS products because they like or use them, not many are willing to write infrastructure that can handle this kind of traffic in their spare time. Even if you can find someone to donate the time, you'd still need to fund that infra in some way. Having an infra company (say, Google donates a bunch of GCP credits) to cover the hosting costs still puts the project at risk if the host company decides to stop funding.
[+] toomuchtodo|5 years ago|reply
Start a non-profit 501(c)(3). Only a non-profit can acquire the assets of another non-profit, so it's somewhat of a poison pill. Budgets for orgs listed below are anywhere between $100k up to a few million dollars per year. This doesn't mean you can't run on a shoe string; Hacker News and Pinboard run on single servers with hot spares.

Examples: Let's Encrypt (Certs), Internet Archive (Culture), Quad9 (DNS), Wikipedia (Knowledge), OpenStreetMap (GIS), Python Packaging Authority ["PyPi"] (as part of the Python Software Foundation)

EDIT: Seriously, start non-profits whenever considering implementing technology infrastructure you're unlikely to want to extract a profit from and are seeking long term oversight and governance.

[+] paxys|5 years ago|reply
There's nothing stopping someone from pulling code from an alternate package registry, or directly from someone's computer. So all the required decentralization infrastructure already exists. For people to use it, though, it has to be convenient.

The average developer isn't interested in showcasing their social/political views, starting a revolution or building the future of the internet. They just want to get the job done as quickly and effectively as possible and go home.

[+] ilaksh|5 years ago|reply
I remember some years ago when npm was having a lot of stability issues because the guy just was not being given adequate time/resources from Joyent or whatever, I made a post on r/node saying now was the time for a fully decentralized package registry.

The post if I remember was mostly ignored, but received a few downvotes and maybe a couple of negative comments.

Based on that, it seems that what's preventing decentralization from taking off is ignorance and apathy.

A day or two later the guy announced npm, Inc. if I remember.

There are actually a lot more developers that have accepted a federated services worldview than a peer-based fully distributed one. But there are package registry projects along both of those lines.

https://github.com/orbs-network/decentralized-npm

https://blog.aragon.one/using-apm-to-replace-npm-and-other-c...

https://github.com/entropic-dev/entropic

https://github.com/axic/mango

But again, ignorance, apathy, and the status quo remain the most popular options.

[+] ocdtrekkie|5 years ago|reply
NPM by its very nature is a centralized repository, not really an agent of decentralization, where you'd get code from the authors' sites/servers directly.
[+] xrendan|5 years ago|reply
Who pays for that decentralized infrastructure though?
[+] DeathArrow|5 years ago|reply
>What's preventing the dream of decentralization from taking off? We have the technology.

Time and money.

[+] tracker1|5 years ago|reply
Discoverability, mostly... the social aspect is a second, smaller issue.

I think this is probably a good thing in general, and should maybe lead to some interesting enhancements, and maybe even finally solve the distribution of binary modules at a better level.

[+] doctoboggan|5 years ago|reply
Question from a new JS developer: Should I be using NPM to manage my dependencies?

I have recently started getting into JS programming. I have thus far avoided NPM, because I've been trying to use CDNs for all my external dependencies.

My thinking is that it saves me bandwidth costs and potentially saves my user's bandwidth as well if they get a cache hit.

I get the downsides are that I don't control the CDN and they could go offline, but honestly I expect I am much more likely to go down from some mistake in my own deployment rather than a well known CDN being offline.

I am wondering if I am missing something though, because absolutely every JS package I read about suggests you use NPM (some also link a CDN, many don't). Should I be using NPM to manage my JS dependencies instead of using CDNs?

[+] giantDinosaur|5 years ago|reply
IIRC it turns out the cache hits from CDN'ed JavaScript files ended up being fairly low and negligible, due to how many different versions there are of everything. Better to just reduce the file size.
[+] ehnto|5 years ago|reply
If you can afford to load an image, you can afford to send a JS file. Compile all your JS in to one file and host it yourself. The original sell for JS CDNs was that the library would already be cached from the user visiting another site, and that it would serve it from a local edge. It's really not that big of a sell, and comes with a bunch of risk.

CDNs are far less reliable than my own site, and if my own site is down it's not much help that the CDN is up. Pull in two libraries from CDNs and suddenly you have three points of failure instead of one. Their traffic spikes become your traffic spikes, their downtime is your downtime. And for what? The possibility that maybe the user had that one tiny js library cached, or that the CDN has a node 100ms closer? Not worth it.

[+] lioeters|5 years ago|reply
Personally, I only use external library CDNs during early stages of development, or quick prototypes.

There are advantages to having all needed assets locally. The main point for me is to minimize external dependencies during runtime - fewer points of failure. Also, vendor libraries can be a single minified bundle served from the same domain. In production they can be moved to a CDN, e.g., Cloudflare.

Using NPM makes sense once you start having more than a few dependencies, or a build step.

On the other hand, if you can get by with library CDNs and don't feel the need for NPM - I'd say that sounds fine, to keep it simple and practical.

[+] armatav|5 years ago|reply
Good luck not using NPM - it's completely pervasive. CDN should be used in certain contexts, for everything else you're best off using NPM.
[+] fzil|5 years ago|reply
dang, Microsoft going around acquiring dev tools like it's a Monopoly game
[+] Pmop|5 years ago|reply
I don't have a good feeling about this kind of centralization.
[+] pavlov|5 years ago|reply
“DEVELOPERS DEVELOPERS DEVELOPERS!” — Steve Ballmer, 2000
[+] judge2020|5 years ago|reply
I hope this only goes as far as being able to sign up with and link a GitHub account to NPM. Any tighter integration seems like it would be in bad faith, in terms of allowing integration with other git services/non-GH package hosting.
[+] rhacker|5 years ago|reply
They just made Github teams free, so I imagine npm private repos is next?
[+] aforty|5 years ago|reply
I like how Microsoft basically just acquired a whole slew of open source tools and no one seems to notice or care.
[+] cbhl|5 years ago|reply
I feel like Microsoft has actually built a sustainable, developer-friendly, OSS-friendly business around Azure, Visual Studio, and GitHub.

I could imagine other potential acquirers just wanting to do an acquihire, and deprecating the service a few months or years later (when the technical debt is too high to continue running the code in maintenance mode).

[+] metreo|5 years ago|reply
Why should anyone? Open source has always been about freedom. MS hasn't acquired the tools to my knowledge, they've acquired popular infrastructure built around tools. Anyone can still start GutBub based on the same `git` source tomorrow if they wanted.

To be sure a lot of this may be about talent as well. They may need to be liked in the community otherwise they have a hard time hiring top talent. They are buying a community since they probably couldn't build it themselves.

[+] exolymph|5 years ago|reply
You realize that you're commenting on a story that either is or was on the front page of Hacker News, right? Are we no one?
[+] wp381640|5 years ago|reply
I'd say the three biggest namespaces in dev are github, npm and docker hub - will Microsoft go 3 for 3?

Docker Hub feels a bit neglected - it could be aliased to docker.pkg.github.com and that'd be a huge improvement

[+] paxys|5 years ago|reply
Docker is definitely getting acquired in the near future, and Microsoft is as good a guess as any.
[+] kalium_xyz|5 years ago|reply
NPM is joining GitHub => NPM has joined GitHub
[+] chvid|5 years ago|reply
It would be nice to have free private npm repositories like the free private github repositories ...
[+] asiachick|5 years ago|reply
Hopefully they'll revisit NPM's decision to sanction ads in install scripts.
[+] anm89|5 years ago|reply
Npm has joined Microsoft