It will mean the death of Maven Central, about which I have mixed feelings. On the one hand, Sonatype deserves enormous thanks for what they have done for the open source world, as does mvnrepository.org. Their central repository has been free and maintained for a long time. Thank you, Sonatype.
On the other hand, it took me three days to release a new version of one of my artifacts the other day. The process for doing a Maven deploy is very complex. It took hours to get my private key to work because the key registries were slow. Then the staging server was slow, and kept timing out. Support was responsive, and said they were dealing with a DDOS attack. On top of that, it takes a while for artifacts to show up in the registry even after they have been uploaded. I'm glad that getting that artifact out wasn't an emergency.
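To make the pain concrete, a release to Central via Sonatype OSSRH looks roughly like this — a sketch from memory, and the exact plugin and profile names vary by project:

```shell
# Publish your GPG public key so Sonatype can verify your signatures
# (the step that stalls when keyservers are slow or unreachable):
gpg --keyserver hkp://keyserver.ubuntu.com --send-keys <KEY_ID>

# Sign the artifacts and upload them to the OSSRH staging repository
# (assumes a "release" profile wired up with the gpg and staging plugins):
mvn clean deploy -P release

# Then close and release the staging repository (via the Nexus UI or the
# nexus-staging-maven-plugin), and wait for Central to sync.
```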
This new Github service separates the registry from the artifact storage, which is the right way to do it. The registry should be quick to update because it's only a pointer. The artifact storage will be under my control. Credentials and security should be easier to deal with. I really hope this works out.
This is pretty interesting. Github really is becoming the social network that MS never seemed to be able to create. We already use it as our portfolio of work for potential employers. We collaborate with fellow enthusiasts and maybe even make new friends. We host our websites from it. We abuse it to store binaries, too. And now, alongside source code, we can use it as a CDN of sorts to serve packages, for free, which sounds pretty great. All they need now is a place to get coding questions answered (a la Stack Overflow), and along with Github Jobs it could be really compelling.
Pure speculation, but it would not surprise me to wake up someday and see MS has bought Stack Overflow. Given their direction of integrating the entire developer experience, it would make sense. MS is upgrading technical docs across the board; organizing and linking to SO content would fit right in.
The term "social network" has become too vague in 2019. You have to append a purpose to each one: i.e., Yelp is a social network for food, LinkedIn is a social network for professionals, and GitHub is one for developers.
Each one will serve a niche, which is much harder to supplant because there's a common purpose. In contrast, when people think of Facebook, they just associate it with being 'the' social network, not one for a special purpose.
> All they need now is a place to get coding questions answered
I think Github Issues has already started doing that. Personally, I've been finding more help from Github issues than Stack Overflow, plus I find myself asking questions or submitting bugs on GH a lot more than asking something on Stack Overflow. In fact, I've not asked anything on SO for years now.
I definitely see some people (ab)using Issues as a way to ask fairly generic coding questions. It might be time they open up another avenue for questions generally.
Or you know, it could just focus on its core competencies and be good (great?) at what it does. They don't need to eat the world to provide a positive impact to it...
It's a really nice project overall; having a registry that supports many different ecosystems, run by a company that today is good, is always nice.
But we've been here before. We trusted npm, and now they are trying to squeeze out a profit, and it ruins it for the users. I'm happy to be proven wrong, but every for-profit company that runs a package registry eventually stagnates and ends up implementing things that are not for the users, but for their own profits.
I think package management, especially for open source, should not be run by for-profit entities. We need something similar to public utilities, where the community funds the registry itself and can own it as well, and where the only changes allowed are changes that are good for the users.
This is not that. npm and Docker are already run by for-profit companies, so this move by GitHub just adds another centralized package registry for those. It's not worse, but it's not better either. I'm a bit mad about the RubyGems part though, as RubyGems is a community project, and they are trying to make it not so, making it worse.
It's basically a community-funded decentralized package registry, where the community funds it and is part of the ownership of the registry, handled via a governance model followed by the contributors. All the finances, development, and planning happen in the open, and Open-Registry is committed to never making changes that are for increasing profits, only changes that make the service better for users.
Please, if you have some free minutes, check it out and write down some feedback. We might not be the perfect package registry overnight, but I'm hard at work getting as close as possible without compromising user value.
There's something slightly concerning about ceding responsibility for distributing the world's open-source projects from a family of strong independent repositories to a centralized platform owned by a tech giant.
Yes, but that's not a new concern - to some, GitHub has always represented an anathema to what git was supposed to be and bring. Centralization at a proprietary vendor, instead of open systems interacting. Then locking people in further by network effect and adding centralized products around git. That it's become so popular many people equate GitHub with git adds insult to injury.
I completely understand why this all happened (centralization is just so easy and convenient; federation is hard), and it was probably inevitable in its timeframe, but I also wish it wasn't so. It's not quite what we imagined when we made the leap to dscms in the early aughts.
All the good stuff is still in there, though, and it's still as possible as ever to do different things, so it's not a bleak situation.
That is, indeed, a fair point of concern. But in practical terms, I would place Github very high in any ranking of good things that happened to open source.
It's possible many people have forgotten, or are too young to remember, how the ecosystem worked pre-Github. There was SourceForge, which wasn't quite the disaster it is today, but also not very good. But mostly I remember every project using different, often hand-rolled systems. Patches had to be sent in by mail. Every project had its own conventions for where to send patches, what formats to use, what additional information to provide, etc.
Just try figuring out how to get a patch into Debian, which is still where most projects were ca. 2005. I won't wait.
I never contributed to OSS pre-Github. These days, I routinely send in a patch for smaller things I encounter, a few times per week. Over time, I have also become a more involved contributor to two projects. I doubt this would have happened without the flat learning curve that Github provides.
I wouldn't be surprised if both the number of contributors and total contributions to OSS have soared by a factor like 5x even above just the growth in OSS usage, and Github is the obvious reason for it.
Hypothetically, and, currently, only hypothetically:
If Microsoft is still of the old spirit, then what we see now would be the biggest "Embrace, Extend, Extinguish" coup they have ever pulled off.
It won't happen now, and it won't happen tomorrow; that would be too big an effort. But Microsoft is trying hard to win back the hearts of "The Community" and "The Market". Since the advent of the web, people, especially developers, have become clever enough about computers to live in IT without being "shown" and "taught" by "Big Daddy" type companies; everyone is much more self-organizing these days, so there is much more competition for MS than there was in the past. So they are trying to win back what they have lost.
* VSCode is very sweet, with lots of bells and whistles and major software companies writing plugins for it (the most active being Microsoft). As a programmer's text editor it sits right at the core of all development.
VSCode is especially attractive to people outside of MS Windows (who might otherwise use Visual Studio). I am talking about web and "app" developers, mostly frontend or mobile.
* By buying Github, they bought the "source of all sources". They won't ever own the code, but as long as they own the popular infrastructure, everybody is playing on their grounds. The next step, in two years or so, may be requiring an MS account to log into Github. They integrate it.
Out of curiosity, what else did MS buy in the last years, that would fit into this pattern?
Hopefully something along these lines will also be added to GitLab.
I share your concerns, but I've also long had the feeling that both NPM and Maven are a security disaster in the making.
Having the dependencies published from the same place that stores the actual code gives me a little hope that things will improve from a security and design perspective.
Ideally I would like the social aspects of GitHub (trending/popular repositories, starring projects, notifications, etc.) but with decentralized hosting. Something that would be to GitHub what Mastodon is to Twitter.
While the technical side of the news is interesting, the organisational repercussions worry me. Microsoft (who owns GitHub) is already one of the largest tech companies, and I would not be surprised if this move was intended to weaken NPM and Docker in an attempt to acquire them.
I fear a future where everything one requires to develop "socially" depends on a single super-entity. GitHub and VSCode were the first steps in that direction, and now package management. My guess would be for CI/CD to be next on their list, with more integration of Azure somehow (potentially under the hood).
I'm glad you brought up Docker, but I think this is a move against GitLab, more than it is against NPM or Docker.
Lots of us use GitLab at work because it's such a complete product. Source code, container registry, CI/CD, Issues (via GitLab or Jira), Maven repository, NPM repository, etc. etc.
Microsoft is trying to build out GitHub so that they can more effectively compete for GitLab's corporate customers. Since buying GitHub they've added many of GitLab's key features to GitHub and these are some of the biggest adds so far.
You might be right that this hurts NPM and Docker, but I think it'll hurt GitLab more.
Microsoft has been in this game for a while with Visual Studio, TFS, and other tools. The same strategy is just now catching up to a larger set of better tools.
IBM, I believe, tried to do this with their 'Rational' tool line, and they're still buying into the game (UrbanCode).
That would make sense as a worst case scenario but I'm not sure the evidence suggests that's the route they're going. If they wanted to acquire a CI/CD product, they would've bought Travis when it was being shopped around for a buyout.
I guess this is the risk of working on a product that could be easily added as a feature to a much more popular product. But, hey, Dropbox is still successful.
The npm registry started out as a hobby project that was eventually backed by the company its creator worked for. He then decided to spin the project out into his own startup, which raised some eyebrows because suddenly it looked like there was a lot of hostility between him and the company that had previously footed the bill for little more than marketing value. It was also completely unclear how the startup was supposed to make enough money to be viable.
Additionally during its "nice people matter" phase npm Inc seemed to be more focussed on creating a nice environment for its employees and maintaining its ethical values than creating anything that might generate a profit.
The two most obvious monetisation options were private packages and enterprise self-hosting. But when private packages had become a thing there were already third-party open source clones of the npm registry that offered this feature (first sinopia, now verdaccio).
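For reference, standing up such a clone is genuinely low-friction — a minimal sketch using verdaccio's defaults (port and commands per its current docs):

```shell
# Start a local registry (listens on http://localhost:4873 by default):
npx verdaccio

# Point npm at it, create a user, and publish privately:
npm set registry http://localhost:4873/
npm adduser --registry http://localhost:4873/
npm publish --registry http://localhost:4873/
```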
There's really no way to monetise the registry itself directly because users simply aren't willing to pay for a service they expect to be free (like maven, rubygems, PyPI, etc). It would have been more logical to create a non-profit (or at this point transferring the registry to the OpenJS Foundation being the more obvious choice) instead of a for-profit startup.
npm Inc was doomed from the start. Even after acquihiring ^Lift to build the security audit feature there's simply no significant value in what npm Inc offers for money compared to what's already available for free.
The recent CEO change feels like a desperate move by the stakeholders to avoid becoming the next RethinkDB (which also ultimately failed to come up with a way to make money other than support licensing, i.e. renting out access to their developer time).
This is what people criticised when npm Inc was initially spawned: investor money isn't free money and having investors doesn't mean you can perpetually operate at a loss. Investors want significant return on investment, at least eventually. That means either selling out by being acquired (and likely killed) or becoming massively profitable (or surviving long enough while generating enough "value" to go public).
A package registry is a cost center that in order to be valuable needs to be practically guaranteed to exist forever. Maybe if GitHub manages to kill npm Inc they'll finally admit this and transfer the registry and client to a non-profit like the OpenJS Foundation.
We were contacted by NPM to switch our Enterprise account from self-hosted to hosted by NPM.
There are three problems with this for where I work.
1. What if NPM goes bust? What happens to our packages?
2. What if NPM gets hacked? What happens to our packages?
3. The increase in price was HUGE, which was probably the reason for forcing us to migrate to their new cloud-hosted option.
Look, it's an open secret at this point that NPM is in trouble: it's fired a bunch of staff, and other staff have quit. The new CEO is all about profit, and it's just the beginning.
Actually, it has never felt natural to me to publish a Node.js package to two websites, both Github and NPM. Moreover, when Google lands me on NPM's website, I prefer to navigate right away to Github. If this new thing from Github replaces NPM so that there's only one place for all of that, I would not mind.
I'm worried about the resiliency of code distribution as we continue the trend of centralizing distribution in a few large companies. GitHub has had service outages in the past, so what happens when not just our repositories but also now packages are not accessible the next time that happens? It would be great if they'd implement it using an open/decentralized protocol such as IPFS, so that even if GitHub went down the content would still be accessible.
The problem is that hosting and bandwidth aren’t free and abuse is a big problem. Managing a distributed petabyte-scale archive which gets updated so frequently is a significant engineering problem even for a single party — now consider how you’d handle redundancy and routing when you can’t rely on any of the parties involved, and you have enough different objects being accessed to turn away most participants unless you can guarantee that participating won’t blow your ISPs data caps, interfere with other use, etc.
Abuse is the other huge problem: think about what happens when you’re hosting some BLOBs and the FBI shows up at your door because someone uploaded some kind of contraband and some of it was available from your IP address. How many people are going to setup completely independent hosting accounts to avoid fallout from something like that which happens so regularly?
The closest thing which comes to mind is the Debian mirror network and that is something of a historical fluke, predating centralized hosting being possible, and scoped to a much smaller set of more trusted participants. That also hits the big problem that even with a fair amount of infrastructure backing it, it’s hard to match the user experience of something like Github or NPM so the most likely case is spending a lot of time in hard problems but not overcoming the basic economics, as seems to be happening to IPFS.
Doesn't this bifurcate the namespace of literally every packaging system they are supporting, or are they requiring `@author/`-namespaced package names?
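For npm at least, GitHub's docs suggest the latter: packages are scoped to the owning account, which sidesteps the flat-namespace clash. A sketch, with OWNER as a placeholder:

```shell
# .npmrc — route one scope to GitHub Packages, leave the rest on npm:
#   @OWNER:registry=https://npm.pkg.github.com

# Consumers then install by scoped name:
npm install @OWNER/package-name
```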
In the livestream he pokes around a github repo, sees it has one author, and decides that's what makes it trustworthy? No GPG signing?
The new Actions support (about 50 minutes into the live stream) for auto-publishing from master is pretty sweet. From the very cursory demo, it seems very much like Gitlab's CI pipelines.
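From the demo, the workflow looked broadly like the following YAML sketch — the action names, versions, and built-in token are assumptions based on the documented Actions format, not a transcript of the stream:

```yaml
# .github/workflows/publish.yml
name: Publish package
on:
  push:
    branches: [master]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - uses: actions/setup-node@v1
        with:
          registry-url: https://npm.pkg.github.com
      - run: npm install
      - run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```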
Is the centralization of open source a good thing for the world or not? This thread seems to be overwhelmingly positive, and yet in the end we will all be criticizing it if all package repositories end up handled by a single entity. And the entity being applauded here happens to be the most valuable corporation in the world right now. Healthy skepticism seems to be a disappearing attribute in the tech world.
Seriously. I love Github but I don't know how to feel about a megacorp becoming the de facto source for packages in the open source ecosystem. It could be great but many of us thought that consolidating all of our social activities under the Facebook umbrella was going to be great
As usual it will take a disaster for people to realise it was a bad idea. Microsoft tried to destroy Linux in the past. Literally. Linux is what gave us git in the first place, and docker, and so much technology that we love today. Oh how quickly the past is forgotten when convenience is on the table.
Do I want to use Github for this? I kind of like the npm model where they say "don't cache it, we guarantee as much capacity as you want to re-download packages". I use a lot of go modules, and each of our container builds ends up fetching them all. Github rate limits this and you have to either vendor the modules or provide a caching go module proxy (Athens, etc.). Meanwhile, npm just uses Cloudflare which seems happy to serve as many requests as I desire.
In general, I find that caching/vendoring dependencies is the most sane thing to do, but it's not what, say, the Javascript world appears to be doing. Do we want to move towards a service that already rate-limits package fetches when we already have a service that doesn't?
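Both workarounds for the Go case are cheap, for what it's worth — a sketch (the Athens address is a placeholder for whatever proxy you run):

```shell
# Option 1: vendor dependencies into the repo so builds need no network:
go mod vendor              # copies all modules into ./vendor
go build -mod=vendor ./...

# Option 2: route module fetches through a caching proxy such as Athens:
export GOPROXY=http://athens.internal:3000
go build ./...
```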
This is big. For a while we have needed a simple, intuitive and centralized artifact storage system for the modern age. I’ve been wanting to build something for ages but never made the time.
I also think this will have the side effect of exposing a lot of people to package/build/dist tools from other ecosystems, which might help disseminate best practices outside of their walled gardens. Github helped do this with code, putting the spotlight on less popular or more cutting-edge languages.
This is going to solve a lot of problems for a lot of people.
This is super cool, but I worry that we've basically let a proprietary closed-source service be the de-facto standard for open source software. That really hampers my enthusiasm here.
This is going to be a huge hit to things like NPM Enterprise and Artifactory. It's especially useful for small/medium teams that want to start from the get-go with an easy way to share modules that will scale as they grow.
I'm disappointed it doesn't support Python. There are not a lot of options available for private Python package hosting; it would have been good to have another one.
What I'm currently working on is what I think an Open Source Public Utility would look like. I just submitted a Show HN to show it off; you can see the submission here: https://news.ycombinator.com/item?id=19885502 Website is https://open-registry.dev
But npm has recently changed their nice-people-matter CEO to a now-print-money dude, so I suspect investors' patience has run out.
And now GitHub went directly after the one thing that npm is supposed to be making money on.
Here's what we did.
We spun up a Nexus instance: https://www.sonatype.com/download-oss-sonatype
It has an NPM plugin, so we get our private repo and host it ourselves. It's exactly what we want.
Honestly, if you're an Enterprise customer, this is something to consider.
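Once the npm-format repository exists in Nexus, pointing a project at it is one line of config — a sketch, with the host and repository name as placeholders:

```
# .npmrc
registry=https://nexus.example.com/repository/npm-group/
```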
When I have tried to promote them I've been downvoted. It seems pretty strange to me.
Say I want to install artifacts from two GitHub users. I would have to add these two Maven repositories:
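The snippet itself didn't survive in this comment, but per GitHub's Maven docs the repository URL is per-user, so the two entries would look something like this (host and layout reconstructed from the linked docs; USER1/USER2 are placeholders):

```xml
<repositories>
  <repository>
    <id>github-user1</id>
    <url>https://maven.pkg.github.com/USER1</url>
  </repository>
  <repository>
    <id>github-user2</id>
    <url>https://maven.pkg.github.com/USER2</url>
  </repository>
</repositories>
```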
In that case USER1 can publish an artifact with the same groupId/artifactId as USER2 and my Maven will happily install it without suspecting anything. Another case: someone deletes their GH account and another user takes it: https://blog.sonatype.com/hijacking-of-a-known-github-id-go-...
Docs: https://help.github.com/en/articles/configuring-maven-for-us...
As for account hijacking... I guess GH needs to track account deletions and append incrementing suffixes to usernames under the repository.
Also, people strangely do not seem to be aware of the potential of truly decentralized p2p technologies to provide alternatives.
The ability to give anyone with access to the repo access to the packages created from it removes an often frustrating management step.
Remember this when you guys all rush to sign up for their new services because it's easier for you now ;)