mjw1007 | 5 months ago
But that would compete with the commercial offerings of at least one of the organisations sponsoring that message. So I expect they won't do that.
TheRealBrianF | 5 months ago
I covered some of this in one of my previous blogs, where I talked about the systemic challenges I've uncovered here. Of the heavy users I spoke to, 100% had a repository manager (some Nexus, others Artifactory), and yet the high levels of consumption persisted. I discuss some of the reasons for this in the blog linked below, but I think it refutes the theory that simply adding yet another caching proxy solves the problem. It really doesn't. Additionally, as Mike discussed, bandwidth is only part of the challenge. Without the people behind the repositories handling malware response, namespace curation, and so on, there would be nothing to proxy in the first place.
https://www.sonatype.com/blog/free-isnt-free-the-hidden-cost...
michaelw | 5 months ago
That said, I would love to see more organizations implement private staging repositories for their upstream package supply. That is where they can, and should, apply policies to protect their applications.
Developing a single multi-protocol caching proxy, or even multiple open source ones, will cost real time and money. I'd love to see more solutions here, but at this stage it will take more than a few volunteers and a "PRs welcome" in the README.
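For what it's worth, once an organization does stand up a private staging repository, pointing clients at it is the easy part. A minimal sketch, assuming a hypothetical internal host `repo.example.internal` fronting PyPI and npm (the hostname and paths are placeholders, not a real service):

```shell
# Point pip at the internal staging repository instead of pypi.org.
# (repo.example.internal is a hypothetical placeholder host.)
pip config set global.index-url https://repo.example.internal/pypi/simple

# Point npm at the internal registry instead of registry.npmjs.org.
npm config set registry https://repo.example.internal/npm/
```

The hard part, as the comments above note, is everything behind that URL: policy enforcement, curation, and malware response, not the client-side configuration.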
pabs3 | 5 months ago