mc4ndr3 | 3 years ago
Not to mention vertically integrating the entire Docker layer set defeats the whole point of using Docker in the first place.
tehbeard | 3 years ago
What they're suggesting is basically setting up a local cache in between them and the "main repo", making sure the cache doesn't evict entries after x days, and/or keeping backups of the images they depend on.
If a package disappears, or the main repo falls over (cough github, cough), your devs, CI & prod aren't sat twiddling their thumbs, unable to work...
And if the package is nuked off the planet? You've at least got some time to find an alternative / see where it moves to.
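For Docker images specifically, the open-source registry supports exactly this pull-through cache mode. A minimal sketch (the port, container name, and cache directory here are arbitrary choices, not anything from the thread):

```shell
# Run the open-source registry as a pull-through cache of Docker Hub.
# REGISTRY_PROXY_REMOTEURL switches it into proxy (mirror) mode;
# cached layers persist in the mounted volume even if Hub is down.
docker run -d --name registry-mirror \
  -p 5000:5000 \
  -v /srv/registry-cache:/var/lib/registry \
  -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
  registry:2

# Then point the Docker daemon at the mirror, e.g. in /etc/docker/daemon.json:
#   { "registry-mirrors": ["http://localhost:5000"] }
# and restart the daemon. Pulls go through the cache transparently.
```

Note the registry's proxy mode caches images on first pull rather than mirroring everything eagerly, so you'd want to warm it with the images you depend on; by default it also garbage-collects, so check the TTL settings if you want layers kept indefinitely.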
chaxor | 3 years ago
I would expect the security and quality of images in a decentralized system to be far superior to those in any centralized system spun up by some for-profit entity.
* Malware and spyware could be defined here as software that allows remote keylogging, camera activation, installation of arbitrary executables, etc. - i.e. root access - which is precisely what most corporate entities build software to do (e.g. the "security solutions" you have to install on your work computer). The same goes for most web services, which these days are 90% tracking with the occasional desired application or feature.
twblalock | 3 years ago
Not doing that is unusual, and actually less secure. Do you think it's sane or secure for all of your builds to depend on downloading packages from the public internet?
wlesieutre | 3 years ago