(no title)
flowerbreeze | 1 year ago
The build/deployment time difference is maybe the least relevant, but it is also there most of the time, because Docker performs more work than a simple zip+scp plus an scp copy of the version to an archive somewhere. Docker needs to copy far more than just the application files. An extra copy of ~100MB of data (OS + required env) during deployment, when the application files themselves are only ~1-2MB, tends to add quite a few seconds to the process, although how much it matters depends on network speed of course. For example, on my modest connection it'd be ~8-10 seconds vs <1 second.
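The arithmetic behind that claim can be sketched as follows; the link speed, image size, and app size are illustrative assumptions, not measurements from the comment:

```shell
# Back-of-envelope deploy-transfer times.
# Assumed numbers (hypothetical): ~12 MB/s uplink (a "modest connection"),
# ~100MB image (OS + required env), ~2MB of application files.
LINK_MBPS=12   # MB per second
IMAGE_MB=100   # full image: OS + runtime layers
APP_MB=2       # application files alone

image_secs=$(( IMAGE_MB / LINK_MBPS ))  # whole-image transfer, in seconds
app_secs=$(( APP_MB / LINK_MBPS ))      # app-only transfer, rounds to 0 (<1s)

echo "full image: ~${image_secs}s, app only: <1s"
```

With these assumed numbers the full image takes roughly 8 seconds while the app files alone transfer in under a second, matching the rough figures above.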
There are of course great reasons to use Docker, such as a larger team that needs a common environment setup, or languages that don't have a great dependency management system (e.g. their builds are not transferable between systems), but it is something "extra" to maintain.
sepositus | 1 year ago
> Adding an additional layer also means that layer needs to be managed at all times, and additional setup is required to use it. This starts at installing docker related tooling, having to do extra work to access logs inside containers, additional infrastructure management/maintenance (eg private repository), Docker compatibility between versions (it's not very good at maintaining that) etc
Docker is available on every major distribution, and installing it takes seconds. Accessing logs (docker logs mycontainer) takes just as long as with systemd (journalctl -u myservice). Maintaining a registry is optional; there are dozens of one-click SaaS services you can use to instantly get a registry, many of them free. Besides, I would consider the registry to have significant time-savings benefits of its own, since it lets you properly track builds.
> Docker needs to copy far more than just the application files. Avoiding an extra copy of 100MB data (OS + required env) during deployment
This is only partially true. Images are layered, and if the last thing you do is copy your binary into the image (standard Docker practice), then the deploy can take essentially the same time, because only one new layer (the size of the application) is downloaded. Only on brand-new machines (an irrelevant category to consider) is it fully true.
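The "binary last" layering described above can be sketched as a Dockerfile; the base image, package, and paths here are illustrative assumptions:

```dockerfile
# Base layer: pulled once, then cached on every deploy target.
FROM debian:bookworm-slim

# Rarely-changing dependency layer; Docker reuses its cache
# unless this instruction (or a layer above it) changes.
RUN apt-get update \
    && apt-get install -y --no-install-recommends ca-certificates \
    && rm -rf /var/lib/apt/lists/*

# The application binary goes last: on redeploy only this small
# final layer is rebuilt and re-downloaded, not the ~100MB beneath it.
COPY myapp /usr/local/bin/myapp
CMD ["myapp"]
```

Inverting the order (copying the app before installing dependencies) would invalidate the cache on every code change and force the large layers to be rebuilt and re-pushed each deploy.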
worik | 1 year ago
Unless it is not, in which case you should use Docker.
But many (all?) of us have had the experience of a manager insisting on some "new thing" (LLMs are the current fad of the day) on the grounds that if we are not using it, we are falling behind. That is not true, as we all know.
It is very hard for the money people to manage the tech stack, but they need to; it is literally their job (at the highest level). We desperately need more engineers who are suited (I am not!) to going into management.
the__alchemist | 1 year ago
Seconds are an eternity in the domain of computing.