andyonthewings|3 years ago
First of all, while desktops and mobile phones may be more powerful now, we're also getting more and more low-spec devices, like smartwatches. And even once smartwatches are powerful enough, there will eventually be even smaller computers: smart contact lenses, nanorobots that run in blood vessels, etc.
Secondly, efficiency still pays off on powerful computers. A small percentage of cost reduction can be a big saving for large corporations, and a small percentage of energy saved across all (or even just a portion of) the computers in the world can be a big win for the environment.
Lastly, we also run VMs and containers everywhere. Notice how we've already come up with all kinds of ways to minimise VM and container size and footprint, in order to run more of them, start them faster, and transfer images quicker.
josephg|3 years ago
The only argument for dynamic linking being more efficient is that each dynamic library can be shared between all the programs that use it. But that isn't a net win in all cases: when you dynamically link a library, the entire library is loaded into RAM, whereas when you statically link, dead code elimination means only the code you actually run from the library needs to be loaded.
But honestly, none of these arguments are strong. Dyld is fast enough on modern computers that we don't notice it. And RAM is cheap enough these days that sharing libraries between applications for efficiency feels a bit pointless.
The real arguments are these:
- Dynamic linking puts more power in the hands of the distribution (eg Debian) to change the library that a program depends on. This is used for security updates (eg OpenSSL) or for UI changes on Apple platforms. Dynamic linking is also faster than static linking, so compile times are faster for large programs.
- Static libraries put power in the hands of the application developer to control exactly how our software runs. VMs and Docker are essentially wildly complicated ways to force static linking, and the fact that they're so popular is evidence of how much this sort of control is important to software engineers. (And of course, statically linked binaries are usually simpler to deploy because static binaries make fewer assumptions about their runtime environment.)
pjmlp|3 years ago
Both approaches have pluses and minuses regarding efficiency.
acomjean|3 years ago
Honestly, a ton of code is utility code that isn't run that often, and nobody wants to take the time or expense to rewrite that Perl/Python/bash script.
For some code, efficiency matters much more than for other code.
saghm|3 years ago
My desktop/laptop distro dynamically links everything by default just so it can be slightly less work to port to a smartwatch? That sounds like a reasonable argument for making dynamic linking possible, but it doesn't seem compelling as the default for the entire world.