rcbdev|2 years ago
Imagine you have a team migrating the federal tax system of a smaller country. You can choose to develop the project in C and have X devs work on it for 3-4 years, 40-50h a week. The resulting code base will be huge and probably hard to maintain.
Or you could choose to hire devs in a higher-level language, maybe X/2 or X/1.5 devs - saving you the environmental cost of running possibly hundreds of workstations every single day for years at a time, and the administrative overhead of more employees. The resulting code base will also be smaller and (hopefully) easier to maintain.
I don't think you could call this immeasurable at all. (This is based on a real example.)
vidarh|2 years ago
Not just the workstations. You really need to factor in the other carbon output of those developers for the time spent on the project too, not just the tools.
And you then need to consider to what extent any performance saving will be efficiently captured as reduced power use.
But, yes, fully agree.
One system I worked on recently involved about 20 developer years of effort, and about 40 core years of computation... I'm pretty sure the developers' machines combined used far more energy than the production systems, for a Ruby deployment, before factoring in any other energy use relating to the difference in effort.
flohofwoe|2 years ago
IME development resources will always expand to the available budget and time anyway; the programming language or programmer skills really don't matter much.
Also, whether C is actually more or less "productive" than other languages for specific tasks isn't all that clear either. For the things I pick C for (for instance cross-platform libraries sitting between OS APIs and user code, and home computer emulators), it is also the most productive option (in the sense that higher-level programming languages wouldn't make me more productive, because all that's needed for this type of stuff is functions, structs, loops, conditionals and a handful of math operators - and all those things are in C).

High-level features like automatic memory management don't make much sense when there's hardly any heap memory to be managed, and a rich stdlib also isn't needed when all you do is number crunching and bit twiddling.
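A hypothetical sketch of the kind of code this describes - plain structs, functions and bit twiddling, no heap, no stdlib beyond the basics. All names here (`cpu_t`, `adc`, the flag constants) are invented for the example, not taken from any real emulator:

```c
#include <stdint.h>

/* Invented example: updating an emulated 8-bit CPU's status flags.
   Just structs, functions and bit operations - nothing a higher-level
   language would meaningfully shorten. */

enum { FLAG_CARRY = 1 << 0, FLAG_ZERO = 1 << 1, FLAG_NEG = 1 << 7 };

typedef struct {
    uint8_t a;      /* accumulator */
    uint8_t flags;  /* status register */
} cpu_t;

/* Set/clear the zero and negative flags from a result byte. */
static void set_nz(cpu_t *cpu, uint8_t val) {
    cpu->flags &= (uint8_t)~(FLAG_ZERO | FLAG_NEG);
    if (val == 0)   cpu->flags |= FLAG_ZERO;
    if (val & 0x80) cpu->flags |= FLAG_NEG;
}

/* Add-with-carry: widen to 16 bits to detect the carry-out. */
static void adc(cpu_t *cpu, uint8_t operand) {
    uint16_t sum = (uint16_t)cpu->a + operand + (cpu->flags & FLAG_CARRY);
    cpu->flags &= (uint8_t)~FLAG_CARRY;
    if (sum > 0xFF) cpu->flags |= FLAG_CARRY;
    cpu->a = (uint8_t)sum;
    set_nz(cpu, cpu->a);
}
```

An emulator is thousands of lines of exactly this shape, which is why features like garbage collection or generics buy so little here.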