hyporthogon | 6 years ago
In these situations, certain 'core' parts of the code quickly become the only accurate spec. (Whether it's worth updating the spec docs is a management decision; the test scripts will pass as long as the code is correct, and under sufficient time and money pressure the docs simply never get updated.) Other developers then treat these 'core' parts of the code (usually some fairly high-level classes, though something more concrete than an interface) as the true documentation of the business requirements. If the company respects developers enough, the developers who worked on those 'core' bits of code also end up treated as domain experts in future business discussions.
On the purely technical side: sheesh, how much heat is generated by horrifyingly inefficient algorithms, vastly I/O-wasteful designs, or plain redundant work (for instance, dogmatic/unnecessary use of immediate-mode GUI) -- stuff that quite possibly the IT managers don't care about at all (thanks to, e.g., cheap horizontal scaling and inadequate measures of software project success)? The heat is bad for ecological reasons (locally at least), but also intrinsically objectionable: erasing a bit has a minimum thermodynamic cost per Landauer, so why are you destroying information, O Information Worker?? (And again, e.g., Toffoli gates fix this only locally.) Based on code I've seen and, sadly, written (laziness, time pressure, and all that), there must be many, many orders of magnitude of unnecessary heat/information-destruction happening because of purely technical decisions made by on-the-ground developers -- not even architects/designers, I mean the people who write the stuff that actually gets compiled or interpreted. @OP, if you know some way of measuring this I'd love to hear more.
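One crude way to get at a measurement, assuming a Linux box that exposes the CPU package energy counter via the intel-rapl powercap sysfs interface (the path, its read permissions, and the toy dedup pair below are my assumptions, not anything from the OP; the counter also wraps around, which this sketch ignores): read the counter before and after each of two functionally identical implementations, so the algorithmic difference shows up directly in joules.

    import time

    # Cumulative package-domain energy in microjoules; on many kernels this
    # file is root-readable only. Path is an assumption about the machine.
    RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def read_energy_uj():
        with open(RAPL) as f:
            return int(f.read())

    def measure(fn, *args):
        """Return (wall-clock seconds, joules) consumed while fn runs."""
        e0, t0 = read_energy_uj(), time.perf_counter()
        fn(*args)
        t1, e1 = time.perf_counter(), read_energy_uj()
        return t1 - t0, (e1 - e0) / 1e6

    def dedup_quadratic(xs):
        # O(n^2): the membership test rescans the output list every iteration
        out = []
        for x in xs:
            if x not in out:
                out.append(x)
        return out

    def dedup_linear(xs):
        # O(n): hash-set membership test, same result
        seen, out = set(), []
        for x in xs:
            if x not in seen:
                seen.add(x)
                out.append(x)
        return out

    if __name__ == "__main__":
        data = list(range(10_000)) * 2
        for impl in (dedup_quadratic, dedup_linear):
            secs, joules = measure(impl, data)
            print(f"{impl.__name__}: {secs:.3f}s, ~{joules:.2f} J")

Caveat: the package counter covers the whole socket, so run it on an otherwise idle machine, and even then the gap is only a lower bound on real-world waste (no I/O, no GUI redraws in this toy).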