top | item 32473897

huetius | 3 years ago

Not doubting this, but it seems to cut both ways. I can just as easily justify an overreaction by claiming to have averted some worse outcome. It seems to be a general problem of counterfactuals.

anonporridge|3 years ago

This is why it's often good resource management to wait until something breaks before committing resources to fix it. Especially true in software systems.

One might think that constant firefighting is a waste of resources, and that we'd be better off solving problems before they happen. That's true if and only if you know for sure that the problem and eventual breakage is really going to happen AND that it's worth fixing. At least in my experience, it's more often the case that people overestimate the risk of calamity and waste resources fixing things that aren't actually going to break catastrophically. Or they fix things we don't actually need, and we only figure that out when the thing finally breaks and we realize that the cost of fixing or replacing it outweighs whatever value it was providing.

The engineer in me hates saying this, but sometimes things don't have to be beautifully designed and perfectly built to handle the worst. Duct tape and superglue often really is good enough.

Of course, this doesn't apply to problems that are truly existential risks. If the potential systemic breakage is so bad that it irreparably collapses the system, then active preparedness can certainly be justified.

makeitdouble|3 years ago

This is the no-brainer choice for anything that can be immediately replaced/ordered. Most of us aren’t keeping a stash of computer monitors in case of failure.

On firefighting…huge swaths of burned down land can’t be reordered on Amazon and delivered next day. People quip “just replant the trees” but of course that doesn’t rebuild an ecosystem, we might not even replant the right trees, and the things that lived there are now dead.

On a personal scale, waiting for your car to break before fixing it isn't a good strategy either, nor would you wait for your gas pipes to leak, or see whether lightning actually strikes your home before preparing for it.

Basically I feel “don’t fix until it breaks” is a good strategy for day-to-day, small-scale decisions, but problematic for most stuff beyond that.

quickthrower2|3 years ago

This is why I think most of the 'absolutes' that programmers, software architects, managers etc. talk about are not so.

For example, you must never use 'magic numbers' in code. Or you must always obey S.O.L.I.D. or practice 100% TDD. There will be people who believe in these dogmatically, to the point that they won't employ anyone who says otherwise (it becomes an interview question).
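For readers unfamiliar with the 'magic numbers' rule being debated here, a minimal hypothetical sketch in Python (the function names and the 0.1 rate are invented for illustration, not taken from any real code base):

```python
# Hypothetical sketch of the "never use magic numbers" rule.
# Both functions compute the same result; the dogma says the second
# is always better, though in a tiny throwaway script the first may
# be perfectly fine.

def total_with_tax_magic(price: float) -> float:
    # 0.1 is a "magic number": its meaning is not stated anywhere.
    return price * (1 + 0.1)

SALES_TAX_RATE = 0.1  # invented rate, for illustration only

def total_with_tax_named(price: float) -> float:
    # Same computation, but the named constant documents intent.
    return price * (1 + SALES_TAX_RATE)
```

Whether the second form is worth enforcing in every case is exactly the kind of claim that, as argued above, rests on preference rather than evidence.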

I am not arguing that these are wrong!

I am arguing that they are not evidence driven (they cannot be; software is too complex, and it is not a narrow experiment on a lab mouse). So they must be culture/preference/worldview driven.

When there is no evidence-driven approach to 99% of your decisions, software becomes an art. And that is fine.

That said, it might be possible to show evidence that an approach is good for your code base or your team, since that is a more limited scope than "in general".

What isn't fine is the number of overly confident global assertions we hear from software people about how to build software.

asdff|3 years ago

I think it depends on the context, which often depends on what the real risk is. Building 5000 nuclear missiles? Overreaction. Overbuilding flood control systems such that the region has not experienced major flooding in 100 years? Justified preparation. The tell for what is justified and what isn't is through what you can remove from the system and not see any ill effect, like a jenga tower. We've already decommissioned thousands of nukes and the sky didn't fall, so that goes to show all that preparation was useless. Take away flood control systems OTOH and that would probably result in thousands of lives lost before long given the odds of a bad storm in the area. Likewise with pandemic preparations (mentioned in the intro); what are the odds of a pandemic? High, so the preparations are justified.

izabera|3 years ago

>The tell for what is justified and what isn't is through what you can remove from the system and not see any ill effect, like a jenga tower. We've already decommissioned thousands of nukes and the sky didn't fall, so that goes to show all that preparation was useless.

Bad example. It absolutely wasn't useless at the time to build those thousands of nukes. The whole concept of mutual assured destruction breaks down if the other side has 20x more nukes.

tshaddox|3 years ago

I mean, it’s not some unique problem of counterfactuals, right? It doesn’t seem like counterfactuals have some unique epistemological status. You can use reason to propose and criticize counterfactuals the same as any other kinds of explanations.