
mojzu | 3 years ago

I think the argument is less related to hacking, as that would be like making something tamper proof, which is a much higher bar. It's more that if software can cause hardware damage, then software bugs can cause hardware damage and potentially pose a safety risk, which should be treated as a critical hardware bug

For example, the safety-critical systems you mention should absolutely fail safe at the bare minimum; all kinds of things can adversely affect running software, like nearby equipment generating EM noise or someone tripping over the wrong cable


throwawaylinux | 3 years ago

> I think the argument is less related to hacking, as that would be like making something tamper proof which is a much higher bar.

I meant hacking as in messing around; poor choice of words.

> But more like if software can cause hardware damage, that means software bugs can cause hardware damage and potentially pose a safety risk, which should be seen as a critical hardware bug

Hardware bugs can cause hardware damage.

> For example the safety critical systems you mention should absolutely fail-safe at the bare minimum, all kinds of things can adversely affect running software like equipment generating EM noise nearby or someone tripping over the wrong cable

They're just not. The ABS, stability, and collision avoidance systems in your car can't fail safe if the software fails, because the software is required to control the dynamic situation; it can't just stop everything. The same goes for control software in airliners, or industrial control and monitoring systems (although those can have mechanical interlocks in more cases, not all).

And very little can be _absolutely_ fail-safe, not even purely mechanical devices. How do you make a fail-safe bridge?

mojzu | 3 years ago

> Hardware bugs can cause hardware damage.

That's true, although I think the original point was that if software can damage the hardware it's running on, then that should be treated as a fault/bug, but with cost reduction, market pressures, etc. it is often ignored

And fail-safe doesn't necessarily mean everything turns off, because that can be just as dangerous. I take it more as a systems mindset where thought, care, and attention are paid to failure conditions, making sure those outcomes pose the least risk. Again, something that is often skipped for cost or expediency reasons
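In code, that mindset often shows up as an explicit failure-mode table rather than an implicit "crash and stop". A minimal sketch of the idea (all names hypothetical, not from any real controller):

```python
from enum import Enum

class Mode(Enum):
    NORMAL = 1     # full function
    DEGRADED = 2   # reduced function, still controlling
    SHUTDOWN = 3   # last resort, only when we can no longer control safely

def next_mode(sensor_ok: bool, watchdog_ok: bool) -> Mode:
    """Pick the least-risky mode for the current failure condition.

    Note that a sensor fault does NOT immediately mean "turn everything
    off": the controller keeps running in a degraded mode (e.g. on
    last-known-good data or a conservative default) because abruptly
    stopping can itself be the dangerous outcome.
    """
    if sensor_ok and watchdog_ok:
        return Mode.NORMAL
    if watchdog_ok:
        return Mode.DEGRADED
    return Mode.SHUTDOWN
```

The point isn't this particular table, it's that every failure condition was enumerated and assigned a deliberate outcome, instead of being left to whatever the crash behaviour happens to be.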

Perfection is probably an unattainable goal, but I've been around software long enough that I wouldn't want someone's safety to depend solely on one piece of software