top | item 29474828

OldHand2018 | 4 years ago

The author seems to know about submarines, so let's give him (?) the benefit of the doubt... But without knowing the cause of the accident, the author is calling for a specific change. This seems problematic. Perhaps there is a very good reason that the computer system performs like this.

Also, if you have the computer checking everything, then those 5 people that are supposed to be redundantly computing the navigation plan are highly likely to be less diligent. Human nature and all. Isn't that likely to result in a worse outcome?

kbenson|4 years ago

> Also, if you have the computer checking everything, then those 5 people that are supposed to be redundantly computing the navigation plan are highly likely to be less diligent. Human nature and all. Isn't that likely to result in a worse outcome?

I'm not sure I've ever seen a situation in practice where an additional safety check made things worse. The same people who shirk their duties and half-ass their jobs on the assumption that the computer will find the problems generally make a plethora of other mistakes if a computer isn't there to double-check.

Computer verification of work, usually done by applying rules and heuristics, is useful when done well, and roughly analogous to an additional human checker IMO. If policies and expectations are set right, it's a better outcome.

This may or may not hold when the initial calculation is done by computer and then checked by a human. Some of people's competitiveness about doing the job well and not needing fixes from a computer/human checker goes right out the window, and perhaps that does lead to complacency.

soneil|4 years ago

> Also, if you have the computer checking everything, then those 5 people that are supposed to be redundantly computing the navigation plan are highly likely to be less diligent.

This is surprisingly easy to fix. If the computer notices before the human, call that a failure. Say, if the computer spots terrain higher than the current depth within X radius (that wasn't intentionally planned for), that's a failure.
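A rough sketch of the check being described, with everything hypothetical (names, units, and data shapes are illustrative assumptions, not anything from the actual system): flag any charted terrain within radius X that sits shallower than the boat's planned depth, unless the planners explicitly cleared that waypoint.

```python
# Hypothetical sketch of the terrain check described above. All names and
# units are made up for illustration; depths are feet below the surface.
from dataclasses import dataclass
from math import hypot

@dataclass
class Waypoint:
    x: float              # chart easting (illustrative units)
    y: float              # chart northing
    planned_depth: float  # planned keel depth, feet below surface

def terrain_conflicts(route, terrain, radius, cleared=frozenset()):
    """Return (waypoint_index, terrain_point) pairs where charted terrain
    within `radius` of a waypoint rises above the planned depth there.

    `terrain` is a list of (x, y, floor_depth) tuples; `cleared` holds
    waypoint indices the planners intentionally accepted.
    """
    conflicts = []
    for i, wp in enumerate(route):
        if i in cleared:
            continue  # intentionally planned for; not a failure
        for (tx, ty, floor_depth) in terrain:
            near = hypot(tx - wp.x, ty - wp.y) <= radius
            # Seafloor shallower than the boat's planned depth = collision risk.
            if near and floor_depth < wp.planned_depth:
                conflicts.append((i, (tx, ty, floor_depth)))
    return conflicts
```

Under soneil's rule, any non-empty result that the humans didn't catch first counts as a failure for the planning team, which keeps the incentive on the manual check rather than on the alarm.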

I assume the military already has a regime in place to handle "you dun goofed". You can string failsafes between the goof and the wall.

toast0|4 years ago

> Those same people that shirk their duties and half-ass their job under the assumption the computer will just find the problems generally make a plethora of other mistakes if a computer isn't there to double check.

If they half-ass, but follow the computer fixes, maybe nobody knows they were half-assing. If they half-ass and other people fix it, their half-assing is known and remediations are available.

zentiggr|4 years ago

There is a specific reason for the system performing like that: a development process that lets somebody in a far-off office choose software components for financial/political/office-politics reasons, then wires together separate programs for each task, developed separately with barely any integration testing until it's too damn late to fix anything. Add a whole host of other "our team is going to do this part using X" bullshit, and the overall system winds up looking like somebody used Legos in one part, Lincoln Logs in another, and cast pottery clay elsewhere, with three different interconnection schemes because each level of bureaucracy involved mandated a different buzzword when it got to review things two to five years after the last level saw it.

At least, that's what it looked like when I saw it in '98. It doesn't sound like it's gotten any better.

post_from_work|4 years ago

> At least, that's what it looked like when I saw it in '98. It doesn't sound like it's gotten any better.

It's 10x worse now. Now there are half-a-dozen different browser-based apps that all do the same thing, and that's BEFORE you mention the AI data fusion initiatives people want to integrate.

riskneutral|4 years ago

Everything about this is hilarious and terrifying. It's hilarious and terrifying that they couldn't drive a $3 billion nuclear submarine without crashing it. That the sub's navigation software is so bad there is a website about it describing how it takes "minutes" to zoom in and out of maps. That if the sub were to be equipped with smarter and smarter autopilot software, the human operators would eventually forget how to pilot the sub themselves the way that commercial airline pilots keep forgetting how to fly.

GekkePrutser|4 years ago

> That if the sub were to be equipped with smarter and smarter autopilot software, the human operators would eventually forget how to pilot the sub themselves the way that commercial airline pilots keep forgetting how to fly.

Well the lack of situational awareness with EFIS (glass cockpit) systems is a known issue. Pilots tend to 'switch off' because of the low workload and then when something goes wrong they're not aware of the situation because they haven't been following along.

This has contributed to some incidents, such as the AF 447 crash, where the pilots were basically unaware of the actual situation of the plane and flew it straight into the ground (well, sea).

I can understand the Navy wants its crews to be more involved for that reason (and perhaps also because the enemy might deliberately instill confusion by messing with navaids etc), but I think there should be at least a warning if you try to do something that's known to be stupid.

jwithington|4 years ago

I think you're identifying one of the strongest arguments against my claims. Automated, or computer-assisted, reviews will only increase the error rate because humans will assume that the computer took care of it all.

You're probably on to something. When radar was first rolled out to all Navy ships to assist navigation (post WWII), accident rates actually increased.

My hunch is that Sailors drove ships riskier thinking that radar would save them. A bit like the findings of seat-belt safety laws: no impact on fatalities.

But I'm not agitating for full-blown computer reviews. It just feels like the software should have the computing capabilities of Excel lol

Jtsummers|4 years ago

It's a tradeoff. Trusting computers too much gets you into trouble (loss of navigation skills, over reliance on a potentially faulty system), but having to do things manually and depending on discipline also doesn't scale. You need to maintain enough discipline (validate the computer's results) but still have a better source than "Well, myself and three others looked at this chart for an hour and couldn't find anything above 350FT".

Discipline works until time pressure causes discipline to relax, and then the loosened discipline becomes the norm (normalization of deviance). There is no reliable way to restore discipline (in a timely fashion) after that happens, and then a collision would become inevitable. If you only rely on discipline, you're bound to fail. If you have means of relieving the reliance on discipline and don't use them, you're making a tragic mistake.

OldHand2018|4 years ago

Fair enough. Perhaps something like doing checks after the plans have been manually computed, and then errors/warnings are flagged and used for evaluations and training. But then again, how is the culture of the Navy? Would such data be used exclusively to punish people?