nrfulton | 2 years ago

For hackers, the money quote, IMO, is the one about competing technical cultures within software[^1]:

> Koopman, who has a long career working on AV safety, faulted the data-driven culture of machine learning that leads tech companies to contemplate hazards only after they’ve encountered them, rather than before.

I haven't yet figured out how to effectively and efficiently communicate this mindset shift to folks educated primarily in ML culture. I am not sure I ever will. The closest I've come to an elevator pitch of the mindset shift is something like: "when human lives are on the line and you're taking an absolutely massive number of samples IRL, doesn't it make sense to stop thinking in terms of analysis/probability and start thinking in terms of topology/nondeterminism? I.e., when you sample A LOT, unlikely shit happens. Manageable risks if you're selling ads, I suppose. But not so acceptable if you're deciding whether a giant piece of unforgiving steel saw a bollard or your daughter."

[^1]: I read the quote-block in your post and immediately thought "Koopman". Then I read the article and, sure enough, they're quoting Koopman. What a tragedy. The message was not only out there, but out there so loudly that the vague shape of the warning has a particular person's name attached to it in my mind. Yet, here we are.
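The "when you sample A LOT, unlikely shit happens" point can be made concrete with a back-of-the-envelope calculation. This is a minimal sketch, not from the thread; all the rates and counts below are made-up illustrative numbers:

```python
import math

def p_any_failure(p, n):
    """P(at least one failure) in n independent trials: 1 - (1 - p)^n.

    Uses log1p/expm1 so tiny per-trial probabilities don't vanish
    in floating point.
    """
    return -math.expm1(n * math.log1p(-p))

# Assumed "one in a billion" per-decision failure rate.
p = 1e-9

# At a million decisions, a failure is still a long shot...
print(f"{p_any_failure(p, 10**6):.6f}")   # ~0.001

# ...but at fleet scale (hundred billion decisions), it's a near-certainty.
print(f"{p_any_failure(p, 10**11):.6f}")  # ~1.0
```

The qualitative takeaway matches the elevator pitch: at fleet-scale sample counts, worst-case (topological/nondeterministic) reasoning about what *can* happen matters more than average-case probability.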


steveBK123|2 years ago

Autonomous car devs seem to lack the mindset of "things that never happened before happen all the time".

No matter how many miles your car drives and how much data it collects, it will encounter a novel situation on the road. Unless it has higher levels of context / overriding safeguards / etc., an ML approach driven by data alone is going to fail dangerously.

One favorite example is the year-old video of Tesla FSD attempting an unprotected left turn through an oncoming trolley car while the center display rendered the trolley in motion in 3D. Clearly there is no overriding safety guardrail model above the path-finding model: if the car can render the object in 3D, it is aware the object exists.
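The kind of "overriding safety guardrail" described above can be sketched as a veto layer that checks the planner's output against tracked objects. This is a hypothetical illustration; the names, types, and geometry here are invented for the example and are not any vendor's actual architecture:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """An object the perception stack has rendered/tracked (illustrative)."""
    x: float
    y: float
    radius: float  # conservative bounding radius, metres

def trajectory_is_safe(waypoints, objects, clearance=1.0):
    """Veto check: no planned waypoint may come within `clearance`
    metres of any tracked object's bounding circle."""
    for wx, wy in waypoints:
        for obj in objects:
            dist = ((wx - obj.x) ** 2 + (wy - obj.y) ** 2) ** 0.5
            if dist < obj.radius + clearance:
                return False
    return True

# A planned unprotected left turn whose path crosses the trolley's
# tracked position should be vetoed, regardless of what the
# path-finding model proposed.
trolley = TrackedObject(x=5.0, y=5.0, radius=2.0)
unprotected_left = [(0.0, 0.0), (2.5, 2.5), (5.0, 5.0)]
print(trajectory_is_safe(unprotected_left, [trolley]))  # False: vetoed
```

The point of the sketch is architectural: the guardrail consumes the same object list the display renders, so "the car drew it on screen but drove into it anyway" becomes impossible by construction.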

And so we go on being perpetually "five years away" from self driving.

tmpz22|2 years ago

Is it safe to say ML people make a boatload of money that strongly incentivizes them to look the other way the second they get a related role?

SheinhardtWigCo|2 years ago

If this was the problem, wouldn't the entire industry be having these issues?

nrfulton|2 years ago

No. The "effectively and efficiently" is important. It's not like you can't get the point across. It's just hard to do without lots of communication effort.

(In particular: self-driving car companies are already highly motivated to not kill children.)