item 44354972


trust_bt_verify | 8 months ago

> they just stop if they ever encounter a situation where they don’t know what to do.

Oh my god, that's terrifying if true. I can think of many driving situations where slamming on the brakes is the absolute wrong choice. Tesla is pushing this out way before it's safe enough to operate in public.


averageRoyalty | 8 months ago

Not justifying it, but there's a reason the person behind is almost always held responsible in accidents: you are responsible for maintaining a safe following distance in case the person in front of you (for whatever reason) stops or acts erratically.

jfoster | 8 months ago

What would be a better option if the system doesn't understand the environment? That can happen in any system, right? There are really only two options:

1. Keep driving in some way

2. Stop

Which one do you think Waymo (or any other system) does when it doesn't understand a situation?