top | item 44288249


angusb | 8 months ago

This has done the rounds on other platforms. A couple of important points:

- the failure here is that the car didn't stop for the bus on the other side of the road with the extended stop sign. (Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)

- the FSD version for the robotaxi service is private and wasn't the one used for this test. The testers here only have access to the older public version, which is supposed to be used with human supervision

- The Dawn Project is a long-time Tesla FSD opponent that acts in bad faith - they are probably relying on a false equivalence between FSD beta and robotaxi FSD

Nevertheless this is a very important test result for FSD Supervised! But I don't like that The Dawn Project is framing this as evidence for why FSD robotaxi (a different version) should not be allowed, without noting that they have tested a different version.


fabian2k|8 months ago

I don't see why Tesla would deserve the benefit of the doubt here. We cannot know how well the actual Taxi software will work, I think it is fair to extrapolate from the parts we can observe.

angusb|8 months ago

re. extrapolation: I agree with that, but remember there's sampling error. The crashes and failures go viral, while the lives saved get zero exposure or headlines. I don't think that means you can just ignore issues like this, but I think it does mean it's sensible to augment the data point of this video by imagining the scenarios where the self-driving car performs more safely than the average human driver.

locococo|8 months ago

I think their test is valid and not in bad faith, because they demonstrate that Tesla's self-driving technology has no basic safety standards.

Your argument that a newer version is better simply because it's newer does not convince me. The new version could still have that same issue.

angusb|8 months ago

> Your argument that a newer version is better

I actually didn't say that and am not arguing it formally - I said what I said because I think that the version difference is something that should be acknowledged when doing a test like this.

I do privately assume the new version will be better in some ways, but have no idea if this problem would be solved in it - so I agree with your last sentence.

interloxia|8 months ago

It is also a failure that it continues to cruise and drives over the thing/child that was hit.

angusb|8 months ago

that should have been in my list, you're right

chneu|8 months ago

Every time Tesla's FSD is shown to be lacking, someone always says "well, that's not the real version, and these people are biased!"

ryandrake|8 months ago

Don't forget the standard "And the next version surely will be much better!"

mxschumacher|8 months ago

it's always the next version that will go from catastrophic failure to being perfect. This card has been played 100 times over the last few years.

jantissler|8 months ago

Exactly what I wanted to add. Every single time there is hard evidence of what a failure FSD is, someone points out that they didn't use the latest beta. And of course they provide zero evidence that this newer version actually addresses the problem. Anyone who knows anything about software and updates understands how new versions can actually introduce new problems and new bugs …

bestouff|8 months ago

> Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid

That's why you slow down when you pass a bus (or a huge american SUV).

sokoloff|8 months ago

In this case, you stop for the bus that is displaying the flashing red lights.

Every driver/car that obeys the law has absolutely no problem avoiding this collision with the child, which is why the bus has flashing lights and a stop sign.

Sharlin|8 months ago

Exactly. If your view is obstructed by something, anything, you slow down.

ndsipa_pomu|8 months ago

> Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid

Not really - it's a case of slowing down and anticipating a potential hazard. It's a fairly common situation with any kind of bus, or similarly if you're overtaking a stationary high-sided vehicle as pedestrians may be looking to cross the road (in non-jay-walking jurisdictions).

burnt-resistor|8 months ago

Yes. And given the latency of cameras (or even humans), the inability to see around objects, and the fact that dogs and kids can move quickly from hidden areas into the path, driving really slowly next to large obstacles until you can see behind them becomes more important.

One of the prime directives of driving, for humans and FSD systems alike, must be "never drive faster than your brakes can stop you within the visible area". This must account for scenarios such as an obstacle stopped around, or possibly coming the wrong way around, a blind mountain turn.
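That principle reduces to simple kinematics: reaction distance plus braking distance must fit within the sight distance. A minimal sketch (the reaction time and deceleration values here are illustrative assumptions, not measured figures for any vehicle):

```python
import math

def max_safe_speed(sight_distance_m, reaction_time_s=1.5, decel_mps2=7.0):
    """Largest speed (m/s) at which reaction distance plus braking
    distance still fits within the visible sight distance.

    Solves v*t + v^2 / (2*a) = d for v, taking the positive root:
        v = a * (-t + sqrt(t^2 + 2*d/a))
    """
    t, a, d = reaction_time_s, decel_mps2, sight_distance_m
    return a * (-t + math.sqrt(t * t + 2.0 * d / a))

# With only ~15 m of clear view past a stopped bus, the "never
# outdrive your sight line" rule caps speed at roughly 7-8 m/s
# (under the assumed 1.5 s reaction time and 7 m/s^2 braking).
v = max_safe_speed(15.0)
print(f"{v:.1f} m/s ~= {v * 3.6:.0f} km/h")
```

The point the comment is making falls out of the numbers: under these assumptions, even 15 m of obstructed view means passing the bus at anything much above ~25 km/h leaves no room to stop for a child stepping out.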

Zigurd|8 months ago

The simulation of a kid running out from behind the bus is both realistic, and it points out another aspect of the problem with FSD. It didn't just pass the bus illegally. It was going far too fast while passing the bus.

As for being unrepresentative of the next release of FSD: we've had, what, eight or ten years of "it's going to work on the next release."

arccy|8 months ago

sounds like a Tesla problem for naming their crappy tech "Full Self-Driving"

enragedcacti|8 months ago

Some important counter-points:

- FSD has been failing this test publicly for almost three years, including in a Super Bowl commercial. It strains credulity to imagine that they have a robust solution that they haven't bothered to use to shut up their loudest critic.

- The Robotaxi version of FSD is reportedly optimized for a small area of Austin, and is going to extensively use tele-operators as safety drivers. There is no evidence that Robotaxi FSD isn't "supposed" to be used with human supervision; its supervision will just be subject to latency and limited spatial awareness.

- The Dawn Project's position is that FSD should be completely banned because Tesla is negligent with regard to safety. Having a test coincide with the Robotaxi launch is good for publicity, but the distinction isn't really relevant, because the fundamental flaw is with the company's approach to safety regardless of FSD version.

- Tesla doesn't have an inalienable right to test 2-ton autonomous machines on public roads. If they wanted to demonstrate the safety of the robotaxi version they could publish the reams of tests they've surely conducted and begin reporting industry standard metrics like miles per critical disengagement.

BobaFloutist|8 months ago

>(Obviously a kid running out from behind a car this late is fairly difficult for any human or self driving system to avoid)

Shouldn't that be the one case where self driving system has an enormous natural advantage? It has faster reflexes, and it doesn't require much, if any, interpretation or understanding of signs or predictive behavior of other drivers. At the very worst, the car should be able to detect a big object in the road and try to brake and avoid the object. If the car can't take minimal steps to avoid crashing into any given thing that's in front of it on the road, what are we even doing here?

ndsipa_pomu|8 months ago

I agree - it's why I think some type of radar/lidar is necessary for autonomous vehicles to be sufficiently safe. I get the theory that humans can process enough info from two eyes to detect objects, so multiple cameras should be able to do the same, but it looks like it's a tough problem to solve.

WillAdams|8 months ago

Does the Tesla taxi option afford radar/lidar?

My understanding is that Tesla is the only manufacturer trying to make self-driving work with just visual-spectrum cameras --- all other vendors use radar/lidar _and_ visual-spectrum cameras.

ndsipa_pomu|8 months ago

They've painted themselves into a corner, really: if they start using anything other than just cameras, they'll be on the hook for having sold previous cars as FSD-capable, and would presumably have to retrofit radar/lidar for anyone who was mis-sold a Tesla.

Zigurd|8 months ago

...and radar and mics and Google's geospatial data and 15 years of experience.

rsynnott|8 months ago

... Wait, so you think that Tesla have a child-killing and a non-child-killing version, but are only providing the child-killing version to consumers?

... eh? I mean, what?

handsclean|8 months ago

I’ve seen this “next version” trick enough times and in enough contexts to know when to call BS. There’s always a next version, it’s rarely night-and-day better, and when it is better you’ll have evidence, not just a salesman’s word. People deserve credit/blame today for reality today, and if reality changes tomorrow, tomorrow is when they’ll deserve credit/blame for that change. Anybody who tries to get you to judge them today for what you can’t see today is really just trying to make you dismiss what you see in favor of what you’re told.

aeurielesn|8 months ago

Sorry, I am failing to see how any of these three points are relevant or even important.