angusb|8 months ago
- The failure here is that the car didn't stop for the bus on the other side of the road with its stop sign extended. (Obviously, a kid running out from behind a car this late is fairly difficult for any human or self-driving system to avoid.)
- The FSD version for the robotaxi service is private and wasn't the one used for this test. The testers only have access to the older public version, which is supposed to be used with human supervision.
- The Dawn Project is a long-time Tesla FSD opponent that acts in bad faith; they are probably relying on a false equivalence between FSD Beta and robotaxi FSD.
Nevertheless, this is a very important test result for FSD Supervised! But I don't like that The Dawn Project is framing this as evidence for why FSD robotaxi (a different version) should not be allowed, without noting that they tested a different version.
locococo|8 months ago
Your argument that a newer version is better simply because it's newer does not convince me. The new version could still have that same issue.
angusb|8 months ago
I actually didn't say that, and I'm not formally arguing it; I said what I said because I think the version difference is something that should be acknowledged when doing a test like this.
I do privately assume the new version will be better in some ways, but have no idea if this problem would be solved in it - so I agree with your last sentence.
bestouff|8 months ago
That's why you slow down when you pass a bus (or a huge american SUV).
sokoloff|8 months ago
Every driver/car that obeys the law has absolutely no problem avoiding this collision with the child, which is why the bus has flashing lights and a stop sign.
ndsipa_pomu|8 months ago
Not really - it's a case of slowing down and anticipating a potential hazard. It's a fairly common situation with any kind of bus, or similarly if you're overtaking a stationary high-sided vehicle as pedestrians may be looking to cross the road (in non-jay-walking jurisdictions).
burnt-resistor|8 months ago
One of the prime directives of driving, for humans and FSD systems alike, must be: never drive faster than your brakes can stop you within the visible distance. This must account for scenarios such as an obstacle stopped in the road, or traffic possibly coming the wrong way around a blind mountain turn.
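The rule in the comment above reduces to a small kinematics check: reaction distance plus braking distance must fit inside the sight distance. Here's a minimal sketch; the 7 m/s² deceleration and 1.5 s reaction time are assumed illustrative values, not figures from the thread:

```python
import math

def max_safe_speed(sight_m: float, decel: float = 7.0, reaction_s: float = 1.5) -> float:
    """Largest speed v (m/s) such that v*reaction_s + v**2/(2*decel) <= sight_m.

    Assumed numbers: ~7 m/s^2 braking deceleration (dry pavement) and a
    1.5 s driver reaction time; both are illustrative.
    """
    # Solve v**2 + 2*decel*reaction_s*v - 2*decel*sight_m = 0 for v > 0.
    return decel * (-reaction_s + math.sqrt(reaction_s**2 + 2 * sight_m / decel))

if __name__ == "__main__":
    for sight in (25.0, 50.0, 100.0):
        v = max_safe_speed(sight)
        print(f"sight {sight:5.1f} m -> max speed {v * 3.6:5.1f} km/h")
```

The point of the exercise: the safe speed grows much more slowly than the sight distance, which is why a parked bus or SUV that cuts visibility should force a large speed reduction.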
Zigurd|8 months ago
As for being unrepresentative of the next release of FSD: we've had, what, eight or ten years of "it's going to work on the next release."
enragedcacti|8 months ago
- FSD has been failing this test publicly for almost three years, including in a Super Bowl commercial. It strains credulity to imagine that they have a robust solution that they haven't bothered to use to shut up their loudest critic.
- The Robotaxi version of FSD is reportedly optimized for a small area of Austin, and is going to make extensive use of tele-operators as safety drivers. There is no evidence that Robotaxi FSD isn't "supposed" to be used with human supervision; its supervision will just be subject to latency and limited spatial awareness.
- The Dawn Project's position is that FSD should be completely banned because Tesla is negligent with regard to safety. Having a test coincide with the Robotaxi launch is good for publicity, but the distinction isn't really relevant, because the fundamental flaw is in the company's approach to safety, regardless of FSD version.
- Tesla doesn't have an inalienable right to test 2-ton autonomous machines on public roads. If they wanted to demonstrate the safety of the robotaxi version, they could publish the reams of tests they've surely conducted and begin reporting industry-standard metrics like miles per critical disengagement.
BobaFloutist|8 months ago
Shouldn't that be the one case where a self-driving system has an enormous natural advantage? It has faster reflexes, and it doesn't require much, if any, interpretation or understanding of signs or prediction of other drivers' behavior. At the very worst, the car should be able to detect a big object in the road and try to brake and avoid it. If the car can't take minimal steps to avoid crashing into any given thing that's in front of it on the road, what are we even doing here?
WillAdams|8 months ago
My understanding is that Tesla is the only manufacturer trying to make self-driving work with visual-spectrum cameras alone --- all other vendors use radar/lidar _and_ visual-spectrum cameras.
rsynnott|8 months ago
... eh? I mean, what?