The most damning thing is that the most advanced version, with the most modern hardware, with perfectly maintained vehicles, running in a pre-trained geofence pre-selected to work well [1], with trained, professional safety drivers, and with scrutinized data and reporting, averages at best 40,000 miles per collision (assuming the mileage numbers were not puffery [3]).
Yet they claim that old versions, using old hardware, on arbitrary roads, with untrained customers as safety drivers, somehow average 2.9 million miles per collision in non-highway environments [2], a ~72.5x difference in collision frequency, and 5.1 million miles per collision in all environments, a ~127.5x(!) difference in collision frequency, when their reporting and data are not scrutinized.
I guess their most advanced software and hardware and professional safety drivers just make it ~127.5x more dangerous.

[1] https://techcrunch.com/2025/05/20/musk-says-teslas-self-driv...
[2] https://www.tesla.com/fsd/safety
[3] https://www.forbes.com/sites/alanohnsman/2025/08/20/elon-mus...
[3.a] Tesla's own attorneys have argued that statements by Tesla executives are such nonsense that no reasonable person would believe them.
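Sanity-checking those ratios with plain arithmetic on the figures above:

    robotaxi = 40_000       # miles per collision, supervised robotaxi (upper bound)
    fsd_city = 2_900_000    # claimed miles per collision, non-highway FSD
    fsd_all  = 5_100_000    # claimed miles per collision, all environments

    print(fsd_city / robotaxi)   # 72.5
    print(fsd_all / robotaxi)    # 127.5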
That's not really like for like. You are comparing "Level 4", where the car is supposed to do everything, to driver assist, where the driver is supposed to take over if things go off track.
I'm not sure what the guys in the taxis with their hands on the arm rest do. I guess they have a button that either stops the car or connects it to a remote control operator?
It's because of selection bias. In the older vehicles, customers won't turn on Autopilot if they think it won't handle the situation. So they turn it on on highways and easier routes.
There's a lot of editorializing going on. Now that the title has been restored, hopefully things calm down a bit.
Ultimately, Tesla has two problems going on here:
1. Their crash rate is 2x that of Waymo.
2. They redact a lot of key information, which complicates safety assessments of their fleet.
The redactions actually hurt Tesla, because the nature of each road incident really matters: EVERY traffic incident must be reported, regardless of fault (even if it's a speeding car from the other direction that hits another vehicle which then hits the robotaxi - yes, that's actually in one of the Waymo NHTSA incident reports). When Tesla redacts the way they've been doing, it makes it very difficult to do studies like https://www.tandfonline.com/doi/full/10.1080/15389588.2025.2... which show how much safer Waymo vehicles are compared to humans WHEN IT COMES TO ACTUAL DAMAGE DONE.
We can't get that quality of info from Tesla due to their redaction practices. All we can reliably glean is that Tesla vehicles are involved in 2x the incidents per mile compared to Waymo. https://ilovetesla.com/teslas-robotaxi-dilemma-navigating-cr...
The redactions also suggest they are hiding something and have deprioritized safety below whatever they are hiding. That is, it makes Tesla untrustworthy with something that risks life and limb.
If you only count robotaxis and not all Tesla, isn't the crash rate 20X per driven mile? I remember doing the math a few months ago and finding 20, but I might be mistaken.
When you have a CEO like Elon who swears up and down that you only need cameras for autonomous vehicles, skimping on crucial sensors like lidar, can anyone be surprised by this result? Tesla also likes to take the motto of "move fast and break things" to a fault.
I just find it distracting to pretend we know exactly what albatross to hang around the neck of the problem here. While I do tend to think lidar is probably useful, I also don't think this is an open-and-shut case where lidar is absolutely essential and makes all the difference. Assertions like that claim more certainty than I think can be granted, and they harm the overall point: that Tesla doesn't seem to have serious proof that their systems are getting better, or that they are trustworthy.
The data just isn't there for us outsiders to make any kind of case, and that data is the crucial baseline Tesla is skimping on.
I'm not surprised, more because there was info on Reddit that Tesla FSD was having disengagements every 200 miles or so in urban environments. Camera-only could probably work in the future, but seemingly not yet.
Musk's success story is taking very bold bets almost flippantly. These bets carry a premium, because to most people they are so toxic that they would never consider them.
Every time when he has the choice to do something conservative or bold, he goes for the latter, and so long as he has a bit of luck, that is very much a winning strategy. To most people, I guess the stress of always betting everything on red would be unbearable. I mean, the guy got a $300m cash payout in 1999! Hands up who would keep working 100 hour weeks for 26 years after that.
I'm not saying it is either bad or good. He clearly did well out of it for himself financially. But I guess the whole cameras/lidar thing is similar. Because it's big, bold, from the outset unlikely to work, and it's a massive "fake it till you make it" thing.
But if he can crack it, again I guess he hits the jackpot. Never mind cars; they are expensive enough that lidar cost is a rounding error. But if he can then stick 3D vision into any old cheap cameras, surely that is worth a lot. In fact, wasn't this part of Tesla's great vision, to diversify away from cars and into robots etc.? I'm sure the military would order thousands upon thousands of cheapo cameras that work 90% as well as a fancy lidar, while being fully solid state etc.
That he is using his clients as lab rats for it is yet another reason why I'm not buying one. But to me this is totally in character for Musk.
The more I've looked into the topic, the less I think the removal of lidar was a cost issue. There are a lot of benefits to simplifying your sensor stack, and while I won't pretend to know the best solution, removing things like lidar and ultrasonic sensors seems to have been a decision about improving performance. By doubling down on cameras, your technical team can stay focused on one sensor technology, and you don't have to deal with data priority and trust across a variety of sensors the same way.
The only real test will be who creates the best product, and while Waymo seems to have the lead, it's arguably too soon to tell.
This is the sort of thing that occurs when the interests of the public become subordinate to the interests of a lawless aristocracy. Financial, social and public safety considerations are costs that can be transferred to the public to preserve the wealth of a few individuals.
I’m still waiting until I see little X Æ A-Xii playing in the street while Tesla Robotaxis deliver passengers before I buy these arguments. Until then, my children are playing in the street while these autonomous vehicles threaten their safety. I’m upset that this is forced upon the public by the government.
The title makes it sound way worse than the 7 reported crashes listed in the article. I’d be interested to see a comparison with Waymo and other self-driving technologies in the same area (assuming they exist).
(The one thing I would like to see done differently here is including an error interval.)
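On that note, with only 7 events any rate estimate is fuzzy. A minimal sketch of an exact Poisson 95% interval, assuming the ~280,000 fleet miles implied by the article's figures (an assumption, not a reported total):

    from scipy.stats import chi2

    k = 7               # reported crashes
    miles = 280_000     # assumed: 7 crashes x ~40,000 miles per crash

    # Exact (Garwood) 95% confidence interval for a Poisson count
    lo = 0.5 * chi2.ppf(0.025, 2 * k)        # ~2.8 events
    hi = 0.5 * chi2.ppf(0.975, 2 * k + 2)    # ~14.4 events

    print(miles / hi, miles / lo)  # roughly 19,000 to 100,000 miles per crash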
I couldn't find Waymo's stats for all crashes in 12 seconds of googling, but they have breakdowns for "crashes with serious injury" (none yet), "crashes resulting in injury" (5x reduction), and "crashes where airbag deployed" (14x reduction), relative to humans in Austin.
Austin has relatively low miles, so the confidence intervals are wider, but not too far from what they show for other cities.
Please use the original title, unless it is misleading or linkbait; don't editorialize.

https://news.ycombinator.com/newsguidelines.html

> and other self driving technologies

I mean, this isn't self-driving. It has a safety driver.
How does it make sense that Tesla is allowed to redact information about a road traffic collision on public roads? If the information is proprietary, then keep the vehicle away from public roads until they stop crashing or until the info can be released in the event of a collision.
Search Google News for just this week and you will see that Waymo blocked a parade for 45 minutes, drove into a police crime scene in LA, and two crashed into each other while blocking a third. Stuff happens, but obviously Fred rarely reports the Waymo incidents, while he is quick to write about every Tesla one.

https://www.yahoo.com/news/articles/two-self-driving-waymos-...
I spent a little bit of time poking at Gemini to see what it thought the accident rate in an urban area like Austin would be, including unreported minor cases. It estimated 2-3/100k miles. This is still lower than the extrapolation in the article, but maybe not notably lower.
We need far higher quality data than this to reach meaningful conclusions. Implying conclusions based upon this extrapolation is irresponsible.
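Converting that estimate into the article's miles-per-crash framing (straight arithmetic on the numbers above):

    low, high = 2, 3            # estimated crashes per 100k urban miles
    print(100_000 / high)       # ~33,333 miles per crash
    print(100_000 / low)        # 50,000 miles per crash
    # The article's 1-per-40,000-miles extrapolation falls inside this range.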
I can imagine why they redact the reports so much: Elon-hating NGOs would gladly pay a lawyer to sue Tesla over each crash, even if completely frivolously and with no hope of recouping any of the time and money spent, and think they were doing the great work of social justice.
Electrek notoriously lies and fibs and stretches the truth to hate on Tesla and Elon as much as possible when it serves their own best interests.
In the past it took a lot less to get the situation fixed… and these were horrendous situations! [1][2] And yet Tesla is a factor of 10 worse!

[1] https://en.wikipedia.org/wiki/Ford_Pinto
[2] https://en.wikipedia.org/wiki/Firestone_and_Ford_tire_contro...
This one is misleading both because 8 "crashes" is too small a sample to draw conclusions about its safety compared to humans, and because these 'crashes' are not all actual crashes but a variety of things, including hitting a wild animal of unknown size and potentially minor contact with other objects of unspecified impact strength.
They make other unsubstantiated and likely just wrong claims:
> The most critical detail that gets lost in the noise is that these crashes are happening with a human safety supervisor in the driver’s seat (for highway trips) or passenger seat, with a finger on a kill switch.
The robotaxi supervisors are overwhelmingly in the passenger seat only - I've never actually seen any video footage of them in the driver's seat, and Electrek assuredly has zero evidence of how many of the reported incidents involved someone in the driver's seat. Additionally, these supervisors in the passenger seat are not instructed to prevent every single incident (they aren't going to emergency-brake for a squirrel), and to characterize them as "babysitting to prevent accidents" is just wrong.
This article is full of other glaring problems and lies and mistruths but it's genuinely not worth the effort to write 5 pages on it.
If you want some insight on why Fred Lambert might be doing this, look no further than the bottom of the page: Fred gives (sells?) "investment tips" which, you guessed it, are perpetually trying to convince people to sell and short Tesla: https://x.com/FredLambert/status/1831731982868369419
Feel free to look at his other posts: it's 95% trying to convince people that Tesla is going bankrupt tomorrow, and trying to slam Elon as much as possible - sometimes for good reasons (transphobia) but sometimes in ways that really harm his credibility, if he had any to begin with.
Lambert has also been accused of astroturfing in lawsuits, and had to go through a settlement that required him to retract all the libel he had spread: https://www.thedrive.com/tech/21838/the-truth-behind-electre...
That same source also touches on Fred and Seth's long history of swinging to either side of the bandwagon in attempts to maximize personal gain off bullshit reporting, and their reputation as basically a massive joke in automotive reporting.
The owner of Electrek, Seth Weintraub, also notably does the same thing: https://x.com/llsethj/status/1217198837212884993
So Seth Weintraub sold $TSLA at $35 a share in Jan 2020. Today $467.
Then Seth missed out on gains of 1,200%. And Fred, selling in Sep 2024, missed out on 100% gains.
> With 7 reported crashes at the time, Tesla’s Robotaxi was crashing roughly once every 40,000 miles [...]. For comparison, the average human driver in the US crashes about once every 500,000 miles.

> This means Tesla’s “autonomous” vehicle, which is supposed to be the future of safety, is crashing 10x more often than a human driver.
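A quick check on the quoted figures (the fleet mileage total is implied, not reported):

    crashes = 7
    robotaxi_miles_per_crash = 40_000
    human_miles_per_crash = 500_000

    print(crashes * robotaxi_miles_per_crash)                 # ~280,000 implied fleet miles
    print(human_miles_per_crash / robotaxi_miles_per_crash)   # 12.5, rounded down to "10x" in the article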
That is a possible explanation for why Musk believes in people having 10x as many children. /s
Electrek were huge Tesla boosters until about 2019 or so, when Tesla voided the two Tesla Roadsters that Electrek staff had earned through the referral program.
Most minor fender benders are not reported by the people involved, whereas even the most minor ones, often caused by other humans, must be assiduously reported by any company doing such a rollout.
A responsible journalist with half a clue would mention that, and tell us how that distorts the numbers. If we correct for this distortion, it’s clear that the truth would come out in Tesla’s favor here.
Instead the writer embraces the distortion, trying to make Tesla look bad, and one is left to wonder if they are intentionally pushing a biased narrative.
Every 40,000 miles is every 2nd year for the average American. Every 500,000 miles is once in a lifetime for the average American.
Using your own personal experience, it should be obvious that trivial fender benders are more common than once per lifetime but significantly less common than one every couple of years.
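For scale, assuming the commonly cited US average of roughly 13,500 miles driven per year (an assumption, not a number from the thread):

    miles_per_year = 13_500              # assumed average annual mileage
    print(40_000 / miles_per_year)       # ~3 years between crashes at the robotaxi rate
    print(500_000 / miles_per_year)      # ~37 years, roughly a driving lifetime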
FTA: "For comparison, the average human driver in the US crashes about once every 500,000 miles."
Does anyone know what the cite for this might be? I'm coming up empty. To my knowledge, no one (except maybe insurance companies) tallies numbers for fender-bender-style accidents. This seems like a weirdly high number to me; it's very rare to find any vehicle that reaches 100k miles without at least one bump or two requiring repair.
My suspicion is that this is a count of accidents involving emergency services or law enforcement? In which case it's a pretty terrible apples/oranges comparison.
This NHTSA report agrees with those numbers[1]. It reports 6,138,359 crashes and 3,246,817,000,000 Vehicle Miles Traveled in the US for 2023, which comes to about 530k miles per crash. The data comes from FARS, which only reports fatalities, and CRSS, which only includes crashes reported to the police[2]. It also only includes crashes on roadways (or from cars driving off roadways), not parking lots and other private property.

[1] https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/...
[2] https://www.nhtsa.gov/crash-data-systems/crash-report-sampli...
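The division, for reference:

    crashes = 6_138_359
    vmt = 3_246_817_000_000      # US vehicle miles traveled, 2023
    print(vmt / crashes)         # ~528,939, i.e. about 530k miles per police-reported crash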
Yeah as much as I think that Tesla is full of shit, there’s no way this is true. I don’t know a single person that’s driven 500k miles lifetime but everyone I know has been in at least one minor accident.
> This seems like a weirdly high number to me, it's very rare to find any vehicle that reaches 100k miles without at least one bump or two requiring repair.
It does seem like a high number to me - in 30 years of pretty heavy driving I've probably done about 500k miles and I've definitely had more than one incident. But not THAT many more than one, and I've put 100k miles on a few vehicles with zero incidents. Most of my incidents were when I was a newer driver who drove fairly recklessly.
Somewhat amusingly, the human rate should also be filtered based upon conditions. For years people have criticized Tesla for not adjusting for conditions with their AP safety report, but this analysis makes the same class of mistake.
1/500k miles that includes the interstate will be very different from the rate for an urban environment.
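A toy illustration of how the mileage mix moves the blended rate (the per-environment rates here are invented for illustration):

    urban_rate = 1 / 150_000       # hypothetical crashes per urban mile
    highway_rate = 1 / 900_000     # hypothetical crashes per highway mile

    urban_miles, highway_miles = 30_000, 70_000    # a highway-heavy mix

    crashes = urban_miles * urban_rate + highway_miles * highway_rate
    print((urban_miles + highway_miles) / crashes)  # ~360,000 blended miles per crash,
                                                    # versus 150,000 for urban-only miles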
IMHO, this is not too bad! But as everyone coming from the software product industry knows, building features isn't the same as operating in practice and optimizing for the use case, which takes a ton of time.
Waymo has a huge head start, and it is evident that the "fully autonomous" robotaxi date is far behind what Elon is saying publicly. They will do it, but it is not as close as the hype suggests.