No matter the cause of this accident or previous ones, Tesla's response highlights why I'll never buy one of their cars. When an accident occurs, the response by Tesla should be nothing more than "we will provide any and all data we have concerning this incident to the appropriate authorities when we receive a valid request for that information."
Musk tweeting about whether or not the driver purchased certain features is insane. He shouldn't have access to that data, let alone be allowed to publicize the information.
I recognise the risk — these days I assume that entirely innocuous facts about me today can become socially unacceptable in the future, having witnessed this happen to multiple people in my life already.
I also recognise that if Tesla doesn’t get out in front of every single incident, it may set back the replacement of human drivers with safer AI drivers.
I also also recognise that Musk is wildly overpromising on the self-driving tech, and I really only trust him as a rocket scientist and a salesman (whether the thing he’s selling is cars or visions of the future), not on digital privacy.
I don’t really trust any famous person for privacy, because they necessarily don’t have even close to as much of it as a normal person.
I have my mom texting me that I shouldn't drive my Tesla because of this, and that I should sell it and buy something else.
How many of those people will read the follow-up report from the NTSB?
This is literally the definition of a smear: shout something bad about a successful X, and by the time the actual truth comes out, the damage is already done and people have moved on.
> Musk tweeting about whether or not the driver purchased certain features is insane. He shouldn't have access to that data, let alone be allowed to publicize the information.
I agree with you up to this phrase. A company knows exactly what you purchased from them, so knowing what a certain customer bought is just a call away, and not only for the CEO.
Regarding publicizing whether you bought a certain feature or not: withholding that information while the media, and even the police, were blaming the non-existent feature for the accident is more than you could ask of most of us.
Choosing between risking a fine for divulging shopping data and taking a costly, unfair reputation hit while the police investigate? The decision seems obvious.
The story was widely reported as a driverless Tesla crash. In fact, there are no driverless Teslas. Furthermore the Autopilot driver assistance feature was not even used and could not have been used. It's too bad you will never buy one of their cars because of Musk trying to correct this story, because their cars are quite good.
If you actually were a potential Tesla buyer before this, you'd certainly be in the minority if this is what changed your mind. Most people see the transparency as a good thing, especially people with concerns about whether their own car might kill them. By getting in front of the story, he's doing his job. Correcting a false and derogatory narrative that threatens your differentiating feature in an extremely competitive market is absolutely the most important thing he should be doing.
However, prior to any tweets by Musk, the media had already claimed that autopilot was (likely) on, as there was no person in the driver's seat. Is it possible that Tesla's response would have been different if no such claims had been made initially?
It feels a bit odd for them to say that autopilot could not have been used in that area. "In a reconstruction using a similar Tesla, the agency found that Autosteer was not usable on that part of the road, meaning the Autopilot system couldn't have worked." Who's to say that Tesla didn't update something between the crash and when the NTSB tested autopilot in that area? I don't think they did, but are there any safeguards in place to detect that happening?
It's one thing to say that a car cannot do something because of a physical issue with the car, versus software that can be updated or have bugs that haven't been caught yet.
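For illustration only (this says nothing about Tesla's actual systems, and every name here is hypothetical): one generic safeguard against silent after-the-fact software changes is a tamper-evident, hash-chained log of firmware versions, where each entry commits to the one before it. A minimal sketch in Python:

```python
import hashlib
import json
import time

def entry_hash(entry: dict) -> str:
    """Hash of a log entry's canonical JSON encoding."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_version(log: list, version: str) -> None:
    """Append a firmware-version record that commits to the previous entry."""
    prev = log[-1]["hash"] if log else "genesis"
    entry = {"version": version, "time": time.time(), "prev": prev}
    entry["hash"] = entry_hash(entry)
    log.append(entry)

def verify(log: list) -> bool:
    """Recompute the chain; any retroactive edit breaks every later link."""
    prev = "genesis"
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        if e["prev"] != prev or entry_hash(body) != e["hash"]:
            return False
        prev = e["hash"]
    return True

log = []
append_version(log, "2021.4.12")
append_version(log, "2021.4.18")
assert verify(log)
log[0]["version"] = "2021.4.11"   # a quiet retroactive edit...
assert not verify(log)            # ...is detected
```

For this to mean anything to an investigator, the chain head would have to be anchored somewhere the manufacturer can't rewrite (e.g. deposited with a regulator); otherwise the whole chain can simply be regenerated.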
This point feels a little strained. There is a question: "Was autopilot in use?", and an experiment was performed to answer that question, which produced an informative result, which was reported. You seem to be attacking the linguistic structure of the statement by interpreting "couldn't" to mean some kind of existentially inviolate truth and not just... the result of the experiment.
I mean, come on. This is what people were saying within hours after the crash: autopilot as shipped simply doesn't have behavior consistent with this accident. It won't command the accelerations that would have been required on that tiny street and (as measured by the NTSB) won't even engage in that particular environment.
My wife's truck has a lane guidance system. It will gladly keep you in a well-marked lane. When there are no lines, or bad lines, it makes a loud ding and shows a big orange cancellation message on the dash display. This is a current-year F-150. I can only imagine Tesla has at least as good a system as Ford. As for updates, the infotainment system has a record of the last check-in and the last update, as well as the current system version, similar to your phone or computer. With the number of Tesla hackers out there, it seems nearly impossible for the conspiracy you're suggesting to be a reality.
The article says that it couldn't have been on autopilot because autopilot requires lane markings in order to be activated, and that area has no lane markings. It has nothing to do with software versions.
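As a toy illustration of that kind of gate (not Tesla's actual logic; the names and thresholds are made up): lane-keeping features typically refuse to engage unless the vision system reports lane lines with sufficient confidence.

```python
def autosteer_available(left_lane_conf: float, right_lane_conf: float,
                        min_conf: float = 0.8) -> bool:
    """Hypothetical engagement gate: both lane lines must be detected
    with high confidence before the feature can even be switched on."""
    return left_lane_conf >= min_conf and right_lane_conf >= min_conf

# On an unmarked residential street the detector reports ~zero confidence,
# so the feature can't be activated in the first place.
print(autosteer_available(0.05, 0.10))  # False
print(autosteer_available(0.95, 0.92))  # True
```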
Also, just because the driver was in the driver's seat at the time of being on camera doesn't preclude some sort of stupid behavior like leaving the driver's seat while on "autopilot".
All facts aside, people are still required to be in control of their cars and need to be legally able to intervene.
"But I used autopilot" is probably the most used excuse to avert fault away from the driver. Especially in the most broken insurance system on the planet where an accident like this leads to such a change in life that the driver is scared of the outcome.
Can you provide an example of a publication that "strongly implied or outright claimed" on their own that autopilot was involved?
At least in the mainstream press, they seem to have reported what the police were saying at the time, and indicated that they were quoting the police. For example, from the NYT [1]:
> Two men were killed in Texas after a Tesla they were in crashed on Saturday and caught fire with neither of the men behind the wheel, the authorities said.
> Mark Herman, the Harris County Precinct 4 constable, said that physical evidence from the scene and interviews with witnesses led officials “to believe no one was driving the vehicle at the time of the crash.”
They also attempted to get Tesla's side of the story, but Tesla didn't respond:
> Tesla, which has disbanded its public relations team, did not respond to a request for comment.
I'd be upset with the police for making such confident statements that were contradicted by later investigations. But the media, who may not have had access to the accident scene, seems to have done a reasonable job reporting, at least from what I've seen.
In an ideal world, maybe they'd do an independent investigation instead of relying on police statements, but it's a question of how to allocate resources across all the things they could be reporting. It probably makes it more difficult to do this when Tesla won't talk to them.
[1] https://www.nytimes.com/2021/04/18/business/tesla-fatal-cras...
These days, the fake news cycle is long gone by the time anyone knows any facts. The facts arrive too late. The clicks have been clicked. The damage has been done.
The media moves on to its next money-making event. No real repercussions are to be found.
It would also be great if we got retractions of all the "burned uncontrollably for hours" reports where the firemen "couldn't put it out, and Tesla didn't respond to calls to help".
Turns out, that was all nonsense too: https://www.caranddriver.com/news/a36189237/tesla-model-s-fi...
It was "strongly implied" because no one was in the driver's seat. Now there is evidence there was a driver, after an investigation. Why should they retract anything?
Another anecdote: an acquaintance let a friend borrow their Tesla Model X Plaid (or similar; I think this was before Plaid). Upon leaving their subdivision, the "friend" immediately totaled the X by launching it into a drainage ditch in Florida. I've never driven a Model X or any car with that much power. I'm surprised someone can get into so much trouble in 550 feet, but clearly people get caught off guard and don't let up on the accelerator.
I just rented a similar Model X (P100D) last week. They are remarkably fast for the size of the vehicle, but I found the accelerator pedal to be the closest thing to an ideal throttle control that I've experienced. It's extremely gentle and forgiving in slow/close situations (e.g. parallel parking), has instantaneous response and backing off of it brings an aggressive regen braking effect that is very featherable (?) and useful.
I would guess they just panicked in a strange and extremely expensive car and weren't able to lift their foot off the pedal once doom was imminent. It happened to me when I was a kid: I ran over a fence while turning my dad's girlfriend's car around in a tight parking lot. I accidentally goosed the throttle while backing up, panicked, then stomped it to the floor while backing over that poor fence. Very strange experience, almost like my leg was being shocked and I couldn't control it.
And he was arguing that the Plaid's 0-60 time of 1.99 seconds had an asterisk: it's measured with a 1-foot rollout.
Rollout is when you put your car's tires between the two light beams at the drag strip. When the lights change and the car starts moving, the point where the front tire clears the first beam is the 1-foot rollout location.
What I found interesting:
@ 1 foot, a Tesla will be going 5-6 miles per hour
@ ~100 feet, it will hit 60 mph
@ 550 feet... a P100D can probably be going pretty fast.
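As a sanity check, here's the constant-acceleration back-of-envelope behind those numbers (a sketch only; the constant-a assumption becomes generous well before 550 feet, where the car is power- and drag-limited, so the last figure is a loose upper bound):

```python
import math

MPH = 0.44704           # m/s per mph
FT = 0.3048             # m per foot

v60 = 60 * MPH          # ~26.8 m/s
t = 1.99                # claimed 0-60 time, seconds
a = v60 / t             # ~13.5 m/s^2, assumed constant throughout

def speed_at(dist_ft: float) -> float:
    """Speed in mph after dist_ft feet, assuming constant acceleration."""
    return math.sqrt(2 * a * dist_ft * FT) / MPH

print(f"@ 1 ft:   {speed_at(1):.1f} mph")    # ~6.4 mph
print(f"@ 100 ft: {speed_at(100):.0f} mph")  # ~64 mph (60 mph arrives at ~88 ft)
print(f"@ 550 ft: {speed_at(550):.0f} mph")  # ~150 mph on paper; real cars trap
                                             # roughly 115-125 mph over a full
                                             # quarter mile, so treat this as an
                                             # upper bound, not a prediction
```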
The thing I continue to find interesting is the fear involved with an "autopilot" crash (even if this was not one).
Every day around the world, hundreds (thousands?) of people are killed in standard car accidents, whereas, as a proportion of the market, driverless cars cause a tiny number of crashes. Yet people panic, regulators bare their teeth, and protesters speak up as soon as a single accident is caused by (or assumed to be caused by) a driverless car.
I guess it is the illusion of control in normal cars, despite the fact that electronics should be better in almost every regard for safer driving.
> despite the fact that electronics should be better in almost every regard for safer driving.
Maybe in the future. For now electronics are still handicapped by AI that is not anywhere even remotely close to humans.
"AIs" ability to reason about traffic situations and, by extension, planning are laughable and even if that properly worked it would still be handicapped by image recognition I wouldn't trust to tell a trashcan from a car.
At this point it's pretty obvious that until better AI comes along we're stuck with terrible. Certainly just pouring ever more resources into "current gen" NNs won't get us anywhere.
It's a matter of trust. Do you trust a robot to make life-or-death decisions? You can't empathize with the robot, you know it can't empathize with you, and you instinctively don't feel like you can understand or predict the actions of agents that you don't empathize with and don't think empathize with you. Humans make mistakes, but they're understandable or comprehensible mistakes; with a robot you don't know what types of mistakes to expect. You don't have a theory of mind for a robot.
> Every day around the world, hundreds (1000s?) of people are killed in standard car accidents
Hundreds? Lol. In India alone, 150,000 people die in road accidents every year. That is over 400 a day in one country. So worldwide it must be well into the multiple thousands.
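The arithmetic, with the commonly cited WHO global estimate added for scale (an outside figure, not from this thread):

```python
india_per_year = 150_000
print(f"India: {india_per_year / 365:.0f} deaths/day")   # ~411

world_per_year = 1_350_000   # WHO global estimate, circa 2018
print(f"World: {world_per_year / 365:.0f} deaths/day")   # ~3,700
```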
There is a group of people who are into cars and who agree that autopilot is generally safer. They consider themselves exceptional drivers (and to be fair, they often are pretty good drivers) and therefore think that in their particular case it's safer not to use it.
Of course, making that determination based on individual feelings is dubious (and perhaps there is, or soon will be, no person outperforming the AP).
I really don't understand how they can make this statement:
> The report states that when the car started, security video shows the owner in the driver's seat, contradicting reports at the time of the April 17 accident that the seat was empty when the car crashed.
Those two things aren't contradictory at all. The car's journey could have started with the driver in the driver's seat, but the seat could have been empty when the crash occurred. A lot can happen in the time between the start and the end.
I suspect they tried to activate Autosteer, didn't realize they had failed, and let TACC (which did activate) drive them straight into some trees. So much for automatic emergency braking.
Automatic emergency braking systems don't generally brake for stationary objects once the vehicle is going beyond parking speeds. They're really designed to reduce collision forces between vehicles in motion (or recently in motion), and some more recent systems try to reduce forces between vehicles and pedestrians.
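A toy illustration of why that happens (a sketch of the general radar-filtering idea, not any manufacturer's actual logic; the threshold and names are invented): radar sees enormous numbers of stationary returns (signs, bridges, parked cars, trees), so above low speeds these systems commonly discard targets that have never been observed moving.

```python
def should_brake(ego_speed_mps: float, target_speed_mps: float,
                 target_ever_moved: bool) -> bool:
    """Crude AEB target filter, illustrative only.

    target_speed_mps: target's absolute speed over ground.
    target_ever_moved: whether the tracker has ever seen this target move.
    """
    PARKING_SPEED = 3.0  # m/s, ~7 mph; hypothetical threshold

    if ego_speed_mps <= PARKING_SPEED:
        return True  # low-speed AEB brakes for anything in the path
    # At road speeds, never-moving returns are discarded as clutter,
    # which is exactly why a tree or a parked fire truck gets ignored.
    return target_ever_moved

print(should_brake(20.0, 0.0, target_ever_moved=False))  # False: tree, parked truck
print(should_brake(20.0, 0.0, target_ever_moved=True))   # True: car stopped ahead
```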
A Tesla is happy to slam into a parked emergency vehicle at full speed, as has been demonstrated several times; it's not surprising they don't stop for trees either.
It's weird that the NTSB drove another Tesla on the same road and concluded that, because autopilot wasn't available for their car, it couldn't have been on in the crash.
Aren't vehicle logs a thing? What if there was a bug which mistakenly turned it on in the original car? What if there was a software update sometime between the crash and the investigation which changed the behavior? The article even says that the investigators recovered the damaged "black box", but a month later we have no details about it?
Edit: changed year to month /facepalm
> The car’s restraint control module, which can record data associated with vehicle speed, belt status, acceleration, and airbag deployment, was recovered but sustained fire damage. The restraint control module was taken to the National Transportation Safety Board (NTSB) recorder laboratory for evaluation.
Also, the accident was less than two months ago, not two years. It's likely they're still recovering and analyzing data.
Teslas log a tremendous amount of data, but my guess is that it was melted in the fire and unable to upload before it was destroyed. There is a black box with some logs and I believe 8 photos that are taken when the airbags deploy but perhaps the fire destroyed that too.
- The gross generalization of saying that it couldn't have been enabled because they failed to enable it in their specific situation.
- The gross extrapolation of saying that the driver must have been in the seat when it happened because the driver was in the seat when the car started.
I mean, how useless an investigation is that? Where is the log analysis, the witness reports, the camera images from points closer to the crash site?
I feel very disappointed in the NTSB.
Nothing to see here. They were just going 30 and hit a tree randomly, then got in the back seat and burned to death over 4 hours in a fire that was impossible to put out, with the doors non-operational.
Oh, and I'm sure all these brake failures in China have nothing to do with it: https://www.cnbc.com/2021/04/23/tesla-in-china-pressure-moun...