The desire to have a car that drives itself is so strong that it overrides people's self-preservation. They'll pay kilodollars for unsafe systems. Regulation is the only solution I can see to prevent buggy beta software from unleashing mayhem on the road.
Reading the facts of the accident central to this investigation, I imagine most meat-bag driving systems would probably have hit the car too. It's not like the car ran into a stopped police car with flashing lights on.
On the other hand, it's hard to build safe self-driving cars without actually deploying them in the real world. If we can tolerate some level of risk, we can get to self-driving cars sooner.
Every year we delay having safe self-driving cars, about 50,000 people die on the road who don't need to die.
Have you used it? I find that when you use it as designed and pay attention, it is much safer and easier to drive. Especially on road trips, I arrive fresher than I would if I were manually driving the entire time. Certainly there are dangers, but there are already huge benefits.
I'd rather have slightly unsafe self-driving cars than completely unsafe human-driven cars where the human is distracted by whatever's on their phone, kids screaming, accidents on the road, etc.
The current mayhem on the road is killing 40K people every year in the US, and I'm not scared of that. So you'll have to excuse me for not caring about this hypothetical software induced mayhem.
There needs to be law or case law so that manufacturers are proportionally responsible for the harm their driver-assistance features cause. It's the only way it'll affect their bottom line enough for them to take safety as seriously as they should.
It takes thousands of the smartest, highest paid people to make AI work (sort of) most-of-the-time in closed systems like chat interfaces and image generation. There is also no risk to life there.
Trying to get this stuff to work on the road is a fool's errand. Just work on warehouses and factory floors. There is at least a business model there and the world is usually climate controlled. What a huge waste of human capital pursuing an impossibility with this self driving car nonsense.
Huge waste of human capital kind of applies to the auto industry in general. There are so many makes and models and variations all requiring advanced engineering and manufacturing. We could solve a lot of other big problems w/ those engineers.
I don't understand the general tenor and sentiment of the comments here. I've taken dozens of self-driving car rides in SF in the last few months. They were magical, and _they work_.
It blows my mind that the general defeatist tone is "it can't be done", while it's literally happening right now. It took a bajillion dollars and decades of work, but we're past the tipping point now.
Sure, regulation must happen; it's not like a chatbot, where the worst-case scenario for screwing up is being canceled for a week. Lives are on the line. But an outright "it's impossible and must be stopped" is literally against progress.
Yes, Tesla's FSD sucked, and they generated a lot of bad reputation, both for themselves and the industry. But FSD v12 (end-to-end ML), their latest release, is leaps and bounds ahead of v11. I only used to use v11 on relatively empty highways, like cross-country road trips.
With v12, I leave it on 95% of the time; their cameras see more than I do, and process things quicker than I can. The onus is still on me to pay attention, and I do. Yes, there will be idiots who don't. But then again, there are idiot drunk drivers as well.
I am beginning to believe that in a year, Tesla v12 will be really really good, and safer on the road than an average human driver. It probably already is. I haven't researched the stats.
But the current state of the art is Waymo: at this point, a Waymo is actually safer than human drivers. People need to take a few rides in them to believe it. It's almost a solved problem to navigate on city roads.
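To put numbers on "safer than an average human driver": the claim reduces to fatalities per mile driven. A quick back-of-envelope sketch in Python, where the ~1.3 deaths per 100M vehicle-miles baseline and ~3.2 trillion annual US miles are rough ballpark assumptions, and the AV rate is purely hypothetical:

```python
# Back-of-envelope comparison of annual road deaths at different
# per-mile fatality rates. All figures are rough assumptions, not
# researched stats.

HUMAN_RATE = 1.3        # assumed US baseline: deaths per 100M vehicle-miles
US_ANNUAL_MILES = 3.2e12  # assumed ~3.2 trillion vehicle-miles per year

def annual_fatalities(rate_per_100m_miles: float, miles: float) -> float:
    """Expected deaths per year for a given per-100M-mile fatality rate."""
    return rate_per_100m_miles * miles / 100e6

human = annual_fatalities(HUMAN_RATE, US_ANNUAL_MILES)
# Hypothetical AV fleet at half the human rate:
av = annual_fatalities(HUMAN_RATE / 2, US_ANNUAL_MILES)

print(f"human-driven: ~{human:,.0f} deaths/year")   # ~41,600
print(f"hypothetical AV fleet: ~{av:,.0f} deaths/year")  # ~20,800
```

Under these assumed inputs the human baseline lands near the ~40K/year figure cited elsewhere in this thread, which is why even a modest per-mile improvement translates into thousands of lives.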
I jumped into my co-worker's Tesla to go to lunch. I asked him if he ever uses the self-driving feature. He turned it on, and in less than a second the car veered directly into the middle turning lane. I watched him yank at the wheel and disable the self-driving mode, explaining, "It's good but sometimes it does that".
It's hard to look at a system where you have "AI" directly causing human deaths and not have a knee-jerk reaction that it can't be done and should be regulated out of existence, even if it's objectively safer than humans. It's an emotional position but as they say it's nigh impossible to reason someone out of a position they didn't reason themselves into.
What you are describing is essentially anecdata in the grand scheme of things.
Yes, there are absolutely scenarios and situations where FSD is a solved problem. The issue is that, relative to all of the situations that occur across the country (and across the world) daily, the percentage of daily miles driven where FSD can perform flawlessly is likely less than 5% of total miles.
It's not about if it can be done or not. Waymo is very impressive, and they continue to expand their scope. Good on them. But their sensor suite is very expensive.
BlueCruise is not self-driving. Neither is Tesla's Full Self Driving; despite the name, read their letter to the CA DMV.
Both of those systems will absolutely ignore stationary vehicles in the path of travel when travelling at highway speeds.
The world doesn't end at SF's borders, nor does it revolve around them.
I can come up with tons of corner cases where I simply won't risk my life and my whole family's just because some tech bro said so on the internet. And you know, the tons of corner cases that I sometimes experience all over the world sum up to some major percentage.
By all means be a betatester, but don't force it down the throats of unsuspecting non-tech users who often trust what manufacturers claim.
hervature | 1 year ago:
Careful with absolute statements [1].
[1] - https://www.vice.com/en/article/pkadgm/man-dies-by-suicide-a...
Foofoobar12345 | 1 year ago:
I'm excited for what the future holds.
brk | 1 year ago:
FSD can be done, IMO, but it can't be done today.