top | item 43016931

Tesla Cybertruck Drives Itself into a Pole, Owner Says 'Thank You Tesla'

207 points | computerliker | 1 year ago | thedrive.com

219 comments


MiscIdeaMaker99|1 year ago

I've owned a Model 3 for years now, and FSD is scary as hell. We haven't paid for it -- and we won't -- but every time we get a free trial of it (most recently this past Fall), I give it a whirl, and I end up turning it off. Why? Because it does weird shit like slow down at an intersection with a green light. I don't feel like I can trust it, at all, and it makes me more anxious than just using standard auto-steer and cruise control (which still ghost brakes sometimes). I don't get why anyone uses FSD.

jvanderbot|1 year ago

Don't even get me started. Here's a list of things my Model Y regularly does:

- Try to accelerate to 45mph in a parking lot b/c it was within 10ft of the road

- Decelerate from highway speeds suddenly to 30mph, as though it saw something it might hit (I stopped it at 30-ish and hit the gas)

- Decelerate to 50mph because of "emergency vehicles" even though there were no vehicles around (sometimes it mistakes lights that strobe b/c they are seen through median dividers as "emergency lights")

- Take up two lanes because they gradually separated and the car thinks it should stay evenly between the left and right divider line

- Choose absolutely bonkers speed limits, like 30mph on two-lane country highways

- Stop on the highway with a big red screen and a message that says "Take control now fatal error"

- Not so much a problem any more, but when I was first getting used to it, it would beep a message at me, then scold me for looking at the message (and not the road), then ask me to do some kind of hand grip on the wheel to prove I'm paying attention, but I have to look at the message to figure out what it wants.

My wife tells me "Just keep your foot on the gas to keep up the speed and your hands on the wheel to keep it in line," and I am just left wondering what FSD is for.

ipython|1 year ago

What’s amazing to me is that FSD to this day cannot recognize active school zones. Even my 6-year-old Audi's onboard cameras can do that. If you put FSD on and go through a school zone, the Tesla will happily zip through at full speed, completely ignoring it.

kypro|1 year ago

I hope this isn't too off-topic, but I'm always amazed at how many otherwise smart people hold the naive belief that FSD is remotely close because 99.9% of the time it works fine.

Self-driving, in my opinion, will require an AI that is, if not fully general, at least very close to general intelligence.

Why?

Because in the real world, to drive a car as well as a human across all of the edge cases a human can handle, you probably need something approaching general intelligence.

Humans understand that a person isn't just something with 4 limbs, but can also be that thing that looks like a white sheet with eyes by the side of the road on Oct 31st. And it's these weird edge cases that humans instinctively understand, because they have a deep world model to reason with, which the narrow FSD AI systems we currently have do not.

When you think about what humans need to do when driving it's so much beyond just watching the road and turning a wheel that it seems almost absurd to imagine our current AI is anywhere near capable of handling all of the edge cases humans currently are.

And I also don't buy the argument that the goal should simply be to reduce the total number of accidents per mile... I'd grant it's very possible that FSD could reduce the total number of accidents per mile driven, because most miles are driven in the much narrower environment of highway driving. And there AI probably could do a better job than a human on average, once you factor in human tiredness and distractibility. But no one is going to be comfortable with FSD occasionally plowing into a group of kids outside a school just because the total number of people who die in road traffic accidents is statistically reduced on a per-mile basis.

I'd be interested if anyone strongly disagrees.

fredfoobar|1 year ago

It's surprising to hear about these issues with FSD, and it makes me nervous even though I haven't encountered any problems in v13. I regularly use it back and forth between work and home, mostly during rush hour, with a lot of difficult merges and weird situations.

shepherdjerred|1 year ago

I test drove a model 3 ~3 years ago and FSD was terrifying. I had no idea what it could or couldn’t do. IMO Hyundai (and others with similar features) have it perfect with adaptive cruise + active lane assist. I know exactly what it can do, it does 90% of the driving on long trips, and it doesn’t do so much that I’m tempted to put too much trust in it.

karlgkk|1 year ago

It's crazy, because any negative criticism of FSD will have a ton of fanboys pouring out of the walls to tell you how great it is, how great the latest update is, how your anecdotal "evidence" is not typical, etc.

Except all you have to do is go try it and it becomes clear to any layperson that it's probably getting there but, and this is really crucial, it's not there yet.

brightball|1 year ago

It’s a little hit or miss (pun intended). I’ve used it off and on in my model 3 for the past 2 years and the rate of improvement has been significant.

Still though it has quirks.

On long trips, I LOVE it. LOVE it. Being able to just tap in and relax, make phone calls, listen to an audiobook, etc is so nice. The first time I ever used it I had to leave early from the All Things Open conference in Raleigh because I was getting sick. Having it essentially drive me home for 5 hours when I wasn’t well, including stopping to charge, was a huge relief.

It’s also great in traffic jams where you’d otherwise be dealing with stop and go traffic until you get through it. Just tap in and relax til you’re on the other side.

Day to day driving, it’s a little more iffy. I’ve dealt with seemingly random slowdowns on otherwise empty roads. It feels odd especially because it’s sudden.

Early on it would have difficulty on roads without well marked lines too.

I’ve never felt like it was going to run into an object though. Usually it errs on the “too cautious” side and I just take over to get where I’m going quicker.

axus|1 year ago

I always slow down a little bit when speeding through intersections, just in case I need to react to someone illegally putting themselves in my path.

It was this guy's fault for not monitoring the car, but also Tesla's for using a doublespeak name like Full Self-Driving.

If FSD is a statistically significant enough risk factor for injury above Teslas that don't use it, it should be banned.

leesec|1 year ago

I'm on AI4 v13 and haven't had a safety intervention in several thousand miles. It's incredible and extremely smooth.

halyconWays|1 year ago

While we're sharing anecdotes, I have FSD (13 now) on my model Y and love it. I was anxious at first and remain very guarded while using it (which you're obligated to do anyway) but it's taken away a lot of the tedium and fatigue of commuting and long highway driving. I occasionally use it door-to-door but often turn it off on certain roads, where I'll do a better job avoiding potholes, for example. It's not done anything unsafe, but it did change lanes once without signaling and I had to intervene. No one was around and the lines were hard to see, so perhaps that's why. Overall it feels like a far safer driver than a lot of people I've been in the car with.

dexzod|1 year ago

> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.

This could have easily killed an innocent pedestrian or bicyclist. How is this the best safety engineering? If FSD failed, there should have been a secondary system to detect the imminent collision and apply the brakes.

benhurmarcel|1 year ago

Clearly the owner showed that this aspect is not important to them when they ordered the Cybertruck.

Freedom2|1 year ago

The US has no pedestrian safety regulations at all for car design. Some have been proposed but it's 2025 and still nothing enacted.

porphyra|1 year ago

> If the FSD failed there should have been some secondary system to detect an imminent collision and apply brakes.

There actually is. The Automatic Emergency Braking functions separately from FSD and can prevent collisions in some cases. It doesn't work 100% of the time so I wouldn't rely on it, but at least it works as well as or better than competitors' systems.

sandworm101|1 year ago

It's passive-aggressive sarcasm. If you say mean things about Tezla on X there is a chance you may be banned/sued/delisted, especially if it involves a crash. So everything has to be couched in false praise. Nobody really thinks the Cybertruck does better in a crash than a Merc or BMW. It's just something said by the poster in order to get their story to a wider audience.

decimalenough|1 year ago

Passive safety is the art of engineering cars so that when they do crash, the occupants are unharmed.

What you're asking for, though, is definitionally impossible: obviously the cameras didn't detect the obstacle, so FSD or no, they can't react to it. The actual solution would be to do what every other car maker with self-driving pretensions does and augment the cameras with LIDAR or other sensors.

nashashmi|1 year ago

Post has 8M views as of this writing. Owner doesn't want this message to go viral, because then Tesla gets flak for it. Takes the blame. Wants to share the message of FSD's fallibility with everyone. Praises Tesla for safety.

My head hurts with how oxymoronic this is. My best guess is he wants to critique Tesla without triggering the ego and arrogance of its owner. “Thank you sir for doing great work and for fixing this problem in the future.”

MisterTea|1 year ago

> My best guess is he wants to critique Tesla without triggering the ego and arrogance of its owner.

Well I imagine that since Musk was handed the keys to various government agencies and installed his henchmen you can see why you want to tread lightly and kiss the ring. Such a wonderful future this is becoming.

sirbutters|1 year ago

The very fact that this is a Cybertruck owner already tells you you're dealing with a fanboy (aka cult member). Edit: an oversimplification of course, but not far off from the truth.

ikanreed|1 year ago

And my guess is there's a touch of ideology to this person that questioning Tesla and FSD's fundamental safety would hurt. "I screwed up" does a lot less to cause cognitive dissonance than "Something I believe is wrong"

There's a lot of possible flavors to that ideology, it COULD be right wing political affinity, but it also could be a belief that technology is superior to human judgement, or that self driving cars are the future, or it could just be that spending 6 figures on an ugly pickup wasn't a waste of money.

xboxnolifes|1 year ago

I think you're reading way too much into how people think.

generalizations|1 year ago

Worth reading the actual tweet, not just the article's truncation of it

> Soooooo my @Tesla @cybertruck crashed into a curb and then a light post on v13.2.4.

> Thank you @Tesla for engineering the best passive safety in the world. I walked away without a scratch.

> It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb.

> Big fail on my part, obviously. Don't make the same mistake I did. Pay attention. It can happen. I follow Tesla and FSD pretty closely and haven't heard of any accident on V13 at all before this happened. It is easy to get complacent now - don't.

> @Tesla_AI how do I make sure you have the data you need from this incident? Service center etc has been less than responsive on this.

> I do have the dashcam footage. I want to get it out there as a PSA that it can happen, even on v13, but I'm hesitant because I don't want the attention and I don't want to give the bears/haters any material.

> Spread my message and help save others from the same fate or far worse.

https://x.com/MrChallinger/status/1888546351572726230

weaksauce|1 year ago

how is this not a cult? if a toyota or mercedes car veered off into a light pole would they write the same tweets?

unless this is sarcasm that can, at best in the current times, be construed as serious.

boringg|1 year ago

Right, the whole point was to not draw attention to all the negative news. Which is funny, because this thread is immediately full of complaints.

msikora|1 year ago

"Big fail on my part, obviously." WTF??? Big fail to buy a Tesla, and even bigger to use FSD. With my almost 20-year-old car I don't have to worry about any of this BS.

nuancebydefault|1 year ago

The cybertruck owner is clearly only interested in their own safety. Luckily, in my country cybertrucks are not allowed on the road, for other people's safety.

luma|1 year ago

Weirdly, they seem more concerned for Tesla the company than they are for themselves.

InitialLastName|1 year ago

One doesn't buy a truck like that out of concern for other people.

FirmwareBurner|1 year ago

>in my country cybertrucks are not allowed on the road

Not in my EU country either; however, they have already been spotted on the streets with valid license plates. There are loopholes everywhere, usually if you classify it as a commercial utility vehicle for a business instead of a passenger car. There are plenty of people with money and no scruples.

smitelli|1 year ago

Some cars, when I see photos of them smashed up, I get very sad. NA Miata, Corvette C4, etc. A totaled Cybertruck, honestly, good riddance. It is an extraordinarily difficult vehicle to love.

Very glad to hear no pedestrians got hit. Really hope the driver takes some kind of lesson away from this experience.

spprashant|1 year ago

I really doubt they are taking any lessons from this. This is the author's second Cybertruck crash in a month.

If I didn't know better, I'd think they were trying to farm engagement.

qwerpy|1 year ago

I drive one and I love it! I wanted a large, robust self-driving family EV and as a bonus it looks unique compared to the blobs that everyone else drives.

It's sad that we CT drivers seem to be caught in a crossfire between Tesla and Tesla/Elon haters, when all we want to do is enjoy our cars.

arghandugh|1 year ago

This is because Tesla‘s implementation hasn’t worked, doesn’t work, can’t work, won’t work, won’t ever work, and has been a decade-long intentional fraud from a con artist that was designed to pump up a meme stock. THAT worked.

hooo|1 year ago

I am no fanboy, but I have it on a 2023 model Y. It works incredibly well. It is actually already amazing. You all are just blind from Elon hate.

padjo|1 year ago

And how

smallpipe|1 year ago

Another proof that self-driving cars with human backups should never be allowed on public roads. It's going to be used unsafely, because it encourages such behaviour.

luma|1 year ago

Specifically, SAE level 3 should be explicitly prohibited. Humans have proven, over and over, that they can’t handle that level of alertness while also not driving.

standardUser|1 year ago

Waymo has been operating near-flawlessly for years now in some busy, complicated cities. I don't see self-driving tech as the problem. It's been proven to work. I see irresponsible companies as the problem.

thinkingtoilet|1 year ago

I wouldn't say "never", however it's clear we're not there yet.

curiouser3|1 year ago

if you want to see the true horror, check out https://comma.ai. $2000 and plugs into most cars made in the past few years, works by using cracked security for the cars it is "compatible" with. These people are on the road next to you with a car being driven by a single smartphone camera. They sell it as "chill driving" but they have a discord where people just flash custom FSD firmware.

bko|1 year ago

Ah yes, the compensating behavior theory all over again. Replace "seat belts" with "driver assist"

> This paper investigates the effects of mandatory seat belt laws on driver behavior and traffic fatalities. Using a unique panel data set on seat belt usage rates in all U.S. jurisdictions, we analyze how such laws, by influencing seat belt use, affect traffic fatalities. Controlling for the endogeneity of seat belt usage, we find that it decreases overall traffic fatalities. The magnitude of this effect, however, is significantly smaller than the estimate used by the National Highway Traffic Safety Administration. Testing the compensating behavior theory, which suggests that seat belt use also has an adverse effect on fatalities by encouraging careless driving, we find that this theory is not supported by the data. Finally, we identify factors, especially the type of enforcement used, that make seat belt laws more effective in increasing seat belt usage.

[0] http://www.law.harvard.edu/programs/olin_center/papers/pdf/3...

mingus88|1 year ago

And yet FSD and even lane assist are going to be safer than the driver near you who is scrolling and typing away on their smartphone

Don’t mistake my post as a defense of FSD or Tesla. They’ve been lying about their capabilities for what feels like a decade.

I don’t want to see FSD and human drivers share a road. I want all cars to be meshed and communicating their intents with vehicles around them to avoid collisions. We will never see that in our lifetime

mmastrac|1 year ago

At some point these reckless drivers need to start going to jail. I realize it's not going to happen in the US because the government has been captured, but there's clearly some missing messaging where these drivers don't get the point that they need to be paying attention and not tweeting on their phone while their car drives into a lamppost.

fredfoobar|1 year ago

why stop at FSD users? this should apply to all drivers in general. if they cause an accident they need to go to jail.

huijzer|1 year ago

Interesting comments from X:

Snowball: "So FSD failed but you still managed to find a way to praise Tesla. You failed too for not taking over in time. But your concern isn't for the lives of third parties that You and FSD endangered. No, you are worried about Tesla getting bad publicity. You have misplaced priorities."

Jonathan Challinger (the driver who crashed): "I am rightly praising Tesla's automotive safety engineers who saved my life and limbs from my own stupidity, which I take responsibility for. [...]"

Fair points from both sides I think.

rexpop|1 year ago

If he's taking responsibility, why did he specify a semantic version?

godelski|1 year ago

There was a post on BlueSky about this the other day. Someone linked a picture of the intersection: https://bsky.app/profile/pickard.cc/post/3lhtkghk6q224

It is worth noting that this picture is a reply to a screenshot of someone saying the following:

> I've lived in 8 different states in my life and most roads I've seen do everything they can to prevent human error (or at least they do once the human has shown them what they did wrong). The FSD should not have been fooled this easily, but the environment was the worst it could have been, also.

Tweet source: https://x.com/MuscleIQ2/status/1888695047044124989

I point this out because I think the biggest takeaway here is how often people will bend over backwards to reach the conclusion they want, rather than update their model on new data (akin to Bayesian updating, for you math nerds). While this example is egregious, I think we should all take a hard look at ourselves and question where we do this too. Not one of us is free of resistance to changing our beliefs, yet doing so is probably one of the most important things we can do if we want to improve things.

If we have any hope of not being easily fooled by hype, of differentiating real innovation from cons, of avoiding Cargo Cults, then this seems to be a necessity. It's easy to poke fun at this dude, but are we all certain that we're so different? I would like to think so, but I fear that making such a claim repeats the same mistake I/we are calling out.

UncleEntity|1 year ago

> And rather than being upset with Tesla for selling him a smart dumpster on wheels, the driver took blame for the incident, saying he should have been paying attention.

In all fairness he really should have been paying attention.

You don't get to abdicate your responsibility to Team Elon because reasons. At the end of the day you will be sitting in the defendant's chair while Tesla will just quietly settle out of court.

wccrawford|1 year ago

Abdicate, not advocate.

And just because it's the driver's responsibility doesn't mean it's not also Tesla's.

srameshc|1 year ago

I never knew of such an outcome. Probably that is why it's advised not to agree to anything immediately after an accident.

stretchwithme|1 year ago

I think it would be wise to physically test as many corner cases as possible under extreme conditions. At night, in the snow, going down a hill, birds flying across the road at the same moment a baby robot crawls on to it.

nomel|1 year ago

This, obviously, is one of the ongoing efforts and goals of Tesla AI, and the reason they collect so much data. There's some talk about it in the Lex Fridman podcast(s)[1].

[1] https://www.youtube.com/watch?v=cdiD-9MMpb0

Animats|1 year ago

Was this statement made before or after the driver was contacted by Tesla?

Animats|1 year ago

Musk is currently claiming that Tesla will have driverless robotaxis on the road in June 2025.[1]

Be afraid. Be very afraid.

Tesla is in a bind. They've been promising self driving Real Soon Now since 2016, with occasional fake demos. Meanwhile, Waymo slowly made it work, and is taking over the taxi and car service industry, city by city.

This is a huge problem for Tesla's stock price and Musk's net worth. Now that everybody in automotive makes electric cars, that's not a high-margin business any more. Tesla is having ordinary car company problems - market share, build quality, parts, service, unsold inventory. Tesla pretends they are a special snowflake and deserve a huge P/E ratio, but that's no longer the reality.

Tesla doesn't want to test in California because of "regulation". This is bogus. The California DMV is rather lenient on testing driverless cars, and California was the first state to allow them. There was no new legislation, so the DMV just copied the procedures for human drivers with a few mods. Companies can get a "learner's permit" for testing with a safety driver easily, and quite a few companies have done that. The next step up is the permit for testing without a safety driver, which is comparable to a regular driver's license. It's harder to get, and there are tests. About a half dozen companies have reached that point. No driving for hire at that level. Finally there's the deployment license, which Waymo and Zoox have. That's like a commercial driver's license, and is hard to get and keep. Cruise had one, but it was revoked after a crash where someone was killed.

That's what really scares Tesla. The California DMV can and will revoke or suspend an autonomous driving license just like they'd revoke a human one. Tesla can't just pay off everyone involved and go on.

Waymos are all over San Francisco and Los Angeles, dealing with heavy traffic, working their way around double-parked cars, dodging bikes, skateboarders, and homeless crazies, backing out when faced with an oncoming truck in a one lane street, and doing OK in complex urban settings. Tesla has never demoed that level of performance. Not even close.

[1] https://www.reuters.com/technology/tesla-robotaxis-by-june-m...

Animats|1 year ago

News update: Musk's attempt to buy OpenAI is referred to by an analyst firm as a "distraction" from Tesla's poor performance.[1] "Even with TSLA meeting its June 2025 timeline for driverless cars in (Texas), we still see TSLA as one of several autonomous technology providers, suggesting competition on price and performance."

Tesla's P/E ratio is currently 328. Ford's is around 9. GM's is around 8. Evaluated as a mature car company, TSLA is maybe 20x - 30x overpriced compared to the rest of the industry. A hype injection is needed to keep the price up. Optimus and fake self-driving aren't enough.

[1] https://www.investors.com/news/tesla-stock-distraction-elon-...
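The multiple comparison above can be checked with quick arithmetic. A rough sketch, using only the P/E figures quoted in the comment (not live market data):

```python
# P/E multiples as quoted in the comment above (not live market data)
pe_ratios = {"TSLA": 328, "Ford": 9, "GM": 8}

# How many times a "mature car company" multiple TSLA trades at
tsla = pe_ratios["TSLA"]
premiums = {name: tsla / pe for name, pe in pe_ratios.items() if name != "TSLA"}

for name, premium in premiums.items():
    print(f"TSLA trades at roughly {premium:.0f}x the {name} multiple")
```

On these raw numbers the premium is about 36x-41x; the comment's 20x-30x estimate presumably grants Tesla some residual growth premium.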

datadrivenangel|1 year ago

The front end of the truck is impressively smashed. It must have been going quite fast?

ec109685|1 year ago

There isn’t anything in the front that is rigid so it’s designed to crumple like that to absorb the energy maximally.

I think if a car with an engine had hit it, the pole would have been knocked over rather than any sort of t-bone happening.

hermitcrab|1 year ago

Your defective autopilot will drive you into a post and you will be grateful.

Gigachad|1 year ago

xAI has determined your existence to be inefficient after you said something detrimental to the Tesla share price.

throw7|1 year ago

"Big fail on my part, obviously."

Hello? Whether it's Full Self-Driving or not, it's always your fault.

drewg123|1 year ago

I've had a 2017 Model X since new that came with FSD. I had Tesla upgrade the FSD computer (for free), and drove like a granny during the FSD trial period, when you needed to have a certain "score" (which was mostly dictated by not cornering or braking hard) to be eligible for FSD.

I try it every major release, and am disappointed every time. In situations where I'd be confident, it is overly cautious. In situations where I'd be cautious, it's overly confident and dangerous.

I think its best use is to keep the car in the lane while I'm distracted by something (pulling out a sandwich to eat, etc). And it seems like newer Teslas have eye tracking, so it might not even be useful for that.

agumonkey|1 year ago

Is there a term for when society shifts the frame of reference on all dimensions, so that anything that would once have been deemed odd / wrong now naturally ends up positive?

wswope|1 year ago

Overton window.

It originated as a political term, but can apply to social norms too.

r00fus|1 year ago

Doublethink? Dystopian?

andix|1 year ago

I've seen the tweet before, and the issue I have is: one person is claiming FSD crashed their car. No video, no other evidence.

I'm not saying it wasn't FSD, but it is a possibility FSD wasn't even enabled.

jeffbee|1 year ago

As a general rule, a person who knows the patch level of FSD in their Cybertruck is a person who is about to say something unimaginably stupid.

belter|1 year ago

“Think of how stupid the average person is, and realize half of them are stupider than that.”

  ― George Carlin

01100011|1 year ago

Think of how stupid the average person is, and then realize intelligence seems to be normally distributed and that most people are approximately that smart with a small number being dumber or smarter than that person.
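The distribution point above can be sketched numerically. A rough illustration only, assuming IQ-style scores that are (by construction) normally distributed with mean 100 and SD 15, and that numpy is available:

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated IQ-style scores: normal by construction (mean 100, SD 15)
scores = rng.normal(loc=100, scale=15, size=1_000_000)

# For a symmetric distribution the median coincides with the mean,
# so exactly half the population is below average...
median_gap = abs(np.median(scores) - scores.mean())

# ...but roughly 68% fall within one standard deviation of the mean,
# so most of that "stupider half" is only barely below it
share_within_1sd = np.mean(np.abs(scores - scores.mean()) <= 15)

print(f"median vs mean gap: {median_gap:.3f}")
print(f"share within 1 SD:  {share_within_1sd:.1%}")
```

So Carlin's quip is technically right about the median, but the correction above also holds: most people cluster tightly around it.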

simonjgreen|1 year ago

I often wonder how far Tesla would have progressed FSD if it wasn’t for the zealous hatred of LIDAR.

ilamont|1 year ago

“It failed to merge out of a lane that was ending (there was no one on my left) and made no attempt to slow down or turn until it had already hit the curb. Big fail on my part, obviously.”

mlinhares|1 year ago

Social media has made people so insane they actively incriminate themselves for views.

bdangubic|1 year ago

"my significant other repeatedly beat the living shit out of me over and over again... obviously my fault, I deserved it..."

rsynnott|1 year ago

This is a particularly extreme case of ‘I love my Tesla, but…’

layer8|1 year ago

I’m mostly impressed that the pole held up.

bdangubic|1 year ago

exactly my first and 1,345th thought! :)

gargalatas|1 year ago

obviously he was... taking a nap.

ck2|1 year ago

one spooky sub is https://old.reddit.com/r/CyberStuck

FSD is clearly not even beta quality

people keep saying it's "trying to commit suicide"

and it's being fixed on the fly at everyone else's cost in life

But now they are removing federal reporting requirements so buyers will NEVER know

viraptor|1 year ago

> removing federal reporting requirements

Got a link for that one?