top | item 44359308

owenwil | 8 months ago

I don’t think this is a surprise to anyone who actually owns a Tesla (I own a Model Y). Full Self-Driving is just _bad_ compared with technology from Waymo et al.: it slams on the brakes suddenly for shadows, veers into the wrong lane, hesitates at pretty standard intersections, and doesn’t even understand basic concepts like school buses or trains. Here in BC, it completely ignores the 30 km/h school safety zones, which seems pretty basic.

My experience with FSD is that while it feels “magic” at times, it’s like a teenage driver you have to babysit constantly. It’s genuinely impressive how well it works given the really limited hardware, but if you use it routinely you know it will make at least one weird/dangerous choice on every trip.

Generally, I really don’t trust it in most situations except properly delineated highways, but even then it can be a crapshoot. If you’ve experienced FSD and then get in a Waymo, they are night-and-day different: a lot more predictable, and better able to navigate uncertainty than what Tesla has built. It’s likely down to a combination of software and Tesla’s insistence that radar doesn’t matter, but it clearly does.

I would never get in a Tesla that purports to drive itself; there’s no way it’s safe or worth the risk. I won’t even use it with my family in the car.

I know a handful of others who own Teslas and feel the same, despite what the fans spout online. I generally like my Model Y, but I definitely do not trust FSD, and I find it hard to believe that it’s even being taken seriously in the media. Not a great endorsement if even your own customers don’t trust it after using it.

manjalyc|8 months ago

A fun anecdote: a lot of people may remember Roomba from way back with their automated little vacuums. Roomba's market share declined significantly because they failed to adopt lidar as quickly as their competitors, depending on the bumper for as long as possible instead. This put them at a disadvantage in navigation and efficiency as their competitors moved to lidar. Combined with aggressive pricing from rivals, the expiry of its patented roller in 2022, and a weird insistence on not combining vacuuming and mopping in one device, Roomba (or iRobot now) is just a little fish in the sea it made.

moogly|8 months ago

> Roomba (or iRobot now) is just a little fish in the sea it made.

Perhaps more like plankton.

> The [...] company warned in its earnings results [on 12 March 2025] that there’s doubt about whether it can continue as a going concern.

havaloc|8 months ago

https://maticrobots.com/ - Lidar seems like a stopgap, check out this robot vacuum which works with vision only. I am not conflating a car and a vacuum, but it's an interesting technological exposition.

owenwil|8 months ago

100%. I enjoy telling lifelong Roomba users how far behind the technology is when they try to convince me to buy one! I've been using Roborock for a long time and it's pretty astounding how far ahead they are: full-on item analysis and avoidance (including poop!) being the big one for us, let alone just knowing their exact location within the house. A number of others have pushed it forward a whole bunch too, and the folks at Matic seem to have pushed it even further (not ironically, with just vision, which actually feels appropriate here). It's a shame it's not available in Canada, with no obvious plans to roll out here; I'd love to buy one: https://maticrobots.com/

Meanwhile Roomba seems to have done...pretty much nothing? Reminds me of the death of Skype when everyone transitioned to literally everything else while they floundered around.

shreezus|8 months ago

I mostly agree with you. I use both FSD on my Tesla & Waymo regularly (LA region), and Waymo just feels way safer in comparison. While FSD has improved significantly the last few iterations, I have seen it do "strange" things often enough that I don't feel safe just sitting in the backseat like I would with a Waymo.

Even if it's hypothetically 99% as good as Waymo at the moment, 99% is not good "enough" when it comes to something as critical as driving.
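A back-of-the-envelope sketch of why "99% as good" compounds badly: assuming independent per-mile failures and purely made-up rates (these are illustrative numbers, not measured FSD or Waymo statistics), the probability of at least one incident over a long trip grows quickly even when the per-mile rate looks tiny:

```python
# Illustrative only: hypothetical per-mile failure rates, not real-world data.
def p_incident(per_mile_rate: float, miles: float) -> float:
    """Probability of at least one failure over `miles` miles,
    assuming each mile fails independently (a simplification)."""
    return 1 - (1 - per_mile_rate) ** miles

# Two hypothetical systems, one 100x more reliable per mile than the other:
print(round(p_incident(1e-5, 10_000), 3))  # ~0.095 — roughly 1-in-10 odds
print(round(p_incident(1e-7, 10_000), 3))  # ~0.001
```

The point is that a 100x difference in per-mile reliability turns into the difference between "incidents are routine" and "incidents are rare" over the mileage a car actually accumulates.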

bdamm|8 months ago

It's probably also got to do with some magic behind the curtains; Waymo hasn't expanded much beyond their initial regions, so they've been geographically static for years. They very well may have hand-crafted expected paths, and obviously as region coverage goes up that kind of hand-crafting doesn't scale economically (due to changes in the environment, at least). So we can't really say how much work Waymo is putting into each mile. That's true of Tesla also, except that kind of work is totally antithetical to their entire approach from the very beginning, so it would be really surprising to find Tesla getting stuck in that specific local minimum, whereas we can almost expect it from Waymo.

As a counter-anecdote, I do use FSD with my family in the car. I've also used it on snowy roads and logging roads, and it does quite well. Not unsupervised-well, but better than I expected given that I'm running FSD on a nearly 6-year-old car. The number of trips around town that have been totally intervention-free has definitely been going up lately, and usually interventions have been because I wanted to be more aggressive, not because the car was making a major error or even being rough.

FireBeyond|8 months ago

> doesn’t even understand basic concepts like school buses or trains

Yeah, it would be hilarious if it weren't so horrifying. I remember watching a level crossing be represented as a weird traffic light that would go from red to off to red erratically, with a similarly erratic convoy of trucks representing the train.

Mind you, I remember people claiming FSD was "nearly done" because they'd "tackled all the hard problems, and were now in clean up", and how that meant they could let FSD take itself through a roundabout instead of just straight-lining through it. Never underestimate the power of denial.

BrandonLive|8 months ago

Today’s “FSD” has its limitations and requires supervision, but your description of it is not anything like my experience even on a HW3 vehicle. In fact, in many years of using Autopilot and various “FSD Beta” and “FSD (Supervised)” versions for several tens of thousands of miles I’ve literally never seen it “slam on the brakes suddenly for shadows” or “veer into the wrong lane”. I’m not a cult member and my next car won’t be a Tesla because I cannot support Musk after the horrible things he has done these last 2-3 years, but “FSD” is phenomenal when used appropriately and with the right expectations about what it is and what it isn’t. And it has improved a ton over the years, too.

The end-to-end solution was a real game changer, and while the previous solution was still useful and impressive in its own right, moving to the new stack was a night and day difference. With V13 finally taking advantage of HW4, and all the work they’ve been doing since then (plus upcoming HW5 introduction), it’s totally within the realm of possibility that they achieve viable L4 autonomy beyond this kind of small scale demo (and I hope some form of L3 maybe on HW4 before long for customer vehicles).

owenwil|8 months ago

I can give you a number of locations to visit in B.C. and the time of day for the shadows if you want to experience it for yourself! Hasn’t been fixed in four years yet. It has gotten less frequent in general though.

mebizzle|8 months ago

I have a Model 3 and it drives amazingly well, in Florida of all places. I've taken it all over back roads in FL/GL, with a little bit of AL and MS as well. The issues you described were much more prevalent when I got my car in '23, and I have genuinely watched them become fewer and further between as time passed. I've driven at least 10,000 miles on it in the last two years and have only had to intervene twice.

I have no motivation to be positive; I own no Tesla stock or position and just like it because it's the best car for me currently. I cannot emphasize enough just how different my lived experience has been from how you describe it.

owenwil|8 months ago

I am confident that two things can be true: a) it can be significantly better in some places than others, especially somewhere like Florida, which has a lot of large, wide roads that are probably better mapped than most places, creating a more stable experience; and b) their choice of hardware and software approach is obviously less safe given its limitations, and carries a number of compromises that introduce unpredictability versus other approaches.

It definitely has come a good way since I first got my car, but it's still _unpredictable_ and even seems to progress, then randomly regress, between releases. The big one is just navigating unpredictable environments, which is where Waymo is clearly far, far ahead.

In the real world, I think their approach has clearly hit a ceiling, and I definitely feel a lot safer sitting in a Waymo than a Tesla; I'm not sure the gap is going to narrow unless something drastic changes.

misiti3780|8 months ago

I have had the exact opposite experience. I literally don't drive anymore and I never have phantom braking problems. I'm on HW3.

nwienert|8 months ago

Shift some nouns and this is basically exactly what LLMs for coding are like, except the downside risk is "git revert".