top | item 33087247

STRML | 3 years ago

There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.

If this were a planned move, the software would already be ready to replace it. It is not. Cars without radar still aren't at parity with the cars that have it, although they "fixed" this a few weeks back by simply turning off the radar on the older cars.

The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this, and downright dishonest to characterize it as some kind of leap forward.

The cheapest bargain-bin cars have this feature. Your $100k+ Model S or X will not.

Even more concerning: some features simply cannot be replaced by cameras. Tesla infamously does not have a front bumper camera, making it impossible to detect obstructions occluded from view by the hood. For the 3/S, which sit low, this means low obstructions like too-high concrete wheel stops will be undetectable. Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.

Laughable that this is being done on cars from a high-end brand, and that the remedy for customers who are getting worse cars today than they would have gotten yesterday is to wait for these features to "be restored" at some indeterminate time in the future.

aeternum|3 years ago

With the removal of radar, they showed some pretty convincing data that the radar was too noisy to be useful, especially at discriminating between things like highway overpasses and stopped cars. They showed that vision could already outperform radar.

With these parking sensors it's different. They have no current alternative and it will cause a pretty significant loss of functionality. Disappointing move.

rrss|3 years ago

> They showed that vision could already outperform radar.

important to note that much like there is a wide range from bad cameras (“filmed with a potato”) to high resolution cameras, there is a wide variety of radars with different capabilities.

the radar unit in Teslas was pretty limited (basically designed for adaptive cruise control), and they showed that vision could outperform that radar (and have no interest in exploring non-vision approaches because “humans can drive with just eyes”)

Qworg|3 years ago

I'd love to see the convincing data RE: radar. They're using the same radars as every other car that has emergency braking keyed off the car in front of the car that's in front of you. I've not heard of many "phantom braking" incidents from these other vehicles, though I may have missed something.

TheDudeMan|3 years ago

Radar has a longer wavelength than visible light, making it better at some things and worse at others. It's a crime not to be using as much of the EM spectrum as possible for an application like this.

ohgodplsno|3 years ago

Some pretty convincing data that they handpicked to make themselves look good and pretend they don't need radar or lidar, and that vision alone is good enough.

Perpetuating a pattern of lies from Tesla, ever since they started on self driving.

cma|3 years ago

More data shouldn't actually hurt; if nothing else, it can be used as a Bayesian prior on the camera data when the vision output is uncertain.
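A minimal sketch of the kind of precision-weighted fusion this is alluding to, assuming both sensors report a distance with Gaussian noise (the numbers here are made up for illustration; function and variable names are hypothetical):

```python
def fuse_gaussian(mean_a, var_a, mean_b, var_b):
    """Precision-weighted fusion of two independent Gaussian estimates.
    The fused estimate is always at least as certain as the better sensor."""
    precision = 1.0 / var_a + 1.0 / var_b
    fused_var = 1.0 / precision
    fused_mean = fused_var * (mean_a / var_a + mean_b / var_b)
    return fused_mean, fused_var

# Camera thinks the obstacle is at 10.0 m but is uncertain (glare, darkness);
# radar says 8.0 m with a much tighter variance, so it dominates the result.
mean, var = fuse_gaussian(10.0, 4.0, 8.0, 0.25)
```

The point of the sketch is that the extra sensor never makes the estimate worse: the fused variance is strictly smaller than either input's, and the fused mean leans toward whichever sensor is currently more confident.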

darkwater|3 years ago

> With these parking sensors it's different. They have no current alternative and it will cause a pretty significant loss of functionality. Disappointing move.

I'm especially worried about dark garages / parking structures. Tesla cams already have enough problems when it's too dark outside.

sangnoir|3 years ago

> They showed that vision could already outperform radar.

Except on emergency vehicles parked in a lane at an oblique angle, which Teslas did not recognize, and plowed into at speed. I wonder what unknown secondary effects this change will bring.

LeifCarrotson|3 years ago

> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.

I have observed that this is how vision is currently implemented, but does it have to be this way? I can pull up to a concrete wheel stop in my Toyota, without vision and without the sonar enabled, even though my eyesight is occluded by the hood, because I know where it was and how far I have moved. Concrete wheel stops do not flicker out of existence when you stop looking at them; the car should be able to monitor the wheel speed sensors and shift the 3D map of the world into the camera blind spots, perhaps showing a "hidden" wireframe on the cameras.

It would be inconvenient if you were unable to pull forward because a tumbleweed or (more likely) plastic bag rolled through, but you could back up and try again or the human could decide to ignore the beeping.

Granted, I'm not a domain expert, but we do this in my field of industrial robotics when building models with 3D vision. The computer can composite multiple profiles into a single higher-resolution image, can return data about that model when the camera on the EOAT has moved such that the field of view is limited, and can provide faults when the model does not match a previous image because something has been added or removed while the camera wasn't watching.
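The dead-reckoning idea above can be sketched in a few lines. This is a toy, not how any vehicle actually does it: no wheel slip, no uncertainty growth, a made-up hood cutoff distance, and all names are hypothetical.

```python
import math

class PersistedObstacle:
    """Keep tracking an obstacle after it leaves the camera's view by
    dead-reckoning from wheel odometry (distance travelled + heading change)."""

    def __init__(self, x, y):
        # Obstacle position in the vehicle frame when last seen
        # (x metres forward of the car, y metres to the left).
        self.x, self.y = x, y

    def on_odometry(self, distance, heading_change_rad):
        # Re-express the stored position in the new vehicle frame:
        # translate backward by the distance driven, then rotate by
        # the negative of the heading change.
        x = self.x - distance
        c, s = math.cos(-heading_change_rad), math.sin(-heading_change_rad)
        self.x = c * x - s * self.y
        self.y = s * x + c * self.y

    def hidden_by_hood(self, hood_cutoff=1.5):
        # Once the obstacle is closer than the cameras' ground-visibility
        # cutoff, it is in the blind spot, but we still know where it is.
        return 0.0 < self.x < hood_cutoff

# Wheel stop last seen 3 m ahead; creep forward 2 m in a straight line.
stop = PersistedObstacle(3.0, 0.0)
stop.on_odometry(2.0, 0.0)
```

After the 2 m creep the wheel stop sits 1 m ahead, below the hood line, yet its position is still known, which is exactly the "shift the 3D map into the blind spot" behavior being proposed.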

gizmo|3 years ago

Possible, sure. But not trivial, and there is no pressing need to get rid of these sensors right now. Why the rush, if there is no supply chain issue?

stetrain|3 years ago

The inverse is an issue though.

The car parks and turns off, with no obstacles in front.

While parked a child/dog/bucket of concrete materializes directly in the front blindspot.

Next time the car drives, it has no knowledge of this obstruction.

This isn't a huge problem for a car driven by a human, since you can and should check around the car before driving, or will have other context clues that there is a child or dog running around your car. But it seems incompatible with the stated goal of making all current cars into Robotaxis.

m463|3 years ago

> The parking sensors on the Tesla are really good and very useful.

I respectfully disagree.

I think their primary use case is parking. Tesla does a really good job with automatic parking, and it IS useful especially for parallel parking.

But honestly, I have curbed my rims and hung up my underbody front spoiler on a curb and they did not help.

Really there is an opportunity for cameras to protect the car from nearby curbs and park in the same way.

But like you said, current Tesla camera placement doesn't have enough coverage, especially in front.

At a minimum, I think they should add a front camera if they're going to remove the ultrasonic sensors.

And realistically, they should solve the ultrasonic problem of curbs with the cameras.

dagmx|3 years ago

Tesla is probably the WORST at automatic parking. Here’s a comparison against some other competitors.

https://youtu.be/nsb2XBAIWyA

My own model Y has maybe successfully automatically parked 2/10 times. Most of my friends and coworkers have similarly low success rates.

MarkMarine|3 years ago

I don't use the automatic parking, but I find the ultrasonics quite useful when parking myself. Maybe I've been luckier or just more hyper-careful, but I've never scratched the front or the rims on a curb. The feedback about the shape of the obstruction and its distance from you is great, much better than my 2014-era vehicle. I would like to see some lower sensors, maybe a single-pixel lidar to look for curbs, but I think that is going the wrong way with component count for them.

To just say that these features, pretty standard on any other car at this price point, are "coming soon" is laughable given Tesla's delivery cadence. I'll wait for vision-based parking; I'm sure it's coming right on the heels of full self driving in 2019 (not here), smart summon in 2019 (delivered in what, 2021, and the only thing I've seen it do is nervously back out of a spot and then try to merge onto a surface street instead of picking me up), the Tesla Semi, the Tesla Cybertruck, and now a robot, which is hilarious because they couldn't even get the million-dollar industrial robots to work on their line. Can't say I've got a lot of faith in some $20k robot from Tesla.

No, they overpromise and basically just don't deliver. Why anyone would take Tesla's word for this, I don't know.

yreg|3 years ago

Automatic parking is part of a premium package, I don't have it. I do rely on the sensors often, particularly on the front ones.

This removal is mind-boggling without adding a front camera.

pengaru|3 years ago

> downright dishonest to characterize it as some kind of leap forward

Does anyone expect anything else from Tesla at this point?

capableweb|3 years ago

Does anyone expect anything else from any corporation? Of course they're not going to say "I'm sorry, we had to make your experience worse"; of course they will try to spin it into something positive. Almost every for-profit company does this. Not that this makes it any less shitty, but it's reality.

rob74|3 years ago

> The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this, and downright dishonest to characterize it as some kind of leap forward.

My 4-year-old Ford Focus doesn't show an outline of obstacles around the car, but it still provides Park Assist and Autopark. Park Assist especially is something everyone expects from every reasonably equipped car nowadays. So yeah, maybe understandable if it's due to supply difficulties, but still a bad move...

a_t48|3 years ago

My 2021 Jetta GLI doesn't have these features. :(

pclmulqdq|3 years ago

It looks like they are transparently trying to raise their margins. I guess they are more supply-constrained than they thought. The "smart car of the future" from Tesla now has no sensors other than cameras, and not very many of those.

tablespoon|3 years ago

> It looks like they are transparently trying to raise their margins. I guess they are more supply-constrained than they thought. The "smart car of the future" from Tesla now has no sensors other than cameras, and not very many of those.

Don't worry, it won't be a problem as long as they can keep the marketing budget up.

rbanffy|3 years ago

> There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.

I like the idea of coalescing multiple sensors into one, but I can't shake the idea that relying on vision alone, when you can sense depth through ultrasound or LIDAR, is a terrible idea. You can fake depth data from multiple cameras, but it takes more processing, and additional input should be a good thing. Did anyone ever try to fool a Tesla with the Looney Tunes painted-tunnel trick?

> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.

Any sensible navigation software should "know" that objects don't cease to exist. I hope Tesla's does.

peteradio|3 years ago

High-end as defined by what, besides price? Not necessarily snark; I haven't heard too much good about the fit and finish...

pclmulqdq|3 years ago

People who have never driven a car worth more than $50,000 love the luxury feel of their Teslas. It's been really interesting to see how many people had a Tesla as their first expensive car.

esotericimpl|3 years ago

This is the thing people don't realize: as someone who has owned a luxury car before, I would not call the Tesla Model 3 a "luxury car", despite it playing in the same price space as the BMW 3 Series.

It's still one of the best cars I've ever owned, but the fit, finish, and feel are not luxury. My 2013 Ford Focus felt more solid than the Model 3 I own.

drcross|3 years ago

When you have a several-month wait list for your product, legions of devoted potential customers, and you're head and shoulders above any of your competitors, you're able to get away with things like this.

hansonkd13|3 years ago

> if you're head and shoulders above any of your competitors, you're able to get away with things like this.

The hubris of Tesla is still believing this to be the case. I personally cancelled my Cybertruck order out of lack of interest; better competition exists in 2022 than was the case years ago.

throwaway5959|3 years ago

Especially if your customers don’t give a shit about the safety of others.

davedx|3 years ago

> The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this,

Perhaps the occupancy network is a good replacement? You're characterizing this as if Tesla Vision won't ever be able to replace it: "impossible to detect obstructions occluded from view by the hood".

What if the neural net persists where everything in the world is, and can extrapolate where objects are as the car moves? Then it's better than raw cameras; blind spots would actually be reduced.

I agree it's a dick move to remove it on cars that were already ordered though. Very Tesla.

ethanbond|3 years ago

I mean how do we know they’re not going to just staple human eyes and brains to the car and then it’ll be way better than what’s on the market today?

Why dream up hypotheticals of how this is good when it’s actually just… bad?

zaptrem|3 years ago

What happens when a small object rolls into that blindspot? What happens when the car goes into deep sleep and the AP computer state is lost? Reloading all those nets takes like 40 seconds on startup from deep sleep.

captainmuon|3 years ago

It would be possible to use cameras instead of ultrasound, but that would require mounting a lot more cameras in the dead zones, as you said. One of the coolest things I've seen was on a BYD, which has cameras mounted pointing downward all around. It creates a fake bird's-eye view when you are parking or close to an obstacle. It looked really convincing, as if there were a drone above the car.
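The core trick behind such surround-view systems is a planar homography: each downward camera is calibrated against the ground plane, its image is warped into a common top-down frame, and the warped views are stitched. A toy sketch of the warp step, using a pure-scaling homography as a stand-in for real calibration (function name and the "1 pixel = 2 cm" figure are invented for illustration):

```python
import numpy as np

def warp_points(H, pts):
    """Map pixel coordinates through a 3x3 homography (projective warp).
    A real system derives H from the camera's intrinsics and its pose
    relative to the ground plane; here H is a toy scaling matrix."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])   # to homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]              # back to Cartesian

# Toy homography: pure scaling, as if 1 image pixel covered 2 cm of ground.
H = np.diag([0.02, 0.02, 1.0])
ground = warp_points(H, [[100, 50], [200, 50]])        # ground-plane metres
```

In practice the warp also corrects lens distortion and perspective foreshortening, which is where the "stretching/warping trickery" look comes from, but the mapping itself is just this matrix multiply applied per pixel.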

ryukoposting|3 years ago

I had a Lexus ES 350 loaner the last time my (22-year-old) Lexus was in the shop; it had the same feature you describe in the BYD. There was obviously some stretching/warping trickery being done to create the bird's-eye image, but it was really impressive regardless.

luckydata|3 years ago

Just a note that I have a Model 3 with USS and it can't detect low obstacles like concrete wheel stops either.

ajross|3 years ago

So... your first sentence is very likely true. But it doesn't match your argument two paragraphs down:

> The cheapest bargain-bin cars have this feature. Your $100k+ Model S or X will not.

The implication being that if cheap "bargain-bin" cars have these things, there can't be a supply chain problem. In fact it's likely these parts aren't available in quantity anywhere right now, and every manufacturer is having to deal with it. It's just not news when Ford has to rejigger options on a bunch of models to adjust to demand. But with Tesla, yikes.

It's just so exhausting the level of emotion this brand drives. Every decision they make is An Affront to All Sensibility to someone, it seems like.

FWIW: I like the parking sensors too, they're helpful. But good grief, maybe tone down the outrage?

hef19898|3 years ago

One company can have supply chain issues while another does not, even in the same region for the same parts from the same suppliers. Crazy, I know.

justapassenger|3 years ago

It’s not news, because Ford doesn’t sell supply chain shenanigans as a feature. Tesla does, and has multiple times already.

ethanbond|3 years ago

Could you share a source about Ford removing safety sensors on cars that were already ordered?

jmartin2683|3 years ago

Not to be rude, but this is so far off base as to be laughable. You don’t have to understand much about how the system works (really no more than what is contained in the last AI Day presentation) to understand why none of this is a problem (specifically the occlusion thing).

panick21_|3 years ago

> Cars without radar still aren't at parity with the car that have it

Tesla Vision outperformed Tesla with Radar in the official tests.

fredoliveira|3 years ago

I guess I'll do the obvious and ask for a source.

I saw the Tesla Vision keynote from Karpathy, if that's what you mean. Wouldn't call that a source for "official" tests.

nopenopenopeno|3 years ago

like being first in line to buy a touch bar macbook pro

throwawaylinux|3 years ago

In what world is cost and supply chain not engineering?!?

petilon|3 years ago

> There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.

Not true. Elon Musk has long maintained that cameras/vision is all that is needed [1]. After all, that's all a human has.

[1] https://techcrunch.com/2019/04/22/anyone-relying-on-lidar-is...

ouEight12|3 years ago

> After all, that's all a human has.

As a human that has only eyes, I've smashed the underside of my front bumper on many a curb over the years in various cars, because sight alone, blocked by the hood and front end of a car, is a crappy way of judging things like that.

I say this as someone who absolutely loves his Model 3... but a step back is a step back, no matter how one spins it.

Radar-delete was a user-experience negative for me as well. My Autopilot experience today is far more "jerky" in stop-and-go traffic than it ever was in the glory days of radar distancing data, so much so that AP is now banned on anything but smooth-sailing highway driving when I have motion-sickness-prone individuals in the car. On top of that, it's now (rightfully, given it has lost the radar data) far more touchy about weather conditions before it'll even engage. :(

pclmulqdq|3 years ago

Your eyes are an incredible set of cameras. They have huge dynamic range, extremely accurate and fast depth sensing capabilities, huge focal distance range, and mounts that move incredibly quickly and precisely to point at details.

There are a lot of good reasons why you shouldn't think that today's cameras are equivalent to human eyes. There are a lot of good reasons to believe that cameras will never be equivalent to human eyes.

phinnaeus|3 years ago

I love the "humans do it with only two cameras!" comment. It's so funny to see people disregard the human brain, as if any computer today can hold a candle to the visual processing prowess it contains.

I say this as a happy Model 3 owner: Elon is remarkably stupid (or perhaps outright disingenuous) sometimes.

seanhunter|3 years ago

> After all, that's all a human has.

And a general intelligence. Let's not forget about that tiny little detail.

jaimex2|3 years ago

Not really; both USS and radar give unreliable output. The cars you mention would all have phantom beeping from the USS and unreliable adaptive cruise where the radar suddenly doesn't see the car ahead.

The Tesla vision on the other hand works like a more focused human driver. It can remember that wheel stop is there.

Agree they shouldn't have jumped the gun with production till it was ready though.

While the USS sensors are crap they probably should keep them for things like small children playing behind cars.

justapassenger|3 years ago

Vision also gives you unreliable output. The real world is messy and hard; sensor fusion and dealing with unreliable data are bread and butter.

No one said it’s easy to build that stuff. Well, no one except Elon Musk, to be more precise.