There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.
If this were a planned move, the software would already be ready to replace it. It is not. Cars without radar still aren't at parity with the cars that have it, although they "fixed" this a few weeks back by simply turning off the radar on the older cars.
The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this, and downright dishonest to characterize it as some kind of leap forward.
The cheapest bargain-bin cars have this feature. Your $100k+ Model S or X will not.
Even more concerning: some features simply cannot be replaced by cameras. Tesla infamously does not have a front bumper camera, making it impossible to detect obstructions occluded from view by the hood. For the 3 and S, which sit low, this means it will be impossible to detect low obstacles like too-tall concrete wheel stops. Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
Laughable that this is being done on cars from a high-end brand, and that the remedy for customers who are getting worse cars today than they would have gotten yesterday is to wait for these features to "be restored" at some indeterminate time in the future.
With the removal of radar, they showed some pretty convincing data that the radar was too noisy to be useful, especially with discriminating things like highway overpasses vs. stopped cars. They showed that vision could already outperform radar.
With these parking sensors it's different. They have no current alternative and it will cause a pretty significant loss of functionality. Disappointing move.
> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
I have observed that this is how vision is currently implemented, but does it have to be this way? I can pull up to a concrete wheel stop in my Toyota, without vision and without the sonar enabled, even though my line of sight is occluded by the hood, because I know where the stop was and how far I have moved. Concrete wheel stops do not flicker out of existence when you stop looking at them. The software should be able to monitor the wheel speed sensors and shift the 3D map of the world into the camera blind spots, perhaps showing a "hidden" wireframe on the cameras.
It would be inconvenient if you were unable to pull forward because a tumbleweed or (more likely) plastic bag rolled through, but you could back up and try again or the human could decide to ignore the beeping.
Granted, I'm not a domain expert, but we do this in my field of industrial robotics when building models with 3D vision. The computer can composite multiple profiles into a single higher-resolution image, can return data about that model when the camera on the EOAT (end-of-arm tooling) has moved such that the field of view is limited, and can provide faults when the model does not match a previous image because something has been added or removed while the camera wasn't watching.
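To make that concrete, here's a toy sketch of an odometry-shifted occupancy map in Python. The grid, the function names, and all the numbers are invented for illustration; this is just the idea, not anything Tesla actually runs:

```python
import numpy as np

# Toy occupancy grid kept in the vehicle frame: 10 cm cells covering
# 20 m x 20 m around the car. Cells hold a rough P(occupied).
CELL = 0.1
grid = np.zeros((200, 200), dtype=np.float32)

def mark_obstacle(x_m, y_m, p=0.9):
    """Record an obstacle the cameras saw at (x, y) meters from the car."""
    i, j = int(round(x_m / CELL)) + 100, int(round(y_m / CELL)) + 100
    grid[i, j] = max(grid[i, j], p)

def apply_odometry(dx_m, dy_m):
    """Shift the map by the ego-motion from the wheel speed sensors,
    so obstacles persist after they leave the cameras' view.
    (Forward/left motion only, for brevity.)"""
    global grid
    di, dj = int(round(dx_m / CELL)), int(round(dy_m / CELL))
    grid = np.roll(grid, shift=(-di, -dj), axis=(0, 1))
    if di > 0:
        grid[-di:, :] = 0.0  # cells that rolled in from the edge are unknown
    if dj > 0:
        grid[:, -dj:] = 0.0

# A wheel stop seen 3 m ahead stays in the map as the car creeps
# forward 2.5 m, even after the hood occludes it.
mark_obstacle(3.0, 0.0)
apply_odometry(2.5, 0.0)
assert grid[105, 100] > 0.5  # still known: 0.5 m ahead of the car
```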
> The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this, and downright dishonest to characterize it as some kind of leap forward.
My four-year-old Ford Focus doesn't show an outline of obstacles around the car, but it still provides Park Assist and Autopark. Park Assist in particular is something that everyone expects from every reasonably equipped car nowadays. So yeah, maybe understandable if it's due to supply difficulties, but still a bad move...
It looks like they are transparently trying to raise their margins. I guess they are more supply-constrained than they thought. The "smart car of the future" from Tesla now has no sensors other than cameras, and not very many of those.
> There are multiple tells that this move, just like the removal of radar, is supply-chain and cost-driven and not actually driven by engineering.
I like the idea of coalescing multiple sensors into one, but I can't shake the idea that relying on vision alone, when you can sense depth directly through ultrasound or lidar, is a terrible idea. You can fake depth data from multiple cameras, but it takes more processing, and additional input should be a good thing. Did anyone ever try to fool a Tesla using the Looney Tunes painted-tunnel trick?
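For what it's worth, the "more processing" part is easy to see in the standard OpenCV recipe for stereo depth. The image files and calibration numbers below are made up for illustration:

```python
import cv2
import numpy as np

# Rectified grayscale frames from a hypothetical calibrated stereo pair
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching: for each patch in the left image, search along the
# epipolar line in the right image. This search is the extra work
# that lidar and ultrasound simply don't need to do.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# depth = focal_length * baseline / disparity
FOCAL_PX = 700.0    # focal length in pixels (assumed calibration)
BASELINE_M = 0.12   # camera separation in meters (assumed)
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
```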
> Once the hood has occluded the view from the windshield cameras, these obstructions will cease to exist.
Any sensible navigation software should "know" that objects don't cease to exist. I hope Tesla's does.
When you have a several-month waitlist for your product, legions of devoted potential customers, and a product that's head and shoulders above any competitor's, you're able to get away with things like this.
> The parking sensors on the Tesla are really good and very useful. They show a rough outline of what is around you and are displayed very nicely compared to some other cars. It is a real shame to remove this,
Perhaps the occupancy network is a good replacement? You're characterizing this as if Tesla Vision won't ever be able to replace it: "impossible to detect obstructions occluded from view by the hood".
What if the neural net persists where everything in the world is, and can extrapolate where objects are as the car moves? Then it's better than raw camera coverage alone; blind spots would actually be reduced.
I agree it's a dick move to remove it on cars that were already ordered though. Very Tesla.
It would be possible to use cameras instead of ultrasound, but that would require mounting a lot more cameras in the dead zones, as you said. One of the coolest things I've seen was on a BYD, which has cameras mounted pointing downwards all around. It then creates a fake bird's-eye view when you are parking or close to an obstacle. It looked really convincing, as if there were a drone above the car.
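As far as I can tell that kind of view is usually plain inverse perspective mapping: warp each downward camera onto the ground plane with a calibrated homography, then stitch the tiles. A minimal single-camera sketch, with placeholder correspondences you'd normally get from calibration:

```python
import cv2
import numpy as np

frame = cv2.imread("down_camera.png")  # hypothetical downward-facing camera

# Pixel positions of four known ground-plane points in the image...
src = np.float32([[420, 580], [860, 580], [1100, 720], [180, 720]])
# ...and where those points land in the top-down view (1 px = 1 cm here)
dst = np.float32([[100, 0], [300, 0], [300, 200], [100, 200]])

# Homography that flattens the camera view onto the ground plane
H = cv2.getPerspectiveTransform(src, dst)
topdown = cv2.warpPerspective(frame, H, (400, 400))

# A full surround view repeats this for each camera and blends the
# overlapping warped tiles into a single "drone shot" of the car.
```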
So... your first sentence is very likely true. But it doesn't match your argument two paragraphs down:
> The cheapest bargain-bin cars have this feature. Your $100k+ Model S or X will not.
If cheap "bargain-bin" cars have these things, then there is no supply chain problem. In fact it's likely these parts aren't available in quantity anywhere right now and every manufacturer is having to deal with it. It's just not news when Ford has to rejigger options on a bunch of models to adjust to demand. But with Tesla, yikes.
It's just so exhausting the level of emotion this brand drives. Every decision they make is An Affront to All Sensibility to someone, it seems like.
FWIW: I like the parking sensors too; they're helpful. But good grief, maybe tone down the outrage?
Not to be rude, but this is so far off base as to be laughable. You don’t have to understand much about how the system works (really no more than what is contained in the last AI Day presentation) to understand why none of this is a problem (specifically the occlusion thing).
Not really; both USS and radar give unreliable output. The cars you mention would all have phantom beeping from the USS and unreliable adaptive cruise where the radar suddenly doesn't see the car ahead.
Tesla Vision, on the other hand, works like a more focused human driver. It can remember that the wheel stop is there.
Agreed they shouldn't have jumped the gun on production before it was ready, though.
While the USS are crap, they probably should keep them for things like small children playing behind cars.
Cars delivered today without USS will not have: "Park Assist: alerts you of surrounding objects when the vehicle is traveling <5 mph." (among other features like Summon)
This is the most basic feature that almost every car has had for years - but not your $60k Model Y. Why can't they wait to remove sensors until they've achieved feature parity with the vision only system? This seems crazy.
I don't really understand the push to do everything through the cameras. This has caused other problems too, like their decision to control the automatic wipers with just the camera, which meant that in my Model 3 the automatic wipers were erratic and ineffective while in every other car a normal $5 rain sensor made the feature work great.
I guess they think their computer vision and AI can catch up to and exceed the visual processing capabilities of vertebrates, with their millions of years of evolution. I think they are wrong. Multi-modal sensor fusion is the only sane path for the foreseeable future, in my lay opinion.
Because if the hardware isn't available they can't deliver the car and recognize the revenue. They can wait, but if they're going to remove it anyway, shipping now lets them make more cars at lower cost in less time. Seems logical, if it is indeed at parity.
My 2018 VW GTI locked that functionality behind a several-thousand-dollar interior package or a few-hundred-dollar dealer software update :( . I literally have the sensors and the software; they're just turned off because I don't like leather seats.
This does not make me optimistic for Tesla's future. Unless this is truly a temporary thing due to a supply shortage and they are planning to return to normal when they can.
Tesla's track record is poor. They cheaped out on a few dollars for an IR rain sensor like everyone else uses, and as a result the wipers on a Model 3 are psychotic. Combine that with their hatred for manual controls, and I had a terrible experience every winter when the rains really started.
Now they don't have radar, which everyone else does. AFAIK they still haven't got that reliable IR sensor. No CarPlay yet, which 99% of all other cars have. No 360 degree camera view. No ultrasonic parking sensors. Why again would I buy another Tesla? The competition has more features.
The cherry on top is Tesla's demonstrated willingness to screw over their customers well after the sale, and not even for the benefit of the company. Just because they hate their customers, I guess. That alone is a deal killer for me now.
As an engineer, I see little value in removing any sensors. No matter how noisy or limited they are, they bring additional information which, if processed, could improve overall accuracy. This is commonly referred to as "sensor fusion". So this is probably a purely cost-saving move. While I understand how radar could be expensive, I am not sure the savings on much cheaper ultrasonic sensors are worth the loss of additional information about the environment.
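The textbook version of that point fits in a few lines: inverse-variance fusion of two noisy range estimates always tightens the result. The numbers here are purely illustrative:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighting of two noisy distance estimates.
    The fused variance is always smaller than either input:
    more sensors, better estimate."""
    k = var_a / (var_a + var_b)
    return est_a + k * (est_b - est_a), (1 - k) * var_a

# Camera says the wall is 0.52 m away but is noisy up close;
# the ultrasonic sensor says 0.47 m with much tighter variance.
est, var = fuse(0.52, 0.04, 0.47, 0.01)
print(round(est, 3), round(var, 3))  # 0.48 0.008 -- beats either sensor alone
```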
As a Tesla Model 3 owner, I can attest that their automatic parking rarely works anyway. They should be adding more sensors to improve it, not removing them.
I've worked on 3D vision systems before and have experimented with different types of sensors in multi-sensor setups, and I'd argue that overwhelmingly it's a software problem.
More sensors help, but past a point the gains are extremely marginal, plus you're throwing the majority of the raw data away anyway.
I do agree though - assuming processing power isn't a performance bottleneck here, more data is always better. But there's a reason humans only have two eyes and not three or four, for example. Two eyes are really all you need.
“In the near future, once these features achieve performance parity to today’s vehicles”
Right... like how vision vehicles have parity with autopilot speeds and following distance? Oh wait they don't.
Vision-only vehicles are still limited to a max of 85 mph and a 2-car minimum following distance for Autosteer, while radar cars get 90 mph and a 1-car following distance.
Texas has 85 mph highways. The two-car following distance constantly leads to cars cutting in between you and the car in front of you because the gap is so large.
edit: Just to clarify, when it says "1 car length", the car adjusts the amount of space depending on speed. So it's not literally just one car length.
I have had a Model 3 for four years and the autopilot has steadily gotten worse.
In summer 2019 we went on a road trip that was 10 hours of total driving (each way), where I actually drove for maybe 30 mins total. It was flawless and downright magical.
Since then, the car has slammed on the brakes out of the blue a number of times (once waking up my sleeping family and causing both kids to cry).
I have lost trust in the autopilot system. It’s that simple. And I LOVE my Model 3, the Supercharger network, and all the rest.
I don’t know what’s going on over there but I am not remotely convinced that they have their act together with vision-only based autonomous driving.
This would potentially work... if the cameras could see the places where the ultrasonic sensors were. But they can't! There are big blind spots near the bumpers. I guess they are planning to just have the car fill in the blind spots by remembering what it saw from other perspectives? But that only works when the car is moving and can't account for small obstacles that can move unpredictably, like animals or toddlers.
I'm sympathetic to the idea that it's possible to drive a car with just cameras, but the specific cameras that Tesla has aren't good enough. They need more coverage, better quality, and self-cleaning.
> Will vehicles equipped with ultrasonic sensors have their functionality removed?
> At this time, we do not plan to remove the functionality of ultrasonic sensors in our existing fleet.
I really really do not like the use of the phrase "our existing fleet" here. It heavily implies that Tesla has some level of ownership over the vehicles they sell post-purchase.
Does anyone else think they're running out of compute power on their DL computer and trying to eliminate sensor fusion in order to preserve precious cycles for vision processing?
I've long wondered if TSLA screwed up by choosing the hardware before fully solving the problem, leaving millions of vehicles incapable of performing safe FSD without HW upgrades.
This will absolutely not work. I have FSD, and it messes up the distance of large, well-defined objects; think about how this will work with a concrete wall, or with curbs it probably can no longer see.
When I pull into my garage sometimes it recognizes my shelf as a car ON FSD....
My rear view camera regularly gets covered with dirt and grime after a highway run during the monsoons. The car has both camera and radar so that I can have reliable assistance while reversing. With this experience in mind I fail to see how Tesla’s camera-only move is reliable for the use cases that they have mentioned.
When parking in tight spaces, getting exact-distance feedback from the ultrasonic sensors is pretty handy. I can't see how vision will replace that; I don't think any of the cameras can see that close to the bumper.
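Part of why that feedback is so exact is that ultrasonic ranging is nearly trivial; the sensor just times its own echo:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C

def echo_to_distance(round_trip_s):
    """An ultrasonic sensor pings and times the echo; the obstacle
    distance is half the round trip at the speed of sound."""
    return SPEED_OF_SOUND * round_trip_s / 2

print(echo_to_distance(0.0029))  # ~0.50 m from bumper to wall
```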
I read this Somewhere(tm); if anyone can confirm it I'd be grateful:
The issue with Tesla's camera-only neural network approach is that it needs a TON of cycles to determine the distance of every point in the image.
With LIDAR, for example, you get the direction and distance in the same datapoint without any extra calculation.
And the current generation of Tesla cars are operating pretty much at the limit of their power and will run out of resources unless Tesla manages to do some crazy optimisations on their detection algorithms/neural network training.
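I can't confirm the cycle counts, but the asymmetry being described is real: a lidar return already is a range measurement, while camera pixels only carry brightness and depth has to be inferred. A rough illustration with fake data:

```python
import numpy as np

# Lidar: every return already encodes geometry. Range is one norm
# per point, with no learned model in the loop.
scan = np.random.rand(100_000, 3) * 50.0   # fake scan: 100k (x, y, z) points
ranges = np.linalg.norm(scan, axis=1)      # done: distance per point

# Camera: a 1280x960 frame is ~1.2M brightness samples with no range
# attached. Getting per-pixel depth means pushing the whole frame
# through a neural network every frame, which is where the cycles go.
frame = np.random.rand(960, 1280)          # fake image: depth must be inferred
```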
As a Tesla customer I’m so used to being constantly gaslighted that it’s a pleasant surprise when they’re only removing features from new cars rather than my existing one.
Given that my Model 3 seems to think (visualized on the screen) that my garage door is a semi-trailer pointing towards me, I'm inclined to believe that they can't do all of this with cameras only.
I respectfully disagree.
I think their primary use case is parking. Tesla does a really good job with automatic parking, and it IS useful, especially for parallel parking.
But honestly, I have curbed my rims and hung my underbody front spoiler up on a curb, and the sensors did not help.
Really, there is an opportunity for cameras to protect the car from nearby curbs and to park in the same way.
But like you said, current Tesla camera placement doesn't have enough coverage, especially in front.
At a minimum, I think they should add a front camera if they're going to remove the ultrasonic sensors.
And realistically, they should solve the ultrasonic problem of curbs with the cameras.
Does anyone expect anything else from Tesla at this point?
Tesla Vision outperformed Tesla with Radar in the official tests.
Not true. Elon Musk has long maintained that cameras/vision is all that is needed [1]. After all, that's all a human has.
[1] https://techcrunch.com/2019/04/22/anyone-relying-on-lidar-is...
They have run out of sensors, and they don't want to pause production until they have produced more.
I genuinely don't know why anyone would buy into Tesla now.