top | item 45145342


formercoder|5 months ago

Humans drive without LIDAR. Why can’t robots?


cannonpr|5 months ago

Because human vision has very little in common with camera vision and is a far more advanced sensor, on a far more advanced platform (ability to scan and pivot etc), with a lot more compute available to it.

torginus|5 months ago

I don't think it's a sensors issue - if I gave you a panoramic feed of what a Tesla sees on a series of screens, I'm pretty sure you'd be able to learn to drive it (well).

lstodd|5 months ago

yeah, try matching a human eye on dynamic range and then on angular speed and then on refocus. okay forget that.

try matching a cat's eye on those metrics. and it is much simpler than the human one.

insane_dreamer|5 months ago

The human sensor (eye) isn't more advanced in its ability to capture data -- and in fact cameras can have a wider range of frequencies.

But the human brain can process the semantics of what the eye sees much better than current computers can process the semantics of the camera data. The camera may be able to see more than the eye, but unless it understands what it sees, it'll be inferior.

Thus a Tesla spontaneously activating its windshield wipers to "remove something obstructing the view" (it happens to my Tesla 3 as well), whereas the human brain knows there's no need to do that.

Same for Tesla braking hard when it encountered an island in the road between lanes without clear road markings, whereas the human driver (me) could easily determine what it was and navigate around it.

phire|5 months ago

Why tie your hands behind your back?

LIDAR-based self-driving cars will always massively exceed the safety and performance of vision-only self-driving cars.

Current Tesla cameras+computer vision is nowhere near as good as humans. But LIDAR-based self-driving cars already have way better situational awareness in many scenarios. They are way closer to actually delivering.

kimixa|5 months ago

And what driver wouldn't want extra senses, if they could actually meaningfully be used? The goal is to drive well on public roads, not some "Hands Tied Behind My Back" competition.

tliltocatl|5 months ago

Because any active sensor is going to jam other such sensors once there are too many of them on the road. This is sad but true.

Sharlin|5 months ago

And birds fly without radar. Still, we equip planes with it.

apparent|5 months ago

The human processing unit understands semantics much better than the Tesla's processing unit. This helps avoid what humans would consider stupid mistakes, but which might be very tricky for Teslas to reliably avoid.

randerson|5 months ago

Even if they could: Why settle for a car that is only as good as a human when the competitors are making cars that are better than a human?

dotancohen|5 months ago

Cost, weight, and reliability. The best part is no part.

No part costs less; it also doesn't break, doesn't need to be installed, doesn't need to be stocked on every dealership's shelf, and no supplier can hold up production over it. It doesn't add wires (complexity and size) to the wiring harness, or clog up the CAN bus message queue (LIDAR is a lot of data). It also doesn't need another dedicated place engineered for it, further constraining other systems and crash safety. Not to mention the electricity used, a premium resource in an electric vehicle of limited range.

That's all off the top of my head. I'm sure there's even better reasons out there.
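A back-of-envelope calculation makes the bandwidth point concrete. The sensor figures below are illustrative assumptions (a generic 64-channel spinning LIDAR, not any specific product), but the classic high-speed CAN limit of 1 Mbit/s is fixed by the standard:

```python
# Back-of-envelope: LIDAR point-cloud bandwidth vs. classic CAN bus.
# All LIDAR figures below are illustrative assumptions.

channels = 64                            # assumed 64 laser channels
rotation_hz = 10                         # assumed 10 rotations per second
points_per_channel_per_rotation = 2000   # assumed horizontal resolution
bytes_per_point = 16                     # x, y, z, intensity as 4 floats (assumption)

points_per_sec = channels * rotation_hz * points_per_channel_per_rotation
lidar_bytes_per_sec = points_per_sec * bytes_per_point

can_bits_per_sec = 1_000_000             # classic high-speed CAN tops out at 1 Mbit/s
can_bytes_per_sec = can_bits_per_sec / 8

print(f"LIDAR:       {lidar_bytes_per_sec / 1e6:.1f} MB/s")   # ~20.5 MB/s
print(f"Classic CAN: {can_bytes_per_sec / 1e6:.3f} MB/s")     # 0.125 MB/s
print(f"Ratio:       {lidar_bytes_per_sec / can_bytes_per_sec:.0f}x")
```

Even under these modest assumptions the point cloud is two orders of magnitude beyond what classic CAN can carry (real units stream over dedicated links such as automotive Ethernet), which is the "clog up the CAN bus" point above.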

systemswizard|5 months ago

Because our eyes work better than the cheap cameras Tesla uses?

lstodd|5 months ago

problem is, expensive cameras that Tesla doesn't use don't work either.

dreamcompiler|5 months ago

Chimpanzees have binocular color vision with similar acuity to humans. Yet we don't let them drive taxis. Why?

ikekkdcjkfke|5 months ago

Chimpanzees are better than humans given a reward structure they understand. The next battlefield evolution is chimpanzees hooked up to intravenous cocaine modules, running around with .50 cals

ndsipa_pomu|5 months ago

There are laws about mistreating animals. Driving a taxi would surely count as inhumane torture.

insane_dreamer|5 months ago

they can't understand how to react to what they see the way humans do

it has to do with the processing of information and decision-making, not data capture

matthewdgreen|5 months ago

I drove into the setting sun the other day and needed to shift the window shade and move my head carefully to avoid having the sun directly in my field of vision. I also had to run the wipers to clean off a thin film of dust that made my windshield difficult to see through. And then I still drove slowly and moved my head a bit to make sure I could see every obstacle. My Tesla doesn’t necessarily have the means to do all of these things for each of its cameras. Maybe they’ll figure that out.

zeknife|5 months ago

I wouldn't trust a human to drive a car if they had perfect vision but were otherwise deaf, had no proprioception and were unable to walk out of their car to observe and interact with the world.

dotancohen|5 months ago

And yet deaf people regularly drive cars, as do blind-in-one-eye people, and I've never seen somebody leave their vehicle during active driving.

Waterluvian|5 months ago

They can. One day. But nobody can just will it to be today.

rcpt|5 months ago

We crash a lot.

insane_dreamer|5 months ago

that's (usually) because our reflexes are slow (compared to a computer), or we are distracted by other things (talking, phone, tiredness, sights, etc. etc.), not because we misinterpret what we see

nkrisc|5 months ago

Well these robots can’t.