
Tesla has a self-driving strategy other companies abandoned years ago

168 points | close04 | 7 years ago | arstechnica.com

212 comments

[+] Animats|7 years ago|reply
Waymo keeps plugging away. Each year, for the last few years, the number of miles between disconnects they report to the CA DMV has doubled. Three more doublings, and that number will be bigger than average miles between accidents for human drivers. Then they can ship a product.
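
To make the extrapolation concrete, here's a rough back-of-the-envelope sketch; both numbers are illustrative assumptions, not figures from the DMV reports:

    # Illustrative assumptions only, not official figures.
    miles_per_disengagement = 11_000    # assumed current rate (miles per disengagement)
    human_miles_per_accident = 88_000   # assumed human-driver baseline (miles per accident)

    projected = miles_per_disengagement * 2 ** 3   # "three more doublings"
    print(projected, projected >= human_miles_per_accident)   # 88000 True, under these assumptions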

Nobody else is even close.

This problem is slowly being solved by normal engineering practices. The "fake it til you make it" players are being left behind. Uber has been shown to be totally incompetent. Tesla really just has a good lane follower and a mediocre car follower. Apple has been trying to hand-wave by talking about "significant disconnects" vs. all the disconnects the DMV requires them to report.

The LIDAR industry is struggling, but there is progress. Quanergy seems to have been mostly hype.[1] Continental, the big European auto parts maker, bought Advanced Scientific Concepts, which makes and sells a good but expensive flash LIDAR used in DoD and space applications. They packaged it up for automotive use, and are waiting for the self driving industry to catch up. That technology uses exotic indium-gallium-arsenide sensor ICs, which are expensive in small quantities but would probably be affordable if they could sell a few million a year.

This looks like a problem that's being solved. Just not fast enough for startups used to quick payoffs in software.

[1] https://www.bloomberg.com/news/features/2018-08-13/how-a-bil...

[+] perl4ever|7 years ago|reply
"Three more doublings, and that number will be bigger than average miles between accidents for human drivers. Then they can ship a product."

I am skeptical of that, because I expect that even if disconnects are ultra-rare overall, and the testing is representative of normal usage, there will be enough people who frequently use the technology in ways far out on the tail of the distribution that, for them, disconnects will be much less rare. And that might create a PR blowup or worse.

I think what is worrying me may be something called "heteroscedasticity".

https://en.wikipedia.org/wiki/Heteroscedasticity
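
As a toy illustration of that worry (purely hypothetical rates, my own assumptions): even if the fleet-wide average looks fine, a subgroup that racks up most of its miles in hard conditions can see failures several times more often.

    # Purely hypothetical rates, just to illustrate the tail-user concern.
    MILES_PER_YEAR = 12_000
    EASY_RATE = 1 / 100_000   # assumed failures per mile in easy conditions
    HARD_RATE = 1 / 5_000     # assumed failures per mile in hard conditions

    def expected_failures_per_year(hard_fraction):
        easy_miles = MILES_PER_YEAR * (1 - hard_fraction)
        hard_miles = MILES_PER_YEAR * hard_fraction
        return easy_miles * EASY_RATE + hard_miles * HARD_RATE

    typical = expected_failures_per_year(0.02)  # typical driver: 2% hard-condition miles
    tail = expected_failures_per_year(0.50)     # tail driver: 50% hard-condition miles
    print(typical, tail, tail / typical)        # ~0.17, ~1.26: roughly 7-8x worse for the tail group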

[+] ufmace|7 years ago|reply
The last Ars article that I read on Waymo did imply that they had hit a metaphorical wall on their progress, and were keeping things looking good by sticking to lower-speed residential neighborhoods and avoiding situations they had trouble with, like making unprotected left turns onto major streets and passing slow or stopped vehicles on a busy road. Essentially things that require being able to understand that the other cars around are driven by people who tend to react to things, and will probably slow down if you make a turn into a lane that they would be passing you in.

https://arstechnica.com/cars/2019/02/googles-waymo-risks-rep...

https://arstechnica.com/cars/2018/12/we-finally-talked-to-an...

[+] gok|7 years ago|reply
Please stop fixating on those California disengagements. They mean approximately nothing. They are unregulated, voluntarily reported incidents. A low number could mean that safety drivers are acting irresponsibly, or that the vehicles are only being used on public roads in known-safe circumstances.

I strongly suspect that the quest for a low disengagement number is what got that woman in Arizona killed by Uber. These reports really should be sealed from the public. AV startups seem to be using their disengagement rate as a way to raise money, aided by journalists hungry to make sense of a highly secretive field who publish these figures completely out of context.

[+] KKKKkkkk1|7 years ago|reply
You should take Waymo's numbers with a healthy dose of skepticism. They are very selective about which disengagements they report. Here are the details from their 2017 letter to the DMV:

This report covers disengagements following the California DMV definition, which means “a deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.” Section 227.46 of Article 3.7 (Autonomous Vehicles) of Title 13, Division 1, Chapter 1, California Code of Regulations.

Waymo has developed a robust process to collect, analyze and evaluate disengages for this report. We set disengagement thresholds conservatively for our public road testing. The vast majority of disengagements are not related to safety. Our test drivers routinely transition into and out of autonomous mode many times throughout the day, and the self-driving vehicle’s computer hands over control to the driver in many situations that do not involve a failure of the autonomous technology and do not require an immediate takeover of control by the driver.

To help evaluate the safety significance of disengagements, Waymo employs a powerful simulator program. In Waymo’s simulation, our team can “replay” each incident and predict the behavior of our self-driving car if the driver had not taken control of it, as well as the behavior and positions of other road users in the vicinity (such as pedestrians, cyclists, and other vehicles). Our engineers use this data to refine and improve the software to ensure the self-driving car performs safely.

https://www.dmv.ca.gov/portal/wcm/connect/42aff875-7ab1-4115...

[+] mtw|7 years ago|reply
Waymo is ahead, but only as a taxi service in dense urban cities. Tesla, however, may one day deliver a car that self-drives 90% of the time. Many people, especially in North America, will only want the latter. In North America, a car is still seen as a freedom enabler: one that lets you go to the beach, go on a hiking trip, and hit a few off-road tracks in a park on weekends. None of these will be possible with a Waymo car, but they are still possible with a Tesla Model S/3/X.
[+] bunderbunder|7 years ago|reply
> Three more doublings, and that number will be bigger than average miles between accidents for human drivers. Then they can ship a product.

They can ship a product that can drive under the conditions they're testing under. Which would be appropriate for autonomous taxi service in certain municipalities, but presumably not for the general market.

Granted, that's been their business plan for a while now.

[+] scarmig|7 years ago|reply
Do you have hard numbers so we can establish a trendline?

My perhaps-false recollection was that Waymo disengagements had flatlined from 2017 to 2018.

[+] AtlasBarfed|7 years ago|reply
I spam this on every Waymo mention, so I apologize, but can they please automate highway driving first and then work on getting me through Taco Bell? I care way more about taking a nap between cities with auto-recharge than automating a 4-minute trip to a fast food joint.

If they really are doing this well with non-freeway driving, then that just tells me they could have released freeway driving about 4 years ago.

[+] wil421|7 years ago|reply
At least Tesla has an actual product you can buy? Where's Waymo's product? Will they sell to consumers or to the incumbent car makers?

I agree with your statement about Tesla's marketing of self driving. My Grand Cherokee isn't so far off their mark. It has lane assist, adaptive cruise control with full stop, and parallel and back-in parking. I'd bet Jeep could do summon and more stuff on the highway.

[+] martythemaniak|7 years ago|reply
There's a few things to keep in mind when discussing self-driving tech.

1) Self driving tech doesn't exist today. It simply does not. There's lots of people working on it and they are using different strategies, but it is not clear how long it will take, which strategy is technically superior, which strategy is more economical, etc etc. There's a lot of very strong opinions floating around (LIDAR is essential! No, vision and lots of data! No, 3D maps or bust, etc etc). It is important to keep in mind that we don't know the future, the people busy inventing it don't know how things will play out and outside observers know even less. The tech industry is littered with strong opinions which have aged terribly.

2) Musk's first principles aside, Tesla was never going to do LIDAR. They're in the business of shipping real cars to real people, and there was never any possibility of mixing that with LIDAR. If they had gone with LIDAR and started a research program like Uber, they would have started years after Google, with less data, far fewer resources, and absolutely no technical advantage. It is far, far more likely that this program would have bankrupted them than beat Google. In essence, whether Musk's first-principles reasoning is real or a sales pitch doesn't matter. Tesla's choice was their current approach, or nothing at all.

[+] peeters|7 years ago|reply
I've always found that half the issue is with the name: Autopilot. On an airplane, autopilot means you can take your focus off of both the physical and mental act of moving the plane through the air, and if it needs you to intercede there is pretty much no emergency in which you wouldn't have a few seconds to react.

In a car, Tesla's Autopilot simply doesn't give you that. If you are reading and your autopilot exits, you could be dead before your focus can return to the road. Planes fly in wide open spaces far from hazards. Cars drive in busy, tight spaces where a fraction of a second lapse in control can be fatal.

I get that Tesla's implementation is analogous to what you find on a jet. But the environments make them very different. So start by dropping the name. It's not autopilot, it's hands-free driving.

[+] jniedrauer|7 years ago|reply
I don't think this accurately considers how the term "autopilot" is actually used in aviation. Autopilot can mean anything from "hold a gyro estimated heading but not pitch or speed" to "maintain airspeed and follow the localizer down a glide slope while cross checking GPS and ILS."

> if it needs you to intercede there is pretty much no emergency in which you wouldn't have a few seconds to react.

Traffic avoidance is certainly one. Airspeed dropping while the autopilot tries to maintain altitude is another. This can result in a stall and complete loss of control under the wrong circumstances.

Autopilot does not mean "read a book and let the plane fly itself." You are always cross checking instruments, visually scanning for traffic, running checklists, rehearsing your next steps, etc.

[+] billhathaway|7 years ago|reply
I wish Tesla had called it something like driver assist. I use it for 90% of my commute and it is super helpful, but I would never have picked "auto-pilot" as its name.
[+] _ea1k|7 years ago|reply
Autopilot in planes can be used, and often is used, in cases of VFR flight in which see-and-avoid is a requirement. Constant surveillance of the aircraft's surroundings is important. When it isn't done, midairs happen.
[+] mikeash|7 years ago|reply
A system with lane keeping and traffic-aware cruise control corresponds pretty well to a simple two or three-axis airplane autopilot that just holds heading and altitude.
[+] zaroth|7 years ago|reply
I’ve read so many of these articles obviously written by someone who has no hands-on experience with the Tesla Autopilot and what it actually does for the driving experience.

Maybe Tesla will get to a point where my TM3 will drive me to work while I nap, or use my phone. One thing that’s pretty amazing is that if they can get to that point, it will come as a free automatic software update or an available for purchase hardware upgrade on my current car.

This month my car will be getting 5% faster acceleration from an OTA update - how cool is that?

But right now, the reality is that I can engage AutoPilot on the highway and it immediately and dramatically changes the driving experience. Instead of focusing on steering the car, I am focused on situational awareness. I am not just looking down my lane, I am looking at the cars around me, who is passing who, what might be coming up around that bend, that guy on his phone next to me, etc.

I can look at other drivers and assess whether they are paying attention, and avoid them if needed.

Because the car is actually lane keeping. Not what everyone else calls lane keeping, which (surprise) is a total lie, but actively steering the car around curves and centering it in the lane, probably more precisely than I would if I were steering manually.

I strongly believe that even if AutoPilot never truly advanced much beyond its current capability, then as the current functionality becomes more widespread, we will wonder 15 years from now how people operated their vehicles without this level of assistance. How could you be properly aware of your surroundings if you had to be so preoccupied with minor steering inputs?

If you use your phone and watch a movie or put on makeup while driving, you are breaking the law and endangering yourself and those around you.

Every time a new technology comes to cars, people say it will distract, hypnotize, or lull drivers into complacency (see wipers, radios, automatic transmissions, the original cruise control), and anyone who has actually driven with AutoPilot knows this feature is no different.

As a Tesla owner I enjoy and appreciate Tesla’s real-world approach to self-driving and it makes my life better and my drive safer. Thank you Tesla.

[+] ypzhang2|7 years ago|reply
The issue is that Autopilot is a feature geared more towards convenience.

You are doing all of the driver attention work, but someone who activates Autopilot isn't required to. I think a lot of the argument against Tesla is that Autopilot isn't doing enough in that arena.

GM's Super Cruise is a lot more feature-full with regards to driver attentiveness. It might not be usable in as many places, but it's definitely more well-rounded in terms of forcing driver attentiveness.

[+] devy|7 years ago|reply
The top editor-promoted comment[1] on the article sums it up very well:

  The attention problem is well known in engineering. 
  It is very hard to get a human to concentrate on something 
  that will turn up good more than 99% of the time, 
  even when there's serious or fatal consequences of failure. 
  Trains are the classic example - tracks are almost always clear,
  signals are almost always correct which means you have to
  devise all sorts of systems to keep the driver alert.
[1]: https://arstechnica.com/cars/2019/03/teslas-self-driving-str...
[+] CPLX|7 years ago|reply
It's funny how much people have danced around all this, including this article, but what Musk has done with his statements on self-driving capabilities is called lying.
[+] ufmace|7 years ago|reply
Lying carries a strong moral component in English that doesn't seem to be evident here. Most people would say that a proper "lie" involves intentionally trying to convince somebody of something that you know is not true, usually for personal gain. If Musk honestly believes that he can do it, then I guess it isn't technically a lie, even if most people who know the industry say he would have to be delusional to believe it.

It does feel a bit pedantic to say that, though. If Musk is sometimes delusionally optimistic instead of intentionally deceptive, how much does it matter to us? I guess the level of delusion does vary some - Tesla might meet vehicle production targets, SpaceX might meet their intended date for the first crewed mission, but a Tesla car with current production hardware driving coast to coast while the driver reads a book sometime this year has no chance whatsoever of happening.

Also worth considering that the Silicon Valley VC universe does tend to reward people who could be described as being delusionally optimistic, and thus encourage them to continue to think in that way.

[+] xeromal|7 years ago|reply
If someone has a delusional idea, is it still a lie? I don't think Musk 'lies' to earn more money. I think he is delusional about how long certain things take and, at worst, is trying to keep his companies afloat so he can continue with his ideas. I'd say his sins are less than those of the CEO of Enron, for instance. It's hard to blame the guy when he successfully built a rocket company.
[+] kevin_thibedeau|7 years ago|reply
> "We already have full self-driving capability on highways," Musk said during a January earnings call.

Except for obstacles that happen to coincide with whitelisted locations their sensors can't handle. It's only a matter of time before someone dies because of that hack.

[+] cbames89|7 years ago|reply
This is a (perhaps short-term) failure of first principles reasoning. Elon's known to favor thinking about things from first principles, and there's no theoretical reason that vision can't work. However, technical limitations might take a while to overcome.
[+] czr|7 years ago|reply
Indeed. There's also a slightly less charitable interpretation: Tesla couldn't ship every car with LIDAR for cost reasons, so they're strongly incentivized to claim they don't need it.

That said, I don't think Tesla is wrong. If Tesla can deploy MobilEye-level automatic visual mapping (see https://youtu.be/GQ15HWCw_Ic?t=1381), this could obviate the need for LIDAR-based localization and dramatically improve their perception systems (by providing very good prior information about all static obstacles, such as lane dividers).

Dynamic obstacle detection will still be worse than it would be with a vision+LIDAR approach, but not that much worse. Having superhuman ability along two axes (map-based priors and alertness / reaction time) is likely sufficient to drive better than humans, even if several remaining axes (dynamic obstacle detection and prediction) are worse.

Additionally, Tesla is not (AFAIK) using any structure-from-motion (https://www.youtube.com/watch?v=KT2KsN7yKo0) or stereo-vision (https://www.youtube.com/watch?v=SskSDjUG8ZY) techniques, but there's a chance that these could also improve perception, especially in tough cases where a single frame is not enough for good detection (e.g. white semi in sunlight).

LIDAR is not being used in the (compared to Tesla) small-scale AV systems of Waymo, Cruise, Aurora et al. because it's essential, but rather because it's convenient, and because the companies producing those systems want to give their AVs every advantage at any cost. I fully expect Tesla (and MobilEye) to achieve superhuman self-driving without LIDAR, but it will take longer (i.e. not by EOY).

[+] RivieraKid|7 years ago|reply
People should strive for correct reasoning. First principles reasoning, as I interpret that term, is just an approach that helps to achieve correctness by overcoming ingrained assumptions.

If Elon really thought "in theory, camera-only full self driving is possible, therefore I should invest resources in that approach", then he's dumber than I thought. He's done it because LIDAR was not an option, and the claim that FSD is coming soon was an intentional lie.

[+] ProblemFactory|7 years ago|reply
We figured out the first principles of fusion power in the 1930s, but we are still nowhere close to practical fusion power generation. The engineering difficulties are just too great, and simpler alternatives exist which get much more attention, funding, and optimisation.

Similar might happen to vision-only self-driving. It might take a decade or two longer to develop compared to LIDAR-based approaches, and meanwhile LIDAR is only going to get cheaper.

[+] ForHackernews|7 years ago|reply
Vision alone fails pretty frequently even with a human brain and 700 million years of evolution behind it. Count me as a skeptic that cameras + computer will produce a safe self-driving system.
[+] vkou|7 years ago|reply
> there's no theoretical reason that vision can't work

Human brains, which are general intelligences, exist, and are collections of atoms.

I'll happily, for $500,000/copy, upfront, promise to sell you artificial general intelligences. I swear I'll get them built in the next two years.

There is, after all, no first principle reason for why my promise isn't worth the paper it's printed on. Human brains are collections of atoms, so we should be able to, out of atoms, build artificial general intelligences.

[+] haberman|7 years ago|reply
I got a Tesla last year. I love it. But it's been painfully obvious to me that Autopilot is nowhere close to something that will let you take a nap.

Often when I'm stopped at a light, cars that are standing completely still will appear to be constantly moving forwards and backwards on the display. My best theory so far is that the Tesla's spatial model is getting thrown off by the other car's turn signal. This does not inspire confidence.

In general, I find the radar-enhanced cruise control very reliable (so nice in bad traffic), but autosteer is flaky at best.

[+] FireBeyond|7 years ago|reply
> In general, I find the radar-enhanced cruise control very reliable (so nice in bad traffic)

Which is great but hardly unique to Tesla. Every major manufacturer out there offers adaptive cruise control. My car even recognizes the difference between in traffic stop-and-go/rush hour, and "queue" mode (exiting parking lots after events, etc).

[+] tbabb|7 years ago|reply
IMO Tesla does not have the hardware on the Model 3 to do full self-driving.

I believe that stereopsis (multiple cameras using parallax to solve for per-pixel depth) is necessary to get a practical, well-functioning self driving system working. LIDAR is just too expensive and not good enough, but stereopsis is extremely flexible and can have extremely high angular resolution.

Combine naive stereopsis with temporally-coherent sensor fusion (e.g. a well-designed Kalman filter), and I think you could have very robust ranging. Humans are already very good at this with two narrowly spaced eyes (stereopsis to 1/4 mile range is not unreasonable for a person) -- but a car is not limited to a 1.5 inch stereo baseline; it could have stereo cameras on opposite sides of the windshield. That would hugely increase the depth sensitivity, even at moderate resolution -- parallax can be detected well below the Nyquist limit (since Nyquist cuts off frequency, but does not destroy phase).
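
To get a feel for why the baseline matters, here's a minimal sketch of the standard pinhole stereo geometry; the focal length, baselines, and disparity error are illustrative assumptions, not anyone's actual camera parameters:

    # Pinhole stereo: depth Z = f * B / d (f = focal length in pixels,
    # B = baseline in meters, d = disparity in pixels). For a fixed
    # disparity error, the depth error grows roughly as Z^2 / (f * B),
    # so a wider baseline buys range accuracy directly.

    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        return focal_px * baseline_m / disparity_px

    def depth_error(focal_px, baseline_m, depth_m, disparity_err_px=0.25):
        # First-order error from sub-pixel disparity uncertainty.
        return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

    f = 1400.0                       # assumed focal length, in pixels
    for baseline_m in (0.038, 1.2):  # ~1.5 inch eye-like spacing vs. windshield-width cameras
        err = depth_error(f, baseline_m, depth_m=100.0)
        print(f"baseline {baseline_m} m: about +/-{err:.1f} m error at 100 m")
        # with these assumptions: ~47 m error at 100 m for the narrow baseline, ~1.5 m for the wide one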

Tesla is totally failing at even the basic level of environment awareness (cf. cars that have been driving into exit dividers), which is what I consider to be the easy part of self-driving (the hard part is getting the machine to participate in a nonverbal social environment, which is what the roadway is). Rumor has it that Teslas can't detect obstacles far enough ahead to avoid them at more than 30 mph -- absolutely abysmal, if true.

If it were me, I would put cameras on the Tesla windshield in this pattern:

    xXoO      OoXx
Where x and o are long-range and wide-field cameras, and caps and lowercase are dynamically exposure-adjusted to capture both brights and darks. Each pair has the largest possible parallax baseline. And I would do a big, fat sensor fusion on all eight of them to get a high-res, depth-augmented HDR map of the surroundings. No fancy, expensive sensors. Just a large number of cheap cameras and sophisticated software. Maybe do some tricks with cycling 'attention' or multi-res hierarchies to keep the computational load down and realtime.

Tesla has only three cameras with different FOVs and a very narrow baseline-- I doubt they are doing stereopsis. And I don't think they can do the job without it.

[+] jaimex2|7 years ago|reply
I'm confused.

The article seems to extrapolate that Tesla has failed at autonomous driving because it removed some lines from its Autopilot page. It's still very much in progress, and progress is routinely confirmed by the company. Musk is known for blowing timelines, but things do get delivered.

When it talks about an "old approach" I'm again confused. No one else crowd-sources driving data from a real-world fleet. They have a unique unsupervised learning data source from shadowing drivers.

As for attentiveness: I don't see how drivers not paying attention is any different from mobile phone use. Currently Autopilot is very clear, at every opportunity, about telling you it's an assistant. Saying it will kill people is like blaming phones for drivers texting while driving and getting killed. These people die because they are breaking the law and are not in control of their vehicle.

[+] g6nhe9twPd66|7 years ago|reply
> Currently Autopilot is very clear, at every opportunity, about telling you it's an assistant.

Yep, which is why this article is just clickbait for Tesla haters. Every Tesla owner is fully aware that it's not self-driving in the sense that you can take a nap.

[+] wurst_case|7 years ago|reply
I wonder, though, if this has more to do with how much they are actually investing in AI vs. electric car tech. Surely Tesla is an EV company first and an AI company second. I know they like to brag about their Autopilot, but it seems to me that having a lower-cost car is more important than having an expensive LIDAR system on board.
[+] Skunkleton|7 years ago|reply
Most car companies make lots of money on add-on features. The base model can't be super profitable, but the base model with the self-driving package probably makes some good money. Or it would, if they didn't have to keep doing recalls to upgrade the hardware.
[+] Shivetya|7 years ago|reply
TM3 owner, car has EAP. I have the option to buy FSD for two thousand but haven't jumped. I not only don't believe in it, I don't have a need for it. Now, it might be useful down the road for resale.

That being said, two thoughts.

First, if they want me to buy it, then demo it in passive mode in my car. That is, there is space where the current speed limit is shown; use that space, and the area below it, to show what signs and signals it has seen recently, in order of importance. Currently it does not see speed limit signs, and if that is wholly FSD territory then Tesla is overcharging compared to other systems.

Second, even at a standstill the cars around me jump on the display, and I am not sure it sees stationary cars when I am driving. The best example I have is a two-lane road through a subdivision I usually take: the outer lane is a long turning lane, but people tend to stop there to let kids out for the water park. I cannot recall my car ever showing a car when someone is loading or unloading kids, but it does see cars moving in that lane when I overtake them. So is it just going by too fast for the stopped car in the next lane over to register? It knows it's a lane. I am not sure, but I will wait to see how it develops.

[+] abbbacccus|7 years ago|reply
I am still trying to figure out what started and sustained the enormous self-driving car hype of the last few years. I understand why people who don't know much about technology would buy into it, but I don't understand why so many relatively tech-savvy people have bought into it and why huge amounts of money have been invested into it. It should be obvious - and should have been obvious when the hype started, as well - that to create fully self-driving cars that can operate as well as a competent human driver in a full spectrum of real road situations would require solving extremely difficult technical problems that are nowhere near solved and that cannot necessarily be solved any time soon simply by throwing money at them. So what explains the hype and the investment money? Out of the people responsible for the hype, what fraction were/are merely deluded and what fraction were/are lying?
[+] justtopost|7 years ago|reply
We desperately want it to be true, so we ignore the truth long enough to try earnestly to make it so. It's the brave and ignorant charge of the new generation. Which begs the question: is it enough to mean well and act in earnest, or is ethics reserved for those who have the luxury of self-reflection?
[+] syntaxing|7 years ago|reply
Not having LiDAR for anything over level 3 self-driving capabilities seems like a very bad idea... Computer vision right now just does not have the spatial awareness that you need for self-driving. I wish Tesla would work on something similar to the Kinect v3 but for long range (above 20 m).
[+] ableal|7 years ago|reply
"Self-driving cars also benefit from lidar sensors, and the best ones cost thousands—if not tens of thousands—of dollars each. That's too expensive for an upgrade to a customer-owned vehicle. But the economics are more viable for a driverless taxi service, since the self-driving system replaces an expensive human taxi driver."

That's probably the crucial point. For now, lidar is needed for safe operation, and too expensive for mass deployment in private cars.

[+] jayess|7 years ago|reply
If self-driving can navigate US streets fully autonomously, even in bad weather, that will be impressive. Now transplant that to just about anywhere else in the world, and it will be impossible. Mexico City, Rome, Buenos Aires. Hah, no chance.
[+] agumonkey|7 years ago|reply
Are there efforts to invert the problem? Instead of fully independent vehicles, have a bit of a road signaling system (a reincarnation of the 50s US embedded radio track).