The Level 2 driving that Tesla is pushing seems like a worst-case scenario to me. Requiring the driver to be awake and alert while not requiring them to actually do anything for long stretches of time is a recipe for disaster.
Neither the driver nor the car manufacturer will have clear responsibility when there is an accident. The driver will blame the system for failing and the manufacturer will blame the driver for not paying sufficient attention. It's lose-lose for everyone: the company, the drivers, the insurance companies, and other people on the road.
Tesla's system doesn't have enough sensors. Musk forced his engineers to try to do this almost entirely with vision processing, and that was a terrible decision. Vision processing isn't that good yet. Everybody else uses LIDAR.
I've been saying for years that the right approach was to take the technology from Advanced Scientific Concepts' flash LIDAR and get the cost down. I first saw that demonstrated in 2004 on an optical bench in Santa Monica. It became an expensive product, mostly sold to DoD. It's expensive because the units require custom InGaAs sensor chips and aren't made in quantity. SpaceX uses one of their LIDAR units to dock the Dragon spacecraft with the space station.
Last year, Continental, the big century-old German auto parts maker, bought the technology from Advanced Scientific Concepts and started getting the cost down.[1] Volume production in 2020. Interim LIDAR products are already shipping in volume. Continental is quietly making all the parts needed for self-driving. LIDAR. Radar. Computers. Actuators. Cameras. Software for sensor integration into an "environment model". They design and make all the parts needed, and provide some of the system integration.
Apple and Google were trying to avoid becoming mere low-margin Tier I auto parts suppliers. Continental, though, is quite successful as a Tier I auto parts supplier. Revenue of €40 billion in 2016. Earnings about €2.8 billion. Dividend of €850 million. They can make money on low-margin parts.
Continental may end up quietly ruling automatic driving.
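To make "sensor integration into an environment model" concrete: at its simplest, fusing two noisy range sensors is inverse-variance weighting. A toy sketch of that textbook Gaussian-fusion step (not Continental's actual software):

```python
def fuse(z1, var1, z2, var2):
    """Inverse-variance (Kalman-style) fusion of two independent,
    noisy measurements of the same quantity."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    estimate = (w1 * z1 + w2 * z2) / (w1 + w2)
    variance = 1.0 / (w1 + w2)  # fused estimate is tighter than either input
    return estimate, variance

# Radar says 20.5 m (sigma 0.5 m); LIDAR says 20.1 m (sigma 0.1 m).
est, var = fuse(20.5, 0.5**2, 20.1, 0.1**2)
print(est, var)  # ~20.12 m, variance ~0.0096
```

A real environment model tracks many objects over time with full Kalman or particle filters, but each update reduces to weighted combinations like this one.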
It depends on what you're optimizing for. Others using LIDAR are optimizing for speed to market, while potentially sacrificing the ability to solve the problem fully. Musk's argument is that we know for certain the entire road system can be navigated by visual cues, because that's how humans do it. We do not know for certain that this is possible with LIDAR.
You seem to be assuming that the limiting factor is cost and parts. To me it appears the limiting factor is software. AIUI no one has the software to do level 4 autonomous driving, at any price. So why would a parts supplier end up "ruling" automatic driving?
I've seen you plug flash LIDAR (especially ASC's unit) several times here, but is anyone actually using it on SDVs? I've seen things that could have been flash LIDAR on test cars, but never as the primary sensor.
I worked on a team that evaluated the ASC unit a few years ago, but they found it unusable due to bloom issues. Has that changed?
>Tesla's system doesn't have enough sensors. Musk forced his engineers to try to do this almost entirely with vision processing, and that was a terrible decision. Vision processing isn't that good yet. Everybody else uses LIDAR.
I think I agree with this, but is LIDAR expected to work in the rain?
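For context on what rain disrupts: LIDAR is a time-of-flight measurement, and raindrops scatter the pulse, producing spurious early returns. A minimal sketch of the ranging math itself (generic physics, not any vendor's implementation):

```python
C = 299_792_458.0  # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Range from a pulse's round-trip time: the light travels out and back."""
    return C * round_trip_s / 2.0

print(tof_range_m(200e-9))  # a 200 ns round trip is ~30 m of range
```

A raindrop 3 m away that reflects the pulse shows up as a ~20 ns return, indistinguishable in timing from a small obstacle, so wet-weather performance comes down to filtering returns rather than optics.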
You forget the (smaller, repeatedly broken up) American auto counterpart of Continental: Delphi. Google them and you'll find they're doing pretty well of late.
The only industry to have produced truly driverless public transportation systems is the rail industry. Not aeronautics. Rail systems happen to be my business, and what I read here makes me very worried.
I don't think the majority understands what safety means in mass transportation. It's not about running miles and miles without accidents and basically saying "see?". It's about demonstrating /by design/ that the /complete/ system over its /complete/ lifetime will not kill anyone. In terms of probability of failure, it translates into demonstrated hazard rates of less than 1E-9, /including the control systems/. This takes very special techniques, and if it could have been done using only vehicle sensors, it would have been adopted by us long ago. I am also sorry to report that doubling cameras and sensor fusion will not get you an acceptable safety level. We've tried that too, rookies.
Is it "fair", to use Elon's argument? After all, isn't additional safety enough compared to the existing situation? Ah, but we have been there too! For driver assistance it is indeed better. Similar systems were deployed during the second half of the 20th century (e.g. KVB, ASFA, etc.). But the limit is clear: they only /improve/ the driver's failure rate. They do not substitute for the driver. If you substitute, you have to do much, much, much better. Nobody will ride a driverless vehicle given the explanation that it is, you know, "already an improvement compared to a typical driver". Is it fair? Maybe not, but that's the whole point of entrusting lives to a machine.
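To put the 1E-9 hazard-rate figure in perspective, here is a back-of-the-envelope zero-failure demonstration test (the simple exponential model, my illustration, not the full design-assurance techniques the parent alludes to):

```python
import math

def hours_to_demonstrate(hazard_rate: float, confidence: float = 0.95) -> float:
    """Failure-free operating hours needed to claim, at the given confidence,
    that the true hazard rate is below `hazard_rate`.
    Zero-failure test: exp(-rate * T) <= 1 - confidence  =>  T = -ln(1 - c) / rate."""
    return -math.log(1.0 - confidence) / hazard_rate

hours = hours_to_demonstrate(1e-9)
print(hours)  # ~3.0e9 hours: over 340,000 years of continuous operation
```

Which is why rail safety cases argue the rate by design and analysis rather than by accumulated mileage: no fleet can drive its way to 1E-9.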
What befuddles me is that in all these discussions about self-driving cars seemingly no one refers to the massive body of knowledge in this area that comes from the aviation world.
I've posted variants of this same comment several times and I'm starting to feel like a broken record.
Look at studies of efforts to make planes safer by removing the human element. While efforts like autopilot have made things safer, there comes a point where more automation reduces safety: pilots are no longer alert, don't trust the instruments, or can't fully manually override the automation.
Call it the uncanny valley of automation safety.
Bridging that last few percent for true automation (ie where vehicles aren't designed to have drivers or pilots at all) is going to be _incredibly_ difficult, to the point where I'm not convinced it won't require something resembling a general AI.
All of this is why I think driverless cars are going to take much longer than many expect.
There's a big difference: commercial pilots are highly trained, even-tempered, and take their job seriously. Most drivers are lazy, distracted, and apt to do something stupid in an emergency. It's very hard to make something safer than a commercial pilot. It's much easier to make something safer than a typical driver.
I am not disputing your assessment, but please don't discount liability. Planes can pretty much fly themselves today - there are no significant technology issues with the idea of "taxi away, take off, fly to destination, land, taxi to gate". All of this happens in what is perhaps the most regulated traffic environment on the planet.
The issue is with creating the code that deals with "oh shit" scenarios. Whilst it is probably possible, and maybe even feasible, to write code to cover every possible failure scenario, who is going to be left holding the can when it fails (all systems have a non-zero probability of failure)?
Who will be held responsible? The outsourced company that coded the avionics/flight control software? The airplane manufacturer? The airline company? The poor fucker that wrote the failing logic tree that was supposed to deal with that specific failure scenario, but was forced to work overtime the 47th day in a row when that particular code was cut?
It is a liability nightmare, and when you add up the cost of creating a software system that must never fail, the increased insurance premiums, the PR/marketing work to convince the unwashed masses that this is actually safer, and the whole rest of the circus required to make this a reality, you will find that pilot costs are not all that bad. Especially since pilots have significant downward pressure on real earnings these days anyway.
This is a very important topic that I am surprised does not receive enough coverage. Thank you for bringing it up.
It will be particularly interesting if accident blame is placed on the 'dumb cars', and insurance companies then do a 180 and charge MORE for 'dumb cars' operated by humans. Once they put this information in their pricing models, I assume it's 'stuck' in there until the next major NTSB report is published.
As complacency sets in over the following months and years, the accident rates will likely swing from "dumb" human-operated machines back to "Level 4 Highly Intelligent Teslas/Ubers/Argo AIs", and that market might get a real shock when the pendulum comes back their way!
I agree, the devil is going to be in the almost infinite edge cases: the visual negotiation that goes on between drivers at box junctions, dealing with bad or aggressive drivers who ignore right of way or tailgate, ethical decisions in all the "Kobayashi Maru" no-win situations (extreme weather, black ice, highway pile-ups, mechanical failures).
An advanced AI may well be able to identify whether an object coming towards the windscreen is a bird, bat, leaf or a rock, but what will its intuition be about how much of a problem it is likely to be? Should it swerve to avoid a raccoon and risk whiplash for passengers? Should it aim to avoid large insects if the owner is vegan?
Also, people are very used to mild lawbreaking. We expect a cab driver to double park and let us out the car if there are no available parking spaces, but would an AI be authorised to bend the law or would it have to find the nearest parking space, which may be 5 blocks away and could be taken by the time it gets there?
I suspect we will get autonomous drone-like flying cars long before we get full autonomy in city centres or rural areas, because flying through a mostly obstacle-free space with an ability to avoid collisions on 3 axes seems much more reachable.
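The swerve-or-not question above is, mechanically, an expected-cost decision. The hard part isn't the arithmetic; it's that someone has to write down the probabilities and costs. All the numbers below are made up for illustration:

```python
def best_action(p_harm_if_stay, cost_collision, p_injury_if_swerve, cost_whiplash):
    """Choose the action with the lower expected cost."""
    stay = p_harm_if_stay * cost_collision
    swerve = p_injury_if_swerve * cost_whiplash
    return "swerve" if swerve < stay else "stay"

# Small animal: hitting it is bad but cheap; a hard swerve risks whiplash.
print(best_action(0.9, 1.0, 0.2, 10.0))   # -> "stay"
# A rock that might come through the windscreen: swerve.
print(best_action(0.9, 100.0, 0.2, 10.0)) # -> "swerve"
```

The AI's "intuition" is exactly these estimated probabilities and costs, and there is no agreed-upon way to set them.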
Thankfully this isn't as big of an issue with driving on the ground. Airplanes don't have sensors that give them the same precision as a car's wheel rotation or proximity to nearby objects.
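For what "precision from wheel rotation" means concretely: a wheel encoder gives dead-reckoned distance directly. With a hypothetical 4096-tick encoder on a 0.3 m wheel, one tick resolves under half a millimetre of travel:

```python
import math

def wheel_distance_m(encoder_ticks: int, ticks_per_rev: int,
                     wheel_radius_m: float) -> float:
    """Dead-reckoned distance: revolutions times wheel circumference."""
    return (encoder_ticks / ticks_per_rev) * 2.0 * math.pi * wheel_radius_m

print(wheel_distance_m(10_000, 4096, 0.3))  # ~4.60 m
print(wheel_distance_m(1, 4096, 0.3))       # ~0.00046 m per tick
```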
Biggest news buried at the end: it says that several engineers, including the head of Autopilot, have quit since October 2016, when Tesla started selling "fully autonomous driving" hardware upgrade packages. The engineers don't agree that the hardware is capable of supporting this and say it was ultimately a marketing decision.
If I were one of those engineers and didn't believe in the claims being made, I'd be quite worried about being held personally liable if the company gets sued over an accidental death. The last thing I'd want is my bug being responsible for someone dying.
A lot of people move jobs, especially in LA. Is there a first-hand link to one of these engineers criticizing the systems? (Not trolling, I just cannot get past the WSJ paywall.)
I just ordered a Model S with Autopilot, and as I've been reading the comments on the various Tesla forums, I'm not sure I'm ever going to use it. Some of the stories are honestly terrifying (sudden deceleration on the highway, swerving into other lanes, etc).
> In May 2015, Eric Meadows, then a Tesla engineer, engaged Autopilot on a drive in a Model S from San Francisco to Los Angeles. Cruising along Highway 1, the car jerked left toward oncoming traffic. He yelped and steered back on course, according to his account and a video of the incident.
> Mr. Meadows said he was later dismissed for what he was told were “performance issues.” Tesla declined to comment on Mr. Meadows but noted that the incident happened months before the release of the technology, giving the company plenty of time to work out problems that had been discovered during test drives.
We used to treat our test pilots with the highest regard. How low have we fallen?
This isn't exactly an isolated incident; YouTube has lots of videos of Autopilot steering wildly off course. The biggest problem is that Tesla allows turning on Autopilot on roads that are not highways and that feature significant turns and hills obscuring "perfect lane vision", and the system is not prepared to handle that at all.
"In recent months, the team has lost at least 10 engineers and four top managers—including Mr. Anderson’s successor, who lasted less than six months before leaving in June."
Since we're finally getting some refutations of the self-driving hype, let me drop some quotes here:
"I tell adult audiences not to expect it in their lifetimes. And I say the same thing to students."
"Merely dealing with lighting conditions, weather conditions, and traffic conditions is immensely complicated. The software requirements are extremely daunting. Nobody even has the ability to verify and validate the software. I estimate that the challenge of fully automated cars is 10 orders of magnitude more complicated than [fully automated] commercial aviation."
- Steve Shladover, transportation researcher at the University of California, Berkeley
http://www.automobilemag.com/news/the-hurdles-facing-autonom...
"With autonomous cars, you see these videos from Google and Uber showing a car driving around, but people have not taken it past 80 percent. It's one of those problems where it's easy to get to the first 80 percent, but it's incredibly difficult to solve the last 20 percent. If you have a good GPS, nicely marked roads like in California, and nice weather without snow or rain, it's actually not that hard. But guess what? To solve the real problem, for you or me to buy a car that can drive autonomously from point A to point B—it's not even close. There are fundamental problems that need to be solved."
- Herman Herman, director of the Carnegie Mellon University Robotics Institute
https://motherboard.vice.com/en_us/article/d7y49y/robotics-l...
"While I enthusiastically support the research, development, and testing of self-driving cars, as human limitations and the propensity for distraction are real threats on the road, I am decidedly less optimistic about what I perceive to be a rush to field systems that are absolutely not ready for widespread deployment, and certainly not ready for humans to be completely taken out of the driver’s seat."
- Mary Cummings, director of the Humans and Autonomy Laboratory at Duke
https://www.commerce.senate.gov/public/_cache/files/c85cb4ef... [pdf]
All quotes pulled from this article (which is really quite good and you should read it in full):
https://www.nakedcapitalism.com/2016/10/self-driving-cars-ho...
It's amazing to me that Tesla is able to sell a car in its price range that lacks basic features that come standard on a $17k Corolla, like adaptive cruise or automatic emergency braking, especially since they effectively reduced the capabilities of their cars by rolling out AP2. If any other company tried to pull that, they'd be laughed out of the room, but somehow Tesla is cheered.
Honestly, I don't understand why the automobile industry doesn't learn from the airline industry, which worked out years ago how to balance autopilot capabilities with the need for pilots to remain engaged and attentive. Simply implement drive-by-wire, similar to Airbus' fly-by-wire systems: a driver's inputs to the controls would still be required, but the autonomous systems could prevent or limit certain actions (such as accelerating into a stopped vehicle or swerving off the road).
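A sketch of what such envelope protection might look like on the accelerator: pass the driver's input through, but veto it when time-to-collision to a lead obstacle gets short. The 2-second threshold and braking authority here are illustrative, not taken from any real system:

```python
def limit_accel(requested_accel: float, gap_m: float, closing_speed_mps: float,
                max_brake: float = 8.0) -> float:
    """Envelope protection for the accelerator pedal (toy version)."""
    if closing_speed_mps <= 0:           # lead vehicle pulling away: no conflict
        return requested_accel
    ttc = gap_m / closing_speed_mps      # seconds to collision at current closure
    if ttc < 2.0:                        # inside the protection envelope
        return -max_brake                # override the driver: brake hard
    return requested_accel               # otherwise the driver stays in command

print(limit_accel(2.0, gap_m=10.0, closing_speed_mps=10.0))   # -> -8.0 (override)
print(limit_accel(2.0, gap_m=100.0, closing_speed_mps=10.0))  # -> 2.0 (pass through)
```

The design choice mirrors the Airbus philosophy: the human commands, the computer clips the command to a safe envelope.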
Airline pilots are professionals, car drivers are just trying to get somewhere... paying attention isn't their full-time job. Also building autopilot software for a near empty 3-dimensional space is much easier than for complex roads of varying shapes, moving obstacles, country regulations and different road markings...
I talked to a pilot once who asserted that this is far from settled in aviation: he described the Boeing way, and the Airbus way, as two separate schools of thought. Boeing's systems keep the humans in the loop more at the expense of the autonomous systems, Airbus does the opposite. Empirically, it doesn't seem to make a difference with regards to safety outcomes.
Many of the current cars are 100% drive by wire (ECU/power brakes/electric power steering), but that doesn't mean anything.
Probably quite a good part of the hundreds-of-millions price of a passenger plane is the autopilot (even 1% means $1M). And even at $1M per plane, a plane's autopilot works 99% of the time because it can assume the current plane is the only one in a large vicinity of a point in space. This is assured by a centralized third party (the control tower) that is not really automated but is a very stressful human job (that's why air traffic controllers are well paid). This is not the case with cars: there, most of the work is having each individual car detect, with complex but not very good sensors and software, what is around it, in a swarm of other moving objects that do not communicate.
What does "drive-by-wire" achieve other than removing a shaft between the steering wheel and steering rack? Cars already have collision avoidance without full self-driving capabilities.
Lane departure warning and collision warning systems are pretty prevalent. And even autobraking collision avoidance systems are getting pretty widespread.
Please stop posting paywalled articles, especially from the WSJ. This community represents the future of the internet. I don't know what the answer is for making sure content providers get paid, but the WSJ model isn't it. So let's vote with our attention (or lack thereof) and kill this annoying practice before it makes the internet an even more walled and unpleasant place.
Your post lacks information on the part of why the WSJ model isn't acceptable. Paying money for a newspaper isn't exactly unprecedented, and there are plenty of people who are fine with that model.
[1] https://www.continental-automotive.com/en-gl/Passenger-Cars/...
ccorda | 8 years ago:
I suspect that if/when LIDAR is cheap enough, Tesla will use it.
In the meantime they outfit every single car with the best hardware that is realistic from a cost standpoint today, instead of waiting til 2020.
CodeWriter23 | 8 years ago:
https://m.youtube.com/watch?v=BE2lQK_0CDw
And Tesla's engineers aren't the first to bellyache about being asked to make the impossible a reality.
RonanTheGrey | 8 years ago:
E.g. I would quit too.
abalone | 8 years ago:
Is this video online?
revelation | 8 years ago:
https://www.youtube.com/watch?v=ZBaolsFyD9I
https://www.youtube.com/watch?v=IOnuKrzCLYc
manyoso | 8 years ago:
Around the paywall:
rsp1984 | 8 years ago:
https://twitter.com/newsycombinator/status/90078007990679142...