It only makes sense if one doesn't know much about self-driving. What they quote and refer to as "bullshit" is actually very reasonable: these systems do break down because they can't handle every possible situation on their own. They do not have human-level intelligence.
With robotaxis, what currently happens is that human drivers go out and rescue the vehicle. You can't do that with a truck; it's not economical. This will be an issue for years to come. We'll probably see more remote assistance, but that's unlikely for an 80,000-pound machine driving at highway speeds.
Which brings us to another point: regulation and risk perception. Cruise having some hiccups and maybe being involved in some minor fender benders is one thing, and politicians and parts of the public in the affected cities are already concerned. But a fully loaded self-driving semi being involved in a crash and potentially killing people could be the autonomous version of 9/11. That's a lot of risk: the licenses might be revoked, and then their investment is toast.
meowtimemania|2 years ago
trompetenaccoun|2 years ago