Waymo and GM Lead the Self-Driving Car Race, New Data Shows
Most of the questions surrounding the coming age of driverless cars pertain to practical things: regulation, insurance, training protocols for the cars’ remote human backups. Some are philosophical: What do we owe the people whose jobs will be annihilated? Do robo cars need ethics lessons? At least one question is practical and philosophical: How do we know when these things are ready to ditch their human safety drivers and roll about unattended?
No one has much of a response. You could say that as soon as the robot is safer than the average human driver—who crashes once every 238,000 miles or so—it’s wrong to keep it in the lab. Or you could argue that robo cars ought to be held to higher standards: Should they be 10 times better than the human? 1,000 times? Whatever the answer is, data will help us get there. And so we turn to the California DMV’s 2017 Autonomous Vehicle Disengagement Reports.
The Golden State, home to many of the companies leading the robo revolution, has some of the strictest rules for AVs in the country. Operators who run cars on public roads must publicly report any crashes they’re involved in. And at the end of every year, they must hand over data on how many miles they drove and how many times their onboard human safety driver had to take control from the machine—that’s called a disengagement. Combine those, and you have a number approximating how far any company’s self-driving car can go without human help. Something like a grade.
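The arithmetic behind that grade is simple division. Here is a minimal sketch, with made-up numbers (the real totals live in each company's filing):

```python
# The DMV "grade": average miles an AV covers before its safety driver
# has to take over. The figures below are illustrative placeholders,
# not any company's actual 2017 numbers.

def miles_per_disengagement(miles_driven: float, disengagements: int) -> float:
    """Return the average miles driven between human takeovers."""
    if disengagements == 0:
        # No takeovers at all during the reporting period.
        return float("inf")
    return miles_driven / disengagements

# Hypothetical example: 350,000 miles with 63 takeovers.
print(miles_per_disengagement(350_000, 63))  # roughly 5,556 miles per takeover
```

A higher number means the car went farther on its own, which is why the metric works as a rough report card despite all the caveats below.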
The metric is imperfect, and this data comes with a crate of caveats. But before we get into those, know this: Waymo (formerly known as Google’s self-driving car project) and General Motors appear to be leading the pack and making rapid progress toward the day when human drivers, with all their inattention and distraction and tendency to crash, will be obsolete.
Ifs and Buts
You can read more about the shortcomings of disengagement reports here, but here’s the quick rundown:
- They’re unscientific, because each company reports its data in a different way, offering various levels of detail and idiosyncratic explanations for what triggered the human takeover.
- They’re packed with vague language and lack context. Delphi cites “cyclist” as the reason for a bunch of disengagements. Zoox blamed every disengagement on a “planning discrepancy” or “hardware discrepancy.”
- They’re of little use to anyone who wants to compare rival companies, because those companies aren’t running the same tests: Waymo does most of its testing in simple suburbs; GM focuses on the complex city. They’re better for tracking the progress of each outfit, but still not great, because those companies change how and where they test over time.
- A disengagement does not mean the car was going to crash, only that the human driver wasn’t 100 percent confident in how it would behave.
- They only cover driving on public roads in California. So we don’t know anything about Ford, which focuses its testing around Detroit and Pittsburgh. We don’t see data for Waymo’s increasingly important test program in Phoenix—where its cars are tooling about without anyone inside.
On the other hand, the disengagement reports are the best data we’ve got for evaluating these development efforts. No state but California demands anything like this, and private companies only share such info when the government demands it.
So, let’s sprinkle some grains of salt on the numbers and take a look. We broke them down into a pair of two-axis charts. The first looks at Waymo and General Motors. It notes how many miles they drove in 2016 and 2017 (in green) and how many miles they averaged between disengagements (in blue). (By the way, Uber didn’t have to file a report, because this data isn’t required until your first full calendar year of testing. Uber didn’t get its permit to test in California until March of 2017.)
The takeaway here is that Waymo’s software remains excellent, and it’s still doing tons of testing in California. For GM, you can see a huge ramp-up in miles driven, and a steep increase in miles per disengagement. That’s progress, and it’s a good thing: GM plans to launch a car without a steering wheel or pedals next year. Keep in mind that GM does nearly all its public street testing in San Francisco, a much more complicated environment than Palo Alto and Mountain View, where Waymo works.
Next, we have the data for Delphi (now known as Aptiv), Nissan, Mercedes-Benz, and Zoox, a San Francisco–based startup working to build a self-driving vehicle that looks nothing like today’s cars—not that it will say anything more than that for the time being. Each has a serious program, but they do so much less testing than Waymo and GM that we put them in their own chart. (Otherwise, the scales would be wildly out of proportion.)
More caveats: Mercedes-Benz may not look so hot in California, but that data’s from just three vehicles. It does much more work in Europe: In 2017, it sent an autonomous S-Class on a five-month tour of five continents. Nissan does a lot of testing at NASA’s Ames Research Center, which doesn’t count as public roads, so that driving doesn’t require data reporting. And to get the most interesting bit of data from Zoox, you have to dive into its report.
In its first year of testing (thus the lack of 2016 numbers), it drove just over 100 miles through August. Over the next three months, it drove about 2,000. Yet its rate of disengagements remained steady. Overall, it averaged 160 miles per disengagement. But if you look at just November, when it was doing lots of testing in downtown San Francisco, that number jumps to 430. Even with the caveats, it’s a clear sign that Zoox is making impressive progress—and that more than one of these students is getting ready to throw on a gown, grab its diploma, and give you a ride.
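As a back-of-the-envelope check on those Zoox figures, the numbers reported above imply a rough disengagement count. This sketch uses only the approximate values in the paragraph; the exact totals in the filing may differ:

```python
# Rough check on Zoox's 2017 figures as reported above:
# ~100 miles through August plus ~2,000 more over the next three months.
total_miles = 2100
overall_rate = 160  # reported average miles per disengagement for the year

implied_takeovers = total_miles / overall_rate
print(round(implied_takeovers))  # about 13 takeovers across the whole year

# In November alone the rate jumped to 430 miles per disengagement.
# The same mileage at that rate would imply far fewer takeovers,
# which is the progress the paragraph above is pointing to.
november_rate = 430
print(round(total_miles / november_rate))  # about 5
```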