Self-driving cars will change our lives in many ways over the coming years. Their advocates make big promises, from the end of car crashes to the reinvention of our car-centric cities, but the constant stream of serious crashes in the news raises the question of whether this technology is being rushed onto our streets before it’s ready.
Take the recent crash in Arizona, where an autonomous car struck and killed a pedestrian it didn’t “see.” Or the one that crashed into a concrete highway lane divider it didn’t “see” and burst into flames, killing the driver. Or the one that crashed into a tractor trailer it didn’t “see,” shearing off the roof and killing the driver. Or the one that crashed into the back of a stopped firetruck in Utah that it didn’t “see.” Or the one that crashed into the back of a stopped firetruck in Los Angeles that it didn’t “see.”
All of these crashes involved a self-driving car plowing into something at full speed, without the brakes ever being applied. Crashes like that are both particularly deadly and relatively rare among human drivers.
Clearly the makers of these systems still have plenty of bugs to work out. The question is: why are those bugs being worked out using consumers, other drivers, and pedestrians as crash test dummies? If self-driving cars are missing stopped firetrucks, concrete lane dividers, and tractor trailers, how will they handle a child chasing a ball into the street once they are on the road in far greater numbers?
Meanwhile, some of the companies pushing hardest for this technology, like General Motors, are lobbying for immunity from liability when their cars malfunction like this and kill people, while others, like Volvo, have refreshingly vowed to accept responsibility.
Some people argue the technology is ready because, whatever its problems, it must be better than all those terribly distracted drivers currently in control of their cars. The problem with this logic is that, while we do face an epidemic of distracted driving, current technology still requires the person behind the wheel of a self-driving car to pay attention so they can take over when the system fails. But if drivers already struggle to avoid distractions while fully in control of their cars, giving them a car that seems to drive itself will only give them license to become more distracted. And when the car doesn’t “see” a pedestrian, a “driver” lulled into inattention is unlikely to save the day.
Trading partially distracted drivers for cars with no one paying attention at all might make sense someday, but not until self-driving cars “see” far better than they do now. In the meantime, manufacturers, programmers, and regulators need to be more realistic about how attentive drivers will be in cars whose systems seem to drive themselves and carry names like “Autopilot.” And they need to test and debug these systems on closed courses, not our streets.
The American Association for Justice has released a comprehensive review of safety and legal issues related to self-driving cars.