There is something of a regulatory vacuum at the moment when it comes to self-driving cars. The technology is advancing quickly, and laws are lagging behind. This is particularly true when it comes to assigning blame for accidents caused by autonomous vehicles.
What to do about a fatal self-driving car accident is no longer a hypothetical question. In March, a woman was killed in Tempe, Arizona by an Uber-operated self-driving car. The investigation is ongoing, and the at-fault party has yet to be determined.
Police say the 49-year-old woman was walking a bike across the street, outside a crosswalk, at around 10pm. The Uber car that hit her was travelling at 40 miles per hour in autonomous mode, with an operator in the driver’s seat.
Uber immediately suspended its self-driving tests in Arizona and across the US.
The incident highlights the debate about legal liability when it comes to collisions in which an autonomous vehicle harms someone through no fault of that person. Would the blame lie with the self-driving car’s owner, its manufacturer, a combination of the two, or someone else?
It seems that the people of Arizona were so eager to steal California’s spot as a haven for self-driving cars that they left many of these questions unanswered. Ford, GM, Google, Intel, and Uber are all testing self-driving cars in Arizona. Exact figures are sparse, but there are at least hundreds of driverless vehicles in use in the state.
Writing in The New York Times, Claire Cain Miller argues that criminal law would likely not apply, “for the simple reason that robots cannot be charged with a crime.” Criminal law looks for a guilty mind, a particular mental state. That would be difficult to establish when no person is actually driving the car.
There are many arguments in favour of driverless cars. They will reduce traffic jams and cut pollution. They will benefit seniors and those with reduced mobility. They’re also theoretically safer than human drivers, who are prone to errors like negligence or intoxication. And it’s worth remembering that tens of thousands of Americans are killed by human-operated cars each year. Should research on driverless cars therefore be scaled back?
According to the National Association of State Legislatures, 21 US states have laws regulating self-driving vehicles in some way. They vary from wide-open regulatory regimes like Arizona to stricter regulations like those in Nevada, which require that two operators be present in a self-driving car during a test on public roads. The car must also be accompanied by a pilot vehicle driving directly ahead of it. While it’s impossible to know if these precautions would have prevented what happened in Tempe, they could reduce the chance of similar accidents in the future.
While further regulations may be useful, this unfortunate incident isn’t necessarily a reason to abandon the idea of self-driving cars entirely. Even the most utopian techie must surely have guessed that autonomous vehicles would eventually be involved in a fatal accident. Silicon Valley and the auto industry have a responsibility to make these cars as safe as possible, but the onus is also on legislators to build a regulatory landscape that protects everyone else.
Nezha Boutamine | Staff Writer