If A Crash Is Unavoidable, Who Does A Driverless Car Hit?
People polled believe that autonomous cars should act for the greater good—most of the time
Picture this: An autonomous vehicle is driving down the street with an adult and child inside. Suddenly, several pedestrians step in front of the car. If the car doesn’t have time to stop, should it hit the pedestrians, or swerve and possibly hit something else, putting the passengers’ lives in danger?
As a future full of driverless cars approaches, engineers and manufacturers will have to build in designs that reflect consumers’ morals. But there has been no consensus on just what those morals should be (though a few driverless cars are already on the road, this aspect of the technology is not particularly advanced). That’s why a team of researchers based in France and the United States conducted a series of surveys to gauge where the public stands on driverless cars. They found that while people are utilitarian in the abstract—they want self-driving cars to swerve to avoid hitting pedestrians, minimizing total loss of life—their answers change when their own lives hang in the balance. The researchers published their work Thursday in the journal Science.
To understand where public opinion lies, the researchers ran six surveys on Amazon Mechanical Turk, an online crowdsourcing platform, polling nearly 2,000 people. Though in one survey most participants thought that the passenger should be sacrificed if there was more than one pedestrian—imposing a moral system on the cars that minimizes loss of life—the respondents’ answers were less certain when one of their own family members was hypothetically in the vehicle.
“You can recognize the feeling; the feeling that I want other people to do something, but it would be great not to do it myself,” Jean-François Bonnefon, a researcher at the Toulouse School of Economics, said in a press conference.
Participants also noted that they were much less likely to buy a self-driving car if it was programmed to sacrifice the passenger in any situation, including one in which the lives of many other people hung in the balance.
The last two surveys asked participants what regulation should look like. Most agreed that it makes legal sense to sacrifice the passenger if doing so would save 10 pedestrians. But most were still not convinced that the government should regulate autonomous vehicles at all, and said they would be less inclined to buy a vehicle subject to such regulation.
More technology still needs to be developed before self-driving cars can become widespread, the researchers write. And though regulation of these vehicles might be necessary, it could slow their adoption, the authors note. That would be counterproductive, since driverless cars would very likely cut down on the roughly 37,000 deaths per year from car crashes in the U.S., 90 percent of which are caused by human error.
Though these starkly moral choices—decisions in which the car must “choose” to sacrifice one person over another—would likely be rare, they are necessary to consider when engineering these vehicles. And real-life scenarios would likely be far more complex than the simple hypotheticals that researchers used in the surveys, since there would be less certainty that someone would get injured.
To continue the conversation and gather more data, the researchers also launched an interactive website where users can explore increasingly complex hypotheticals. “This will help us also identify…the scenarios that people find the most difficult to agree on,” Bonnefon said in the press conference. Manufacturers and regulators could use that information to make self-driving cars that reflect society’s values while remaining attractive to consumers.
Where we’re going, we’ll still need roads, but will we need drivers? This week, Vocativ explores the state of autonomous vehicles—their regulation, technology, and security—and how close we really are to a driverless future.