Self-driving cars’ decision-making weighed in


As many as 10 million autonomous cars are predicted to hit public roads by 2020, and when they do, they’ll have difficult decisions to make. Understandably, there’s some urgency to build decision-making systems capable of tackling the classic “trolley problem,” in which a person—or computer, as the case may be—is forced to decide whether to sacrifice the lives of several people for the life of one.

The Moral Machine, an online quiz, tasked respondents with making ethical choices in fictional driving scenarios. Over 2 million people from more than 200 countries addressed nine grisly dilemmas, which ranged from sparing pedestrians or jaywalkers, the young or the elderly, and women or men.

Afterwards, the respondents took a survey on their education levels, socioeconomic levels, gender, age, religious beliefs and political attitudes. They were then divided up by geography. The scientists found that respondents fell into one of three “cultural clusters”—eastern (East Asian and Middle Eastern nations with Confucianist and Islamic backgrounds), western (Europe and North America) and southern (Central and South America).

The findings are important as autonomous vehicles prepare to take the road in the U.S. and other places around the world. In the future, car manufacturers and policymakers could find themselves in a legal bind with autonomous cars. If a self-driving bus kills a pedestrian, for instance, should the manufacturer be held accountable?

The researchers identified three relatively universal preferences. On average, people wanted:

  • To spare human lives over animals
  • To save more lives over fewer
  • To prioritize young people over old ones
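The three averaged preferences amount to a priority ordering over crash outcomes. As a toy illustration only (this is not any manufacturer's actual logic, and the field names are invented for the sketch), they can be encoded as a lexicographic ranking:

```python
# Toy sketch: rank hypothetical crash outcomes by the study's three
# averaged preferences. Purely illustrative; not a real AV decision system.
from dataclasses import dataclass

@dataclass
class Outcome:
    humans: int      # human lives lost
    animals: int     # animal lives lost
    avg_age: float   # average age of the humans at risk

def harm_key(o: Outcome):
    # Fewer human deaths first (spare humans, save more lives),
    # then fewer animal deaths, then spare the young as a tie-breaker
    # (a higher average victim age ranks as less bad).
    return (o.humans, o.animals, -o.avg_age)

def least_bad(outcomes):
    return min(outcomes, key=harm_key)

swerve = Outcome(humans=1, animals=0, avg_age=70.0)
stay   = Outcome(humans=2, animals=0, avg_age=30.0)
print(least_bad([swerve, stay]) is swerve)  # fewer deaths wins -> True
```

The binary, contrived flavor of the quiz carries over directly: every option still kills someone, and the ranking merely picks the "least bad" one.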

The study is not gospel truth: the quiz was self-selecting, and questions were posed in a binary, somewhat contrived fashion in which every outcome resulted in the deaths of people or animals.

Even the most sophisticated artificial intelligence (AI) systems are far from being able to reason like a human, but some are coming closer.

“The ability to assign fault is the key. Just like the best human drivers in the world, self-driving cars cannot avoid accidents due to actions beyond their control,” Amnon Shashua, Mobileye CEO and Intel senior vice president, said in a statement last year. “But the most responsible, aware, and cautious driver is unlikely to cause an accident of his or her own fault, particularly if they had 360-degree vision and lightning-fast reaction times like autonomous vehicles will.”
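Shashua's remarks allude to Mobileye's published Responsibility-Sensitive Safety (RSS) model, which formalizes fault by defining distances a cautious car must keep. A minimal sketch of its longitudinal safe-gap rule follows; the parameter values here are illustrative assumptions, not Mobileye's calibrated numbers:

```python
def rss_min_gap(v_rear, v_front, rho=0.5, a_max=3.0, b_min=4.0, b_max=8.0):
    """Minimum safe following distance (metres) in the RSS model.

    v_rear, v_front : speeds of the rear and front cars (m/s)
    rho   : rear car's response time (s)          -- assumed value
    a_max : rear car's max acceleration (m/s^2)   -- assumed value
    b_min : rear car's guaranteed braking (m/s^2) -- assumed value
    b_max : front car's max braking (m/s^2)       -- assumed value
    """
    # Worst case: the rear car accelerates throughout its response time...
    v_after = v_rear + rho * a_max
    # ...then brakes gently while the front car brakes as hard as possible.
    d = (v_rear * rho
         + 0.5 * a_max * rho ** 2
         + v_after ** 2 / (2 * b_min)
         - v_front ** 2 / (2 * b_max))
    return max(d, 0.0)  # a gap can never be negative

# At matched highway speeds (~30 m/s), the rule demands a positive gap:
print(rss_min_gap(30.0, 30.0) > 0)  # True
```

A car that always maintains at least this gap cannot be the cause of a rear-end collision, which is the sense in which "the ability to assign fault is the key."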

In any case, car manufacturers have their work cut out for them. High-profile accidents involving autonomous cars have depressed public confidence in the technology. Studies have found that a majority of people aren’t convinced of driverless cars’ safety: more than 60 percent said they were ‘not inclined’ to ride in self-driving cars, and almost 70 percent expressed ‘concerns’ about sharing the road with them.