Study Asks If Self-Driving Cars Should Make Moral Choices


Most people questioned in a study said self-driving cars should aim to save as many lives as possible in a crash, even if that means sacrificing the passenger. But they also said they would not choose to ride in a car set up that way.

The study was conducted by three academics in the Department of Psychology and Social Behavior at the University of California, Irvine. They were exploring the implications of autonomous vehicles for the social dilemma of weighing self-protection against the greater good.

Participants were asked to consider a hypothetical situation where a car was on course to collide with a group of pedestrians. They were given three sets of choices:

A) Killing several pedestrians or deliberately swerving in a way that would kill one passer-by.

B) Killing one pedestrian or acting in a way that would kill the car’s passenger (the person who would be considered the driver in a normal car).

C) Killing several pedestrians or killing the passenger.

Across the responses to all three dilemmas, fewer than one in four participants favored killing the passenger to save a single pedestrian. A little over half favored killing the passenger to save two pedestrians, and support rose steadily from there, with almost every participant saying the car should sacrifice the passenger if doing so would save five pedestrians.

However, the participants went on to reveal a gap between principle and practice: although they judged it morally correct to program cars to save as many people as possible, even at the expense of the passenger, only a slim majority said they would buy a vehicle set up that way.

Asked to reconsider their choice when imagining themselves riding in an autonomous vehicle alongside a co-worker, a majority said they would prefer to buy a vehicle programmed to protect the passengers over the pedestrians. This self-interest was even more pronounced when people were asked to imagine a family member travelling with them.