Should self-driving cars be programmed to sacrifice their passengers?
Mon, 14/05/2018
A new study calls for an ‘urgent’ debate over the ethics of autonomous vehicles.
Mollie Cahillane, The Mail Online, reports:
Self-driving vehicles have been proposed as a solution for the rapidly increasing number of fatal traffic accidents, which now claim a staggering 1.3 million lives each year.
While we have made strides in advancing self-driving technology, we have yet to explore at length how autonomous vehicles will be programmed to deal with situations that endanger human life, according to a new study published in Frontiers in Behavioral Neuroscience.
To understand how self-driving cars might make these judgments, the researchers looked at how humans deal with similar driving dilemmas.
When faced with driving dilemmas, people show a high willingness to sacrifice themselves for others, make decisions based on the victim's age and swerve onto sidewalks to minimize the number of lives lost.
Ethical guidelines, which dictate that no life should be valued above another, tend to disagree with human instincts in this case.
'The technological advancement and adoption of autonomous vehicles is moving quickly but the social and ethical discussions about their behavior is lagging behind,' says lead author Lasse T. Bergmann, who completed this study with a team at the Institute of Cognitive Science, University of Osnabrück, Germany.
'The behavior that will be considered as right in such situations depends on which factors are considered to be both morally relevant and socially acceptable.'
Automated vehicles will eventually outperform their human counterparts, but there will still be circumstances where the cars must make an ethical decision to save or possibly risk losing a human life.
The study is especially relevant considering that earlier this year a self-driving Uber car struck and killed a pedestrian in Arizona, in an incident widely regarded as the first death resulting from an autonomous vehicle.
The study references a scenario in which a child suddenly runs in front of an automated vehicle. Does the car swerve onto the sidewalk, thereby violating traffic rules and potentially injuring others?
An ethics commission initiated by the German Ministry for Transportation has created a set of guidelines, representing its members' best judgement on a variety of issues concerning self-driving cars.
These expert judgments may, however, not reflect human intuition.
Bergmann and colleagues developed a virtual reality experiment to examine human intuition in a variety of possible driving scenarios. It was based on the well-known ethical thought experiment -- the trolley problem.
In this thought experiment, there is a runaway trolley barreling down the railway tracks.
Ahead, on the tracks, there are five people tied up and unable to move and the trolley is headed straight for them.
A bystander is standing some distance off in the train yard, next to a lever. If they pull this lever, the trolley will switch to a different set of tracks.
However, there is one person tied up on the side track. Does the bystander choose to pull the lever and kill one person, or do nothing and let five people die?
'The German ethics commission proposes that a passenger in the vehicle may not be sacrificed to save more people; an intuition not generally shared by subjects in our experiment,' said Bergmann.
'We also find that people chose to save more lives, even if this involves swerving onto the sidewalk - endangering people uninvolved in the traffic incident,' he explained.
'Furthermore, subjects considered the factor of age, for example, choosing to save children over the elderly.'
However, Bergmann noted that the majority of people would not approve of the decisions cars make if they followed the ethics commission's guidelines.
'If autonomous vehicles abide with guidelines dictated by the ethics commission, our experimental evidence suggests that people would not be happy with the decisions their cars make for them,' said Bergmann.
He also acknowledges that further discussion and research are needed.
'Driving requires an intricate weighing of risks versus rewards, for example speed versus the danger of a critical situation unfolding,' Bergmann explained.
'Decision making-processes that precede or avoid a critical situation should also be investigated.'
Asking a machine to make a split-second judgement call is a tall order and could have potentially terrible consequences should the car operate in an unethical way. But should a computer really be accountable, or would accountability lie with the driver for not overriding control, or with the manufacturer?