Moral psychologist Azim Shariff believes that autonomous cars will be programmed to redistribute risk away from some people and towards others. “Consider an autonomous car that is deciding where to position itself in a lane – closer to a truck to its right, or a bicycle lane on its left,” said Shariff, an associate professor at the University of British Columbia in Vancouver.
“If cars were always programmed to be slightly closer to the bicycle lane, they may slightly reduce the likelihood of hitting other cars, while slightly increasing the likelihood of hitting cyclists.”
Shariff is a co-author of a major new report on the ethical judgments that may have to be programmed into autonomous vehicles. The Moral Machine Experiment was published in Nature last week. The study features the results of an online game played by volunteers around the world, who together logged nearly 40 million decisions. The volunteers had to decide, for example, whether a driverless car should hit a pregnant woman or swerve into a wall and kill its four passengers.
Other decisions from the game, which went viral when it went online in 2016, included whether to save an athlete over an overweight person, or a child over a senior. The results revealed regional variations, with some cultures preferring to protect old people over young, and women over men. In some countries, so-called “jaywalkers” were saved less often than people who crossed at designated crossings.
Overweight people were about 20% more likely to be chosen to die than athletes, and homeless people had a roughly 40% greater chance of dying than executives.
The moral dilemma at the heart of the game is the “trolley problem” created by British philosopher Philippa Foot in 1967. She imagined a runaway train that you could divert onto one of two tracks: on one, a single person would be hit and killed; on the other, five.
The Moral Machine did not feature cyclists as part of the experiment, but Shariff told Inside Science that autonomous cars would have to interact with them on the roads of the future.
“Over millions or billions [of passing maneuvers, with cars having to decide whether to drive close to another motor vehicle or to a cyclist], either more cyclists will die, or more passengers will die.”
A report on the paper in Nature said:
“A driver who veers away from cyclists riding on a curvy mountain road increases her chance of hitting an oncoming vehicle. If the number of driverless cars on the road increases, so too will the likelihood that they will be involved in such accidents.”
Driverless cars are not yet able to cope with the unpredictability of cyclists. In 2015, a cyclist in Austin, Texas, confused a Google driverless car when he did a near-motionless “track-stand” at an intersection. The Google car was so bamboozled by the behavior of the balancing cyclist that it would not budge.
Google has since improved its algorithms to recognize such bicyclist behavior. A Google statement said: “Through observing cyclists on the roads and private test track, we’ve taught our software to recognize some common riding behaviors, helping our car better predict a cyclist’s course.”