
Killer dilemma: Who we’d let a self-driving car hit and who we’d save

Scientists create experiment to help drivers understand morality issues around autonomous vehicles

Researchers have revealed who people would save from being killed by an autonomous vehicle – and those they wouldn’t.

The web-based moral questionnaire devised by the Massachusetts Institute of Technology (MIT) was built around the no-win situations faced when programming an autonomous car. Should the car swerve away from a group of pedestrians, only to kill its occupants instead? Should it avoid hitting a mother and baby in favour of a pensioner? Should animals be favoured over criminals? These were the sorts of questions MIT posed in its ethical experiment, dubbed ‘The Moral Machine’.
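The structure of each dilemma can be pictured as a forced choice between two groups of characters, with responses tallied to see which character types are spared most often. The sketch below is purely illustrative – the function name and data layout are hypothetical, not the Moral Machine's actual code:

```python
# Hypothetical sketch of how forced-choice responses might be tallied.
# Each response records which group the respondent chose to spare and
# which group was sacrificed as a consequence.
from collections import Counter

def tally_spared(responses):
    """Count how often each character type is spared across all responses."""
    counts = Counter()
    for spared, _sacrificed in responses:
        counts.update(spared)
    return counts

responses = [
    (["baby", "girl"], ["elderly man"]),        # spare the young
    (["pregnant woman"], ["cat", "criminal"]),  # spare the pregnant woman
    (["baby"], ["dog"]),                        # spare the baby over the dog
]
print(tally_spared(responses).most_common(1))  # → [('baby', 2)]
```

Aggregating millions of such choices is what lets the researchers rank character types – baby, girl, boy, pregnant woman – by how often they are spared.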

The results from the two million global responses showed not only moral bias, but also whether people considered age, gender, social standing or even fitness level when making their decision.

There was a heavy preference towards sparing the lives of young people – the four most-spared in the game were a baby, a girl, a boy and a pregnant woman.

The lives of doctors and athletes were also considered worth saving.

At the other end of the scale, more people felt that elderly people should be given less preferential treatment, while the lives of criminals were valued less than those of dogs. Cats were considered the least worthy of being saved.

The survey also found significant differences between countries. In the UK, for example, the lives of men and women were held in equal regard – but just across the Channel in France, women’s lives ranked far higher than men’s. Interestingly, UK respondents also showed a stronger preference for the car taking no action at all.

The study broadly grouped countries into ‘western’, ‘eastern’ and ‘southern’ clusters, and found variations along those lines. For example, respondents in eastern countries were considerably more likely to favour the lives of elderly people, especially when compared with southern countries.

Edmond Awad, lead author of the paper detailing the survey’s findings, said: “The study is basically trying to understand the kinds of moral decisions that driverless cars might have to resort to. We don’t yet know how they should do that.

“The main preferences were to some degree universally agreed on, but the degree to which they agree with this or not varies among different groups or countries.”

The survey didn’t find a marked difference in morality based on any other demographic characteristic – whether that was age, education, gender, income or religious views. A total of 492,000 respondents offered demographic data.

Project associate Iyad Rahwan suggested that these preferences could, in theory, alter the way autonomous vehicle software is written. “The question is whether these differences in preferences will matter in terms of people’s adoption of the new technology when [vehicles] employ a specific rule,” he said.

Fully autonomous vehicles are not currently legal in any country except under specifically granted exemptions. The current maximum is ‘level three’ autonomy, where a car can drive itself in specific situations but requires a human driver to be ready to take over control at any time. The MIT survey was based on theoretical ‘level five’ cars, which are fully autonomous and have no need for a human driver.
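The ‘level three’ and ‘level five’ terminology referenced here comes from the SAE J3016 scale, which grades driving automation from level 0 to level 5. A minimal summary of that scale (the one-line descriptions are paraphrased, not quoted from the standard):

```python
# SAE J3016 driving automation levels, paraphrased.
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: e.g. adaptive cruise control",
    2: "Partial automation: steering and speed assisted, driver monitors",
    3: "Conditional automation: car drives in specific situations, "
       "human must be ready to take over",
    4: "High automation: no human needed within a defined operating domain",
    5: "Full automation: no human driver needed anywhere",
}

print(SAE_LEVELS[3])  # the current legal maximum described in the article
print(SAE_LEVELS[5])  # the theoretical cars the MIT survey assumed
```

Levels 3 and 5 correspond directly to the two cases the article contrasts.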