Thinking about the technology of the not-so-futuristic driverless car is not that daunting when you consider what the car will do. The scary question is the what-if in any scenario where the car would need to choose between sacrificing the people in the car and the people in the street in some unavoidable, serious accident.
Abigail Beall in New Scientist said this is one of the major problems confronting manufacturers – the moral decisions.
If you are not in a driverless car, the answer lies with you and your ethics.
If you are in an autonomous vehicle, though, the question appears to rest with your car, which has no ethics, only the work of its engineers.
There is, however, another option being suggested for such questions, and it comes from a team in Italy. They have considered a way to put the decision in the hands of the human passenger in the AV.
Their suggestion is in the form of a knob.
Their paper is titled “The Ethical Knob: Ethically-Customisable Automated Vehicles and the Law.” Their study was published in Artificial Intelligence and Law. The authors are Giuseppe Contissa, Francesca Lagioia and Giovanni Sartor.
“We wanted to explore what would happen if the control and the responsibility for a car’s actions were given back to the driver,” said Giuseppe Contissa at the University of Bologna in Italy, in New Scientist.
It has been argued, they noted, that self-driving cars should be equipped with pre-programmed approaches to the choice of what lives to sacrifice when losses are inevitable.
“Here we shall explore a different approach, namely, giving the user/passenger the task (and burden) of deciding what ethical approach should be taken by AVs in unavoidable accident scenarios. We thus assume that AVs are equipped with what we call an ‘Ethical Knob.'”
“The knob tells an autonomous car the value that the driver gives to his or her life relative to the lives of others,” said Contissa in New Scientist. “The car would use this information to calculate the actions it will execute.”
How would the knob provide an answer? It would offer settings.
An Egoistic setting would mean preference for the passenger; an Altruistic setting, preference for third parties.
A third setting, Impartial, would allow the car to act in a utilitarian way, giving equal importance to passenger(s) and third parties.
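The three settings can be read as points on a single dial weighting the passenger's harm against third parties'. As a rough illustration only (the harm values, action names, and the simple linear weighting here are assumptions for the sketch, not details from the paper):

```python
def choose_action(knob, actions):
    """Pick the action minimizing knob-weighted expected harm.

    knob: 0.0 = full altruist, 0.5 = impartial, 1.0 = full egoist.
    actions: list of (name, passenger_harm, third_party_harm) tuples;
    the harm figures are hypothetical expected casualties.
    """
    def weighted_harm(action):
        _, passenger, third_party = action
        # Egoist weighs own harm heavily; altruist weighs others' harm.
        return knob * passenger + (1 - knob) * third_party
    return min(actions, key=weighted_harm)[0]

# Hypothetical unavoidable-accident scenario: swerving harms the
# passenger, staying on course harms two pedestrians.
scenario = [("swerve", 1.0, 0.0), ("stay", 0.0, 2.0)]
```

With this toy scenario, a full-egoist setting would keep the car on course, while the altruistic and impartial settings would have it swerve, which is the behavior the three settings are meant to capture.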
Cheyenne MacDonald in Daily Mail: “With a so-called ‘ethical knob,’ riders could tune a car’s settings so it operates as ‘full altruist,’ ‘full egoist,’ or ‘impartial’ – allowing it to decide based on the way you value your own life relative to others.”
Beall, meanwhile, quoted Edmond Awad of the MIT Media Lab, who is a researcher on the Moral Machine project: “It is too early to decide whether this would be a good solution,” Awad said, but Beall added that he welcomed a new idea in an otherwise thorny debate. Moral Machine describes itself as a platform for gathering a perspective on moral decisions made by machine intelligence.