Would you trust your car to drive itself? Image Credit: CC BY 2.0 Steve Jurvetson
When faced with an unavoidable accident, whose life should an autonomous vehicle attempt to prioritize?
The idea of sitting back and relaxing while your car drives itself through city streets might seem like something out of a science fiction movie, but advances in driverless technology are pushing the concept ever closer to reality. It may not be long before such a scenario becomes commonplace and manually driven cars become a thing of the past.
One of the biggest dilemmas regarding the use of such vehicles, however, continues to lie in the programming that determines what the car should do in the event of an unavoidable accident.
In a recent study, researchers from the Institute of Cognitive Science at the University of Osnabrück asked volunteers to operate a virtual reality driving simulation covering a variety of traffic scenarios.
"We need to ask whether autonomous systems should adopt moral judgements, if yes, should they imitate moral behavior by imitating human decisions, should they behave along ethical theories and if so, which ones and critically, if things go wrong who or what is at fault ?" said author Gordon Pipa.
The simulation exposed each of the participants to unavoidable accidents involving both people and animals and then analyzed how they reacted to each situation.
A model was then developed based on the results that could be applied to a self-driving vehicle.
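In broad strokes, a model of this kind can be pictured as assigning each potential obstacle a relative value and steering toward the option that minimizes harm. The sketch below is purely illustrative: the categories, scores, and function names are assumptions for the sake of the example, not the researchers' actual model or parameters.

```python
# Hypothetical "value-of-life" style decision rule, loosely inspired by the
# kind of model the study describes. All values here are made up for
# illustration and do not reflect the researchers' findings.

VALUE_OF_LIFE = {
    "empty lane": 0.0,
    "animal": 0.3,
    "human": 1.0,
}

def choose_trajectory(options):
    """Given a mapping of maneuver -> obstacle category that maneuver
    would strike, pick the maneuver whose obstacle carries the lowest
    assigned value (i.e. the least-harm option)."""
    return min(options, key=lambda maneuver: VALUE_OF_LIFE[options[maneuver]])
```

For example, faced with a forced choice between swerving toward an animal or toward a human, the rule above would select the maneuver that strikes the animal. Whether such a ranking should be encoded at all is precisely the dilemma the researchers raise.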
"Now that we know how to implement human ethical decisions into machines we, as a society, are still left with a double dilemma," said study co-author Prof. Peter Konig.
"Firstly, we have to decide whether moral values should be included in guidelines for machine behavior and secondly, if they are, should machines act just like humans ?"
Source: EurekAlert.org