Science & Technology

How ethical should a self-driving car be?

By T.K. Randall
July 9, 2017 · 26 comments

Would you trust your car to drive itself? Image Credit: CC BY 2.0 Steve Jurvetson
When faced with an unavoidable accident, whose life should an autonomous vehicle attempt to prioritize?
The idea of sitting back and relaxing while your car drives itself through the city streets might seem like something out of a science fiction movie. But as advances in driverless car technology push the concept ever closer to reality, it may not be long before such a scenario becomes commonplace and manually driven cars become a thing of the past.

One of the biggest dilemmas surrounding such vehicles, however, continues to lie in the programming that determines what the car should do in the event of an unavoidable accident.

In a recent study, researchers from the Institute of Cognitive Science at the University of Osnabrück asked volunteers to operate a virtual reality driving simulation covering a variety of traffic scenarios.

"We need to ask whether autonomous systems should adopt moral judgements, if yes, should they imitate moral behavior by imitating human decisions, should they behave along ethical theories and if so, which ones and critically, if things go wrong who or what is at fault ?" said author Gordon Pipa.
The simulation exposed each of the participants to unavoidable accidents involving both people and animals and then analyzed how they reacted to each situation.

From the results, the researchers then developed a model that could be applied to a self-driving vehicle.
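To make the idea concrete, here is a minimal sketch of how such a model might score its options, assuming the car rates each candidate trajectory by the weighted harm it risks. The weights, class names and example numbers below are hypothetical illustrations, not the study's actual implementation:

    from dataclasses import dataclass

    # Hypothetical sketch of a "value-of-life" style decision model: every
    # candidate trajectory is scored by the weighted harm it risks, and the
    # car picks the lowest-scoring one. All weights and names here are
    # illustrative assumptions, not the researchers' code.

    VALUE_OF_LIFE = {"human": 1.0, "animal": 0.3, "object": 0.05}

    @dataclass
    class Obstacle:
        category: str              # "human", "animal" or "object"
        impact_probability: float  # chance this trajectory actually hits it

    @dataclass
    class Trajectory:
        name: str
        obstacles: list

    def expected_harm(t: Trajectory) -> float:
        """Total value-of-life weight put at risk along this trajectory."""
        return sum(o.impact_probability * VALUE_OF_LIFE[o.category]
                   for o in t.obstacles)

    def choose_trajectory(candidates: list) -> Trajectory:
        """Pick whichever candidate risks the least weighted harm."""
        return min(candidates, key=expected_harm)

    # Example: swerving towards an animal scores lower than hitting a person.
    stay = Trajectory("stay in lane", [Obstacle("human", 0.9)])
    swerve = Trajectory("swerve", [Obstacle("animal", 0.9)])
    print(choose_trajectory([stay, swerve]).name)  # -> swerve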

"Now that we know how to implement human ethical decisions into machines we, as a society, are still left with a double dilemma," said study co-author Prof. Peter Konig.

"Firstly, we have to decide whether moral values should be included in guidelines for machine behavior and secondly, if they are, should machines act just like humans ?"

Source: EurekAlert.org




Recent comments on this story
#17 Posted by eugeneonegin 7 years ago
But I don't get it. A car driven by a computer still has to obey the laws of physics. If it is driving along and someone jumps in front of it, and the car can't brake in time, or avoid them, it will hit them, same as a human driver would. What's controversial about this?
#18 Posted by Farmer77 7 years ago
The controversy comes in the question of whether the car, assuming it can't brake in time, should hit the pedestrians or turn and drive itself off a cliff in order to avoid them. That's where the ethical question comes in.
#19 Posted by eugeneonegin 7 years ago
Now I get it.
#20 Posted by Farmer77 7 years ago
Considering how obnoxious the spandex-wearing bike crowd can be, I'd be p***ed if my car chose to take me out instead of one of them.
#21 Posted by eugeneonegin 7 years ago
I think the law should allow us to occasionally nudge cyclists off their bikes when they get too frustrating.
#22 Posted by Captain Risky 7 years ago
I'm thinking that self-driving cars should be programmed to react to such a scenario by acting like any normal human-driven car and acting in self-defence. Institute a defensive protocol: a person jumps out in front of you, first apply the brakes, and if it's safe, swerve. The thing is that all cars would have to be universally networked to react the same way to avoid collateral damage.
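As an aside, the "brake first, swerve only if safe" protocol this comment describes can be sketched in a few lines, assuming simplified physics (a fixed braking deceleration) and hypothetical inputs; none of this reflects any real vehicle's control system:

    # Hypothetical sketch of the defensive protocol described above. The
    # function name, inputs and the 8 m/s^2 braking figure are illustrative
    # assumptions, not a real vehicle API.

    def defensive_protocol(speed_mps: float, hazard_distance_m: float,
                           adjacent_lane_clear: bool,
                           max_decel_mps2: float = 8.0) -> str:
        """Decide the response when a person suddenly enters the car's path."""
        stopping_distance = speed_mps ** 2 / (2 * max_decel_mps2)  # v^2 / 2a
        if stopping_distance <= hazard_distance_m:
            return "brake"             # braking alone avoids the collision
        if adjacent_lane_clear:
            return "brake and swerve"  # swerve only when it is safe to do so
        return "brake"                 # no safe escape: maximum braking only

    # Example: 20 m/s (~72 km/h) with a hazard 15 m ahead and a clear lane.
    print(defensive_protocol(20.0, 15.0, True))  # -> brake and swerve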
#23 Posted by Noxasa 7 years ago
It's certainly a dilemma. Since passengers in autonomous vehicles can't be held responsible for the vehicle's programming, should they not assume some higher level of risk by having all vehicle programming be required to prioritize people outside the vehicle over people inside it? I think so. After all, that level of risk is still probably lower than if they were driving the vehicle themselves.
#24 Posted by InconceivableThoughts 7 years ago
I suppose, but that's the double-edged sword here... save millions from car accidents vs a group of cyber terrorists that could cause as much damage as if the system had never even been implemented.
#25 Posted by Hammerclaw 7 years ago
I think Asimov's Three Laws of Robotics apply. A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
#26 Posted by Goodf3llow 7 years ago
If implemented in every vehicle on the road, sure. But that could take decades.

