UM-Bot

How ethical should a self-driving car be?

27 posts in this topic


Stiff

I've often pondered this idea myself. I fear the sales pitch will be that it puts the car occupants first (possibly rightly?). But when it comes to unavoidably hitting something or someone, how would it differentiate between a small child and, say, a dog or another similar-sized creature? Given the choice between hitting a small child at speed or swerving into a brick wall, I'd say hit the wall. But with an animal? Not so much.

DanL

Machines can't make ethical decisions. They ONLY do what they are programmed to do. They should be programmed to avoid accidents and, above all else, to avoid hitting a human figure. The reaction time of a computer would make it safer for a car to follow 3' off the bumper of a lead car than a person would be at many car lengths. This would allow some pretty wild avoidance maneuvers that people could never pull off.

After it dodges a person, it would next avoid striking a solid object, and then a softer one. By stacking the priorities it would make everyone safer. These cars never get distracted, fall asleep, or play with their radio or cell phone, and they react to unexpected things about 10,000 times faster than people.

It isn't about ethics; it's just about priorities, ordered by the likelihood of injury to a person. If you hit a person, they are going to be injured seriously. The people in the car have shelter, and their chances of being injured in an avoidance maneuver are much lower than the pedestrian's. So the car would first avoid people, then try not to hit another car or a tree or whatever, and in a last-case scenario hit the softest target available.
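The "stacked priorities" idea described above can be sketched in a few lines: rank each candidate maneuver by the severity of what it would strike, then pick the least severe option. This is only an illustration of the priority-ordering concept; all names and severity values below are invented, not taken from any real autonomous-driving system.

```python
# Lower score = more acceptable outcome; "person" is avoided above all else.
SEVERITY = {
    "nothing": 0,       # clear path
    "soft_object": 1,   # hedge, snowbank
    "vehicle": 2,
    "solid_object": 3,  # wall, tree
    "person": 4,
}

def choose_maneuver(options):
    """options: dict mapping maneuver name -> what that maneuver would strike.

    Returns the maneuver whose outcome has the lowest severity score.
    """
    return min(options, key=lambda m: SEVERITY[options[m]])

options = {
    "brake_straight": "person",
    "swerve_left": "solid_object",
    "swerve_right": "soft_object",
}
print(choose_maneuver(options))  # picks the swerve into the softest target
```

In this toy version the "ethics" really is just a fixed priority list, which is the point DanL is making.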

RoofGardener

Machines CAN make ethical decisions.

1) No car should harm its passengers, or - through inaction - allow its passengers to come to harm.

2) A car should obey the instructions of its passengers, providing this does not conflict with the First Law. 

3) A car should protect its paintwork, providing this does not conflict with the First or Second laws. 

https://en.wikipedia.org/wiki/Three_Laws_of_Robotics

Beep Beep ! :P

 

kartikg
2 hours ago, Stiff said:

I've often pondered this idea myself. I fear the sales pitch will be that it puts the car occupants first (possibly rightly?). But when it comes to unavoidably hitting something or someone, how would it differentiate between a small child and, say, a dog or another similar-sized creature? Given the choice between hitting a small child at speed or swerving into a brick wall, I'd say hit the wall. But with an animal? Not so much.

Currently, AI can differentiate very well between a dog, a cat, a child and an apple; what I'm unsure of is whether that can be done offline and within the latency limit a car needs to prevent a crash. As for ethics and morals, I don't think we should build them in, because they differ from person to person. The only thing the car should guarantee is that it will prevent the accidents a human would prevent under normal circumstances; in other words, the car should be on par with a human, if not better, and that's something technology can do in a few years. Coming back to morals: one person might crash his car to save a child, while another might run the child over to protect himself, which would be legally correct.

Stiff
5 minutes ago, kartikg said:

Coming back to morals: one person might crash his car to save a child, while another might run the child over to protect himself, which would be legally correct.

That's where the main problem will lie. Everybody has different morals, but who's to say what is morally right and what is morally wrong? It's a very grey area, and one that's not quite as cut and dried as perceived. Tricky one.

kartikg
8 minutes ago, Stiff said:

That's where the main problem will lie. Everybody has different morals, but who's to say what is morally right and what is morally wrong? It's a very grey area, and one that's not quite as cut and dried as perceived. Tricky one.

Yeah, that's why I feel tech should be bound by technical constraints rather than abstract ones like morals.

Gromdor

The customer will be able to buy an "ethics" package that prioritizes according to their own beliefs. Within the bounds of legality, of course. Companies will sell it because it shifts liability from themselves to the car purchaser.

highdesert50

Wondering if elements of learning by imitation are being used in autonomous driving programming. Imitation learning is a method where, for example, a person might observe another person to learn a particular activity. In the case of a machine, explicit and tedious programming is minimized or even eliminated by allowing the machine to observe the activity in a natural way and learn from it implicitly. Self-survival, though, may emerge as a consequence.
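The imitation-learning idea mentioned above can be sketched very crudely: record what a human driver did in each observed state, then have the machine copy the action from the most similar recorded state. This is a toy, dependency-free illustration of the concept (sometimes called behavioural cloning); the states, features and actions below are all invented.

```python
# Logged (state, action) pairs from a hypothetical human driver.
# State = (gap to lead car in metres, pedestrian nearby?)
demonstrations = [
    ((40.0, False), "maintain_speed"),
    ((10.0, False), "brake_gently"),
    ((25.0, True),  "slow_down"),
    ((5.0,  True),  "brake_hard"),
]

def imitate(state):
    """Copy the human action from the nearest demonstrated state."""
    def distance(demo_state):
        gap, pedestrian = demo_state
        # Heavily penalize mismatching the pedestrian flag.
        return abs(gap - state[0]) + (0 if pedestrian == state[1] else 100)
    best_state, best_action = min(demonstrations,
                                  key=lambda d: distance(d[0]))
    return best_action

print(imitate((8.0, True)))  # nearest demo is (5.0, True) -> brake_hard
```

Real systems use learned models rather than nearest-neighbour lookup, but the principle is the same: the behaviour comes from observed examples, not hand-written rules.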

Captain Risky

...I'd imagine a self driving car would be in total control. Aware of its environment. Able to respond. Why would it need to make any ethical choice? 

Farmer77
2 hours ago, Captain Risky said:

...I'd imagine a self driving car would be in total control. Aware of its environment. Able to respond. Why would it need to make any ethical choice? 

Driving down a two-lane road, with a sheer cliff drop on one side and heavy traffic in the oncoming lane, a group of children hops onto the road in front of the self-driving car. What decision does it make?

That's the ethical part being discussed. Kill the driver or the pedestrians?

Captain Risky
42 minutes ago, Farmer77 said:

Driving down a two-lane road, with a sheer cliff drop on one side and heavy traffic in the oncoming lane, a group of children hops onto the road in front of the self-driving car. What decision does it make?

That's the ethical part being discussed. Kill the driver or the pedestrians?

Self-driving cars would have a 360-degree overview and be able to communicate and coordinate with all other vehicles. Unlike a human driver, they would be able to slow down in anticipation of a dangerous stretch of road. In fact, I imagine there would be no fatalities on the road other than those caused by third parties, and with a 360-degree overview even those would be a factor in the self-driving car's planning.

InconceivableThoughts

Wouldn't the smart thing be to just set up a system where all the cars on a certain road or in a certain city are connected? Say one car stops: it sends a signal to all the cars behind it telling them to stop as well. There must be some program to which all the cars are synchronized that can calculate every car's position, stopping them and making them go at the right times to avoid many if not all collisions. Though when it comes to an accident involving a car and a person, the person should be at fault, mainly because they were more than likely jaywalking.
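The stop-signal idea described above — one car stops and broadcasts the stop to everything behind it — can be sketched as a toy simulation. All class and field names here are invented purely for illustration; real vehicle-to-vehicle protocols are far more involved.

```python
class Car:
    def __init__(self, name, position):
        self.name = name
        self.position = position  # metres along the road
        self.stopped = False

class RoadNetwork:
    """A single road segment whose cars share a common channel."""
    def __init__(self, cars):
        self.cars = cars

    def emergency_stop(self, car):
        """Stop `car` and broadcast the stop to every car behind it."""
        car.stopped = True
        for other in self.cars:
            if other.position < car.position:  # behind the stopped car
                other.stopped = True

a, b, c = Car("A", 300), Car("B", 200), Car("C", 100)
net = RoadNetwork([a, b, c])
net.emergency_stop(b)
print([car.name for car in net.cars if car.stopped])  # ['B', 'C']
```

Note that car A, being ahead of the stopped car, keeps going — only the followers receive the stop signal.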

Goodf3llow
10 hours ago, InconceivableThoughts said:

Wouldn't the smart thing be to just set up a system where all the cars on a certain road or in a certain city are connected? Say one car stops: it sends a signal to all the cars behind it telling them to stop as well. There must be some program to which all the cars are synchronized that can calculate every car's position, stopping them and making them go at the right times to avoid many if not all collisions. Though when it comes to an accident involving a car and a person, the person should be at fault, mainly because they were more than likely jaywalking.

Chaos if this type of system were to be hacked.

 

Darkenpath25

I don't think I would trust it. It's scary enough getting in the car with my mom and girlfriend.

eugenonegin
17 hours ago, Farmer77 said:

Driving down a two-lane road, with a sheer cliff drop on one side and heavy traffic in the oncoming lane, a group of children hops onto the road in front of the self-driving car. What decision does it make?

That's the ethical part being discussed. Kill the driver or the pedestrians?

Why would a group of children hop in front of a car, whether it was driven by a computer, a driver, or a lunatic?

They would get killed.

Is someone proposing driverless cars should be able to defy the laws of gravity and take off into the air?

Farmer77
24 minutes ago, eugeneonegin said:

Why would a group of children hop in front of a car, whether it was driven by a computer, a driver, or a lunatic?

They would get killed.

Is someone proposing driverless cars should be able to defy the laws of gravity and take off into the air?

Um because kids are dumb. 

I was just relaying the controversy as I understood it. 

 

eugenonegin
7 hours ago, Farmer77 said:

Um because kids are dumb. 

I was just relaying the controversy as I understood it. 

 

But I don't get it.

A car driven by a computer still has to obey the laws of physics.

If it is driving along and someone jumps in front of it, and the car can't brake in time, or avoid them, it will hit them, same as a human driver would. What's controversial about this?

Farmer77
Just now, eugeneonegin said:

But I don't get it.

A car driven by a computer still has to obey the laws of physics.

If it is driving along and someone jumps in front of it, and the car can't brake in time, or avoid them, it will hit them, same as a human driver would. What's controversial about this?

The controversy comes in the question of whether the car, assuming it can't brake in time, should hit the pedestrians or turn and drive itself off the cliff in order to avoid them.

That's where the ethical question comes in.

eugenonegin
29 minutes ago, Farmer77 said:

The controversy comes in the question of whether the car, assuming it can't brake in time, should hit the pedestrians or turn and drive itself off the cliff in order to avoid them.

That's where the ethical question comes in.

Now I get it.

Farmer77
Just now, eugeneonegin said:

Now I get it.

Considering how obnoxious the spandex-wearing bike crowd can be, I'd be p***ed if my car chose to take me out instead of one of them :lol:

eugenonegin
4 hours ago, Farmer77 said:

Considering how obnoxious the spandex-wearing bike crowd can be, I'd be p***ed if my car chose to take me out instead of one of them :lol:

I think the law should allow for us to occasionally nudge cyclists off their bikes when they get too frustrating.

Captain Risky
18 hours ago, Farmer77 said:

The controversy comes in the question of whether the car, assuming it can't brake in time, should hit the pedestrians or turn and drive itself off the cliff in order to avoid them.

That's where the ethical question comes in.

I'm thinking that self-driving cars should be programmed to react to such a scenario like any normal human-driven car and act in self-defence: institute a defensive protocol. A person jumps out in front of you: first apply the brakes, and then swerve if it's safe. The thing is that all cars would have to be universally networked to react the same way and avoid collateral damage.
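The "defensive protocol" described above — brake first, swerve only when safe — amounts to a simple decision rule. The sketch below is an illustration only; the function, its parameters and its thresholds are invented, not drawn from any real vehicle software.

```python
def defensive_protocol(stopping_distance_m, obstacle_distance_m,
                       adjacent_lane_clear):
    """Return the action a defensively-programmed car might take.

    stopping_distance_m: distance the car needs to come to a full stop.
    obstacle_distance_m: distance to the person/obstacle ahead.
    adjacent_lane_clear: whether swerving would endanger anyone else.
    """
    if stopping_distance_m <= obstacle_distance_m:
        return "brake"             # braking alone avoids the obstacle
    if adjacent_lane_clear:
        return "brake_and_swerve"  # swerve only when it is safe to do so
    return "brake"                 # otherwise just brake as hard as possible

print(defensive_protocol(30.0, 40.0, adjacent_lane_clear=False))  # brake
print(defensive_protocol(30.0, 20.0, adjacent_lane_clear=True))   # brake_and_swerve
```

The last branch is where Captain Risky's "self-defence" stance shows up: when neither braking nor swerving is safe, the car still just brakes rather than sacrificing its occupants.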

Noxasa

It's certainly a dilemma. Since passengers in autonomous vehicles can't be held responsible for the vehicle's programming, should they not assume some higher level of risk by having all vehicle programming be required to prioritize people outside the vehicle over people inside it? I think so. After all, that level of risk is still probably lower than if they were driving the vehicle themselves.

InconceivableThoughts
On Monday, July 10, 2017 at 2:20 PM, Goodf3llow said:

Chaos if this type of system were to be hacked.

 

I suppose, but that's the double-edged sword here: saving millions from car accidents vs. a group of cyber terrorists who could cause as much damage as if the system had never been implemented.

