Still Waters · Posted January 25 · #1

The classic thought experiment known as the "trolley problem" asks: Should you pull a lever to divert a runaway trolley so that it kills one person rather than five? Alternatively: What if you'd have to push someone onto the tracks to stop the trolley? What is the moral choice in each of these instances? For decades, philosophers have debated whether we should prefer the utilitarian solution (what's better for society; i.e., fewer deaths) or a solution that values individual rights (such as the right not to be intentionally put in harm's way). In recent years, automated vehicle designers have also pondered how AVs facing unexpected driving situations might solve similar dilemmas. For example: What should the AV do if a bicycle suddenly enters its lane? Should it swerve into oncoming traffic or hit the bicycle?

https://techxplore.com/news/2023-01-ethical-self-driving-cars.html
+Desertrat56 · Posted January 25 · #2

2 minutes ago, Still Waters said: "Should it swerve into oncoming traffic or hit the bicycle?"

What happened to hitting the brakes?
+OverSword · Posted January 25 · #3

This dilemma comes up in I, Robot (the Will Smith film). Two cars crash, and the robot can save only one person: an adult or a child. The robot chose the one most likely to survive (the adult), even though the film frames it as a given that a human would have tried to save the child, and that this would have been the right choice.
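For anyone curious how AV designers might turn the article's two framings (minimize total harm vs. respect individual rights) into something a planner could actually compute, here is a minimal, purely illustrative Python sketch. Every maneuver, harm estimate, and flag in it is invented for the example; it is not taken from the linked article or any real AV software.

```python
# Toy illustration only: encoding the utilitarian vs. rights-based framings
# from the article as two different selection rules over candidate maneuvers.
# All names and numbers below are made up; real AV planners are far more complex.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_harm: float   # rough expected number of people seriously hurt (invented)
    redirects_harm: bool   # does it deliberately shift harm onto an uninvolved party?

CANDIDATES = [
    Maneuver("brake hard in lane", expected_harm=0.2, redirects_harm=False),
    Maneuver("swerve into oncoming traffic", expected_harm=0.9, redirects_harm=True),
    Maneuver("continue and hit the bicycle", expected_harm=0.8, redirects_harm=False),
]

def utilitarian_choice(options):
    # Pure harm minimization: pick whatever is expected to hurt the fewest people.
    return min(options, key=lambda m: m.expected_harm)

def rights_respecting_choice(options):
    # Rights-style constraint: first exclude maneuvers that deliberately put an
    # uninvolved party in harm's way, then minimize harm among what remains.
    allowed = [m for m in options if not m.redirects_harm] or options
    return min(allowed, key=lambda m: m.expected_harm)

if __name__ == "__main__":
    print("Utilitarian pick:      ", utilitarian_choice(CANDIDATES).name)
    print("Rights-respecting pick:", rights_respecting_choice(CANDIDATES).name)
```

With these invented numbers, both rules end up braking hard, which is essentially Desertrat56's point: the dilemma only bites in the rare cases where braking alone cannot avoid harm.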