
Prominent experts call for ban on AI weapons


Posted on Tuesday, 28 July, 2015 | 29 comments

Schwarzenegger's T-800 model from the Terminator movie franchise. Image Credit: CC BY 2.0 Stephen Bowler
More than 1,000 top technical experts have called for a full ban on autonomous weapon systems.
Stephen Hawking, Elon Musk and hundreds of other scientists, engineers and researchers have signed an open letter this week calling for a complete ban on AI weapons and killer robots before their development sparks a new arms race across the globe.

Entitled "Autonomous Weapons: an Open Letter from AI & Robotics Researchers", the document details the opinion held by many that autonomous weapon systems pose an existential threat that could ultimately lead to humanity being wiped off the face of the Earth.

"If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow," the letter reads.

"Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce."

The scenario of a world devastated by intelligent robots is already quite familiar, having been played out multiple times in science fiction movies such as The Terminator and The Matrix.

"In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so," the letter concludes.

"Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."

The full text of the letter can be viewed here.

Source: Popular Mechanics

Tags: Robot, Artificial Intelligence

Recent comments on this story
#20 Posted by tyrant lizard on 29 July, 2015, 19:16
The Three Laws, quoted as being from the "Handbook of Robotics, 56th Edition, 2058 A.D.", are: (1) A robot may not injure a human being or, through inaction, allow a human being to come to harm. (2) A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. (3) A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. Simples.
#21 Posted by XenoFish on 29 July, 2015, 22:17
I've always wondered why no one in all those robot apocalypse movies ever used ammo made from magnets. I mean, think about it: you shoot one of these things in the "head" with a neodymium magnet and its operating system should screw up.
#22 Posted by FlyingAngel on 30 July, 2015, 18:10
[Quoting #20's Three Laws post] Not that simple. They could cut down trees, pollute clean water, destroy the environment, which indirectly affects humans while not directly injuring humankind. They could s... [More]
#23 Posted by TheGreatBeliever on 30 July, 2015, 18:37
I find it hard to believe computers could have a mind of their own. Theories of the replicator seem quite plausible, but I don't see any for AI. Not much explanation of how it'll come about.
#24 Posted by TheGreatBeliever on 30 July, 2015, 18:38
....
#25 Posted by Beefers on 30 July, 2015, 18:56
If I am not mistaken, the only weapon ever actually banned in history was the crossbow... You can see how well that worked. Nuclear weapons could be considered "banned" for a lot of countries as well, and so could chemical weapons. That doesn't stop people from using them either (including other violations of international law). So yeah, like you said, it didn't and doesn't work.
#26 Posted by Anomalocaris on 2 August, 2015, 0:21
#27 Posted by tyrant lizard on 5 August, 2015, 16:48
[Quoting #22] "Not that simple. They could cut down trees, pollute clean water, destroy the environment, which indirectly affects humans while not directly injuring humankind. They could secretly or indirectly make our lives miserable while not conflicting with any laws. There are flaws in those quotes." Forget all the laws ****. It all starts with a mad scientist who could create a super AI that disobeys any laws. A robot must obey orders given to it by a human...
#28 Posted by FlyingAngel on 5 August, 2015, 18:47
[Quoting #27] "A robot must obey orders given to it by a human..." That's if the robot is programmed correctly. Big "if". There's always a secret society that doesn't obey orders, thus making a mutated robot.
#29 Posted by DieChecker on 5 August, 2015, 19:32
[Quoting #27] "A robot must obey orders given to it by a human..." Unless it is autonomous, in which case it creates its own orders. You can try to persuade it, but unless its software requires it, it doesn't have to obey.
Unexplained-Mysteries.com (c) 2001-2019