Unexplained Mysteries

Science & Technology

Prominent experts call for ban on AI weapons

By T.K. Randall
July 28, 2015 · 29 comments

Schwarzenegger's T-800 model from the Terminator movie franchise. Image Credit: CC BY 2.0 Stephen Bowler
More than 1,000 top technical experts have called for a full ban on autonomous weapon systems.
Stephen Hawking, Elon Musk and hundreds of other scientists, engineers and researchers have written an open letter this week designed to encourage a complete ban on AI weapons and killer robots before their development sparks a new arms race across the globe.

Entitled "Autonomous Weapons: an Open Letter from AI & Robotics Researchers", the document details the opinion held by many that autonomous weapon systems pose an existential threat that could ultimately lead to humanity being wiped off the face of the Earth.

"If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow," the letter reads.

"Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce."
The scenario of a world decimated by intelligent robots is already quite familiar, having been played out many times in science fiction films such as The Terminator and The Matrix.

"In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so," the letter concludes.

"Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control."

The full text of the letter can be viewed here.

Source: Popular Mechanics | Comments (29)




Recent comments on this story
#20 Posted by tyrant lizard 9 years ago
The Three Laws, quoted as being from the "Handbook of Robotics, 56th Edition, 2058 A.D.", are: A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. Simples
#21 Posted by XenoFish 9 years ago
I've always wondered why no one in all those robot apocalypse movies ever used ammo made from magnets. I mean, think about it: you shoot one of these things in the "head" with a neodymium magnet and its operating system should screw up.
#22 Posted by FlyingAngel 9 years ago
The Three Laws, quoted as being from the "Handbook of Robotics, 56th Edition, 2058 A.D.", are: A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. Simples Not that simple. They could cut down trees, pollute clean water, and destroy the environment, which indirectly affects humans without directly injuring humankind. They could s... [More]
#23 Posted by TheGreatBeliever 9 years ago
I find it hard to believe computers could have a mind of their own. Theories of the replicator seem quite plausible, but I don't see any for AI. Not much explanation of how it'll come about...
#24 Posted by TheGreatBeliever 9 years ago
....
#25 Posted by Beefers 9 years ago
If I am not mistaken, the only weapon ever actually banned - in history - was the crossbow... You see how well that worked... Nuclear weapons could be considered "banned" for a lot of countries as well, and chemical weapons. That doesn't stop people from using them either (including other violations of international law). So yeah, like you said, it didn't and doesn't work.
#26 Posted by Anomalocaris 9 years ago
#27 Posted by tyrant lizard 9 years ago
Not that simple. They could cut down trees, pollute clean water, and destroy the environment, which indirectly affects humans without directly injuring humankind. They could secretly or indirectly make our lives miserable while not conflicting with any of the laws. There are flaws in those quotes. Forget all the laws ****. It all starts with a mad scientist who could create a super AI that disobeys all laws. A robot must obey orders given to it by a human...
#28 Posted by FlyingAngel 9 years ago
A robot must obey orders given to it by a human... That's if the robot is programmed correctly. Big "IF". There's always a secret society that doesn't obey orders, thus making a mutated robot.
#29 Posted by DieChecker 9 years ago
A robot must obey orders given to it by a human... Unless it is autonomous, in which case it creates its own orders. You can try to persuade it, but unless its software requires it, it doesn't have to obey.

