UM-Bot

New UN-hosted talks set to tackle killer robots

19 posts in this topic


 
freetoroam
Quote

however it is also clear that robots capable of deciding when to pull the trigger are a very bad idea indeed.

Yep, cos at the moment humans are doing this and it is equally a very bad idea. 

If they cannot control the humans, they will not stop them from building these robots. Someone will.

aztek

I have more faith in robots than people.

Orphalesion

People, we are safe from our army of killer robots malfunctioning and destroying us all... because we have our army of killer robots to protect us!

Chaldon

Somehow I don't feel better when people kill people. Destruction is destruction; it matters not who or what does it. The more I live, the more I feel that the way of making world peace as described by Klaatu is the only solution for a species as barbaric as ours, and for that we certainly need thinking machines.

The Caspian Hare

The difference is that human motives for violence are understood and war can be mitigated (if not eradicated) through systems already in place such as law, diplomacy, or military deterrence. With a human opponent you can control aggression by appealing to morality, appealing to law, offering economic incentives or finally by being so strong that any attack is too costly.

Machines will have motives incomprehensible to us and may be impossible to persuade or predict.

pallidin

The posters here have some excellent comments... just thought I'd recognize that.

Carry on...

Guyver

Murphy's law: anything that can go wrong will go wrong. That's why killer robots should be outlawed... and we should not develop A.I.

See the movie "Ex Machina" for a nice update on the potential these machines will have in no time if progress continues.

Seti42

The only way to stop a bad guy with killer robots is a good guy with killer robots!
Seriously, though... as long as autonomous killer robots cost more than actual human soldiers, it won't be an issue.

Wickian

For some reason I can easily picture robots forcing us into a caste society if they ever took over the world to "protect" us.

pallidin

If I'm gathering correctly what both the article and you people are saying, autonomous robotics are potentially dangerous as AI technology advances.

Ok, I'm a techie sort-of guy, so I can buy that.

And, as pointed out by you guys, it seems doubtful, as a matter of common sense, that any "world agreement" could EVER stop continued development of advanced AI robotics, in either the public or private sector.

So, it appears we have a problem.

Huh. Tough one.

UFOwatcher

Not sure if I would be more concerned about a killer robot programmed only to attack an enemy vs. an indiscriminate landmine that kills "anyone" who messes with it.

paperdyer
22 hours ago, Chaldon said:

Somehow I don't feel better when people kill people. Destruction is destruction; it matters not who or what does it. The more I live, the more I feel that the way of making world peace as described by Klaatu is the only solution for a species as barbaric as ours, and for that we certainly need thinking machines.

It's never a good thing to kill a person, even when you have to. However, I think the human race is the only species on the planet that doesn't get rid of its undesirable members. We try to rehabilitate where we can. This sets us apart from the other critters of the Earth. Sometimes rehabilitation works. And when it doesn't, we lock up the offenders after three offenses and, in a sense, throw away the key, and have the taxpayers pay the bill. More humane than killing, maybe, but less practical, in my opinion.

Richie256

Hi all, first post here.

I was wondering whether Isaac Asimov's Three Laws of Robotics could truly be implemented in real-life robots (and never be bypassed by the AI), or is it just pure science fiction?

Quote

A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
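For what it's worth, the Laws themselves are easy enough to write down as code; the hard part is the judgement they depend on. Here is a minimal sketch (hypothetical toy code, not any real robotics framework), assuming we already had a reliable `harms_human` oracle, which is exactly what nobody knows how to build:

```python
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_human: bool       # in reality, deciding this reliably is the unsolved problem
    ordered_by_human: bool
    endangers_robot: bool

def permitted(action: Action) -> bool:
    """Naive Three Laws check (ignores the 'through inaction' clause for brevity)."""
    # First Law: a robot may not injure a human being.
    if action.harms_human:
        return False
    # Second Law: obey human orders (any order reaching this point does not conflict with the First Law).
    if action.ordered_by_human:
        return True
    # Third Law: otherwise, protect its own existence.
    return not action.endangers_robot

if __name__ == "__main__":
    print(permitted(Action("open the door", harms_human=False,
                           ordered_by_human=True, endangers_robot=False)))   # True
    print(permitted(Action("fire on the target", harms_human=True,
                           ordered_by_human=True, endangers_robot=False)))   # False
```

Whether an AI could ever be prevented from bypassing such a check, or from mislabelling the inputs fed into it, is precisely the open question.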

 

pallidin

Hi, Richie!!!!

Welcome, and thanks for giving your thoughts.

Chaldon
19 hours ago, paperdyer said:

It's never a good thing to kill a person, even when you have to. However, I think the human race is the only species on the planet that doesn't get rid of its undesirable members. We try to rehabilitate where we can. This sets us apart from the other critters of the Earth. Sometimes rehabilitation works. And when it doesn't, we lock up the offenders after three offenses and, in a sense, throw away the key, and have the taxpayers pay the bill. More humane than killing, maybe, but less practical, in my opinion.

How about wars? They are unavoidable, ain't they? Everyone says they must be, and there is no other choice. Many of us feel disgusted by military actions, and at the same time we honour warriors as saviours. So sometimes we must kill? And sometimes a lot of people, without even knowing who they are or whether they are guilty of anything. I am talking about weapons of mass destruction, which we proudly demonstrate in military parades and military exercises. No, in this respect we are even worse than other beasts - those at least do not dedicate their lives to killing and are not honoured for it.

Sorry, I shouldn't have started this discussion. I know that no one feels better thinking about these things. All we can hope for is that some day... in a very, very, very distant future... maybe... somehow... there will be no wars.

TripGun

Nothing more autonomous than an ICBM if you ask me.

paperdyer
On 8/29/2018 at 10:58 AM, Chaldon said:

How about wars? They are unavoidable, ain't they? Everyone says they must be, and there is no other choice. Many of us feel disgusted by military actions, and at the same time we honour warriors as saviours. So sometimes we must kill? And sometimes a lot of people, without even knowing who they are or whether they are guilty of anything. I am talking about weapons of mass destruction, which we proudly demonstrate in military parades and military exercises. No, in this respect we are even worse than other beasts - those at least do not dedicate their lives to killing and are not honoured for it.

Sorry, I shouldn't have started this discussion. I know that no one feels better thinking about these things. All we can hope for is that some day... in a very, very, very distant future... maybe... somehow... there will be no wars.

Wars can be avoided, but human nature has to change first. As long as diametrically opposed ideologies exist, the threat of war will exist.

danydandan

I think we need to concentrate on "narrow" AI; "general" AI is OK too. The really bad potential is with machine learning, or learning machines that can rewrite their own programming. AIs in general are confined to their programming and can't escape or alter their own code. They are thus only a simulation of human intelligence and don't have 'free will'. A learning machine, however, in theory has 'free will'.

I know they all really fall under the umbrella of AI, but in reality learning machines would be like the kill-all-humans machines in the movies; general or narrow AIs don't have the ability to behave like that unless they are specifically programmed to.
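To make the distinction concrete, here is a toy sketch in Python (purely illustrative, not any real system): a fixed-rule agent only ever does what its programmer wrote, while a learning agent's behaviour comes to depend on its own experience, so its responses are no longer fully spelled out in advance.

```python
import random

class FixedRuleAgent:
    """Confined to its programming: nothing it experiences changes its behaviour."""
    def act(self, observation: str) -> str:
        return "retreat" if observation == "threat" else "patrol"

class LearningAgent:
    """A minimal bandit-style learner whose behaviour is shaped by rewards."""
    def __init__(self, actions=("retreat", "patrol", "advance")):
        self.values = {a: 0.0 for a in actions}   # learned preference for each action

    def act(self, observation: str) -> str:
        if random.random() < 0.1:                      # occasionally explore
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)   # otherwise exploit what it has learned

    def learn(self, action: str, reward: float) -> None:
        # Future behaviour now depends on the agent's own history,
        # not only on what was originally written into it.
        self.values[action] += 0.1 * (reward - self.values[action])

if __name__ == "__main__":
    fixed, learner = FixedRuleAgent(), LearningAgent()
    for _ in range(200):
        a = learner.act("threat")
        learner.learn(a, reward=1.0 if a == "retreat" else 0.0)
    print(fixed.act("threat"))    # always "retreat", by construction
    print(learner.act("threat"))  # usually "retreat", because that is what got rewarded
```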

