UM-Bot Posted August 27, 2018 #1
Experts from around the world have gathered in Geneva this week to find a solution to this ever-growing threat.
https://www.unexplained-mysteries.com/news/320976/new-un-hosted-talks-set-to-tackle-killer-robots
freetoroam Posted August 27, 2018 #2
Quote: "however it is also clear that robots capable of deciding when to pull the trigger are a very bad idea indeed."
Yep, cos at the moment humans are doing this and it is equally a very bad idea. If they cannot control the humans, they will not stop them building these robots. Someone will.
aztek Posted August 27, 2018 #3
I have more faith in robots than in people.
Orphalesion Posted August 27, 2018 #4
People, we are safe from our army of killer robots malfunctioning and destroying us all... because we have our army of killer robots to protect us!
Chaldon Posted August 27, 2018 #5
Somehow I don't feel better when people kill people. Destruction is destruction; it matters not who or what does it. The more I live, the more I feel that the way of making world peace described by Klaatu is the only solution for a species as barbaric as ours, and for that we certainly need thinking machines.
The Caspian Hare Posted August 27, 2018 #6
The difference is that human motives for violence are understood, and war can be mitigated (if not eradicated) through systems already in place such as law, diplomacy, or military deterrence. With a human opponent you can control aggression by appealing to morality, appealing to law, offering economic incentives, or finally by being so strong that any attack is too costly. Machines will have motives incomprehensible to us and may be impossible to persuade or predict.
pallidin Posted August 27, 2018 #7
The posters here have some excellent comments... just thought I'd recognize that. Carry on...
Guyver Posted August 27, 2018 #8
Murphy's law: anything that can go wrong will go wrong. That's why killer robots should be outlawed... and we should not develop A.I. See the movie "Ex Machina" for a nice update on the potential these machines will have in no time if progress continues.
Seti42 Posted August 28, 2018 #9
The only way to stop a bad guy with killer robots is a good guy with killer robots! Seriously, though... as long as autonomous killer robots cost more than actual human soldiers, it won't be an issue.
Wickian Posted August 28, 2018 #10
For some reason I can easily picture robots forcing us into a caste society if they ever took over the world to "protect" us.
pallidin Posted August 28, 2018 #11
If I'm gathering correctly what both the article and you people are saying, autonomous robots are potentially dangerous as AI technology advances. OK, I'm a techie sort of guy, so I can buy that. And, as pointed out by you guys, it seems common-sense doubtful that any "world agreement" could EVER stop continued development of advanced AI robotics, in either the public or private sector. So, it appears we have a problem. Huh. Tough one.
UFOwatcher Posted August 28, 2018 #12
Not sure if I would be more concerned about a killer robot programmed only to attack an enemy vs. an indiscriminate landmine that kills anyone who messes with it.
paperdyer Posted August 28, 2018 #13
22 hours ago, Chaldon said: "Somehow I don't feel better when people kill people. Destruction is destruction; it matters not who or what does it. The more I live, the more I feel that the way of making world peace described by Klaatu is the only solution for a species as barbaric as ours, and for that we certainly need thinking machines."
It's never a good thing to kill a person, even when you have to. However, I think the human race is the only species on the planet that doesn't get rid of its undesirable members. We try to rehabilitate where we can. This sets us apart from the other critters of the Earth. Sometimes rehabilitation works. And when it doesn't, we lock up the offenders after three offenses, throw away the key in a sense, and have the taxpayers pay the bill. More humane than killing, maybe, but less practical, in my opinion.
Richie256 Posted August 28, 2018 #14
Hi all, first post here. I was wondering if Isaac Asimov's Three Laws of Robotics can truly be implemented in real-life robots (and never be bypassed by the AI), or is it just pure science fiction?
Quote:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
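The priority ordering of the Three Laws is easy enough to sketch in code; the hard part, and the reason they remain science fiction for now, is the predicate itself, since no current software can reliably judge whether an action "harms" a human. A minimal, purely illustrative Python sketch (all function and field names are hypothetical):

```python
# Hypothetical sketch: Asimov's Three Laws as priority-ordered vetoes.
# The ordering is trivial; the harm predicate is the unsolved problem.

def would_harm_human(action):
    # Placeholder only: real harm prediction is an open AI problem.
    # Note the "through inaction" clause of the First Law is not even
    # modeled here, which is another reason a real implementation is hard.
    return action.get("harms_human", False)

def permitted(action, human_orders):
    # First Law: never injure a human.
    if would_harm_human(action):
        return False
    # Second Law: obey human orders. An order to perform a harmful
    # action is already vetoed by the First-Law check above.
    for order in human_orders:
        if order.get("forbids") == action.get("name"):
            return False
    # Third Law: self-preservation is checked last, so it can never
    # override the first two.
    return True

print(permitted({"name": "open_door"}, []))                         # True
print(permitted({"name": "fire_weapon", "harms_human": True}, []))  # False
```

The veto chain works only as well as `would_harm_human` does, which is exactly Richie256's question about whether the Laws could ever be bypassed.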
pallidin Posted August 29, 2018 #15
Hi, Richie!!!! Welcome, and thanks for giving your thoughts.
Chaldon Posted August 29, 2018 #16
19 hours ago, paperdyer said: "It's never a good thing to kill a person, even when you have to. However, I think the human race is the only species on the planet that doesn't get rid of its undesirable members. We try to rehabilitate where we can. This sets us apart from the other critters of the Earth. Sometimes rehabilitation works. And when it doesn't, we lock up the offenders after three offenses, throw away the key in a sense, and have the taxpayers pay the bill. More humane than killing, maybe, but less practical, in my opinion."
How about wars? They are unavoidable, aren't they? Everyone says they must be, and that there is no other choice. Many of us feel disgusted by military actions, and at the same time we honour warriors as saviours. So sometimes we must kill? And sometimes a lot of people, without even knowing who they are or whether they are guilty of anything. I am talking about weapons of mass destruction, which we proudly demonstrate in military parades and exercises. No, in this respect we are even worse than other beasts: those at least do not dedicate their lives to killing and are not honoured for it. Sorry, I shouldn't have started this discussion. I know that no one feels better thinking of these things. All we can hope for is that some day... in a very, very, very distant future... maybe... somehow... there will be no wars.
TripGun Posted August 30, 2018 #17
Nothing more autonomous than an ICBM if you ask me.
paperdyer Posted August 30, 2018 #18
On 8/29/2018 at 10:58 AM, Chaldon said: "How about wars? They are unavoidable, aren't they? Everyone says they must be, and that there is no other choice. Many of us feel disgusted by military actions, and at the same time we honour warriors as saviours. So sometimes we must kill? And sometimes a lot of people, without even knowing who they are or whether they are guilty of anything. I am talking about weapons of mass destruction, which we proudly demonstrate in military parades and exercises. No, in this respect we are even worse than other beasts: those at least do not dedicate their lives to killing and are not honoured for it. Sorry, I shouldn't have started this discussion. I know that no one feels better thinking of these things. All we can hope for is that some day... in a very, very, very distant future... maybe... somehow... there will be no wars."
Wars can be avoided, but human nature has to change first. As long as diametrically opposed ideologies exist, the threat of war will exist.
danydandan Posted August 30, 2018 #19
I think we need to concentrate on "narrow" AI; "general" AI is OK too. The real danger lies with learning machines that can rewrite their own programming. AI in general is confined to its programming and can't escape or alter its own coding; it is only a simulation of human intelligence and doesn't have "free will". A learning machine, however, in theory has "free will". I know they all fall under the umbrella of AI, but in reality learning machines would be like the kill-all-humans machines in the movies; general or narrow AIs don't have the ability to behave like that unless they are specifically programmed to.
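The distinction danydandan draws — a program confined to rules fixed at write time versus a learning machine whose behaviour-determining mapping is mutable at run time — can be shown with a toy Python sketch (class and rule names are hypothetical, and real learning systems update from feedback rather than an explicit call):

```python
# Toy illustration of fixed versus self-modifying behaviour.

class FixedAgent:
    """Confined to its programming: the rule table is frozen at write time."""
    def act(self, situation):
        rules = {"threat": "retreat", "clear": "advance"}
        return rules.get(situation, "wait")

class LearningAgent:
    """The rule table itself is mutable, so future behaviour cannot be
    fully enumerated by inspecting the source code alone."""
    def __init__(self):
        self.rules = {"threat": "retreat", "clear": "advance"}
    def act(self, situation):
        return self.rules.get(situation, "wait")
    def update(self, situation, new_action):
        # The crucial difference: the situation-to-action mapping changes
        # during operation, not at design time.
        self.rules[situation] = new_action

fixed, learner = FixedAgent(), LearningAgent()
print(fixed.act("threat"))    # always "retreat", no matter what happens
learner.update("threat", "attack")
print(learner.act("threat"))  # now "attack" -- the policy was rewritten
```

A designer can verify every possible behaviour of `FixedAgent` by reading it; for `LearningAgent`, the guarantee depends on what `update` is ever fed, which is the gap danydandan is pointing at.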