Saru

Should 'killer robots' be banned?

47 posts in this topic

They should be killed. Preferably by other robots.

Then those robots should be killed.

...by Zombie Sharks. *I just got an idea for a screenplay!*


Obama is already using robots to kill people abroad. The difference between a bleary-eyed operator sitting at a desk and pushing the button that makes a killer drone fire a missile at what looks like a tent in Waziristan, and programming the drone to do that by itself, is only a small one. Plus, an autonomous drone cannot be hijacked by, say, Iran interfering with the transmission link.

So yes, this will come. Say hello to the Terminator.


They should not be banned; they save pilots' lives by keeping them out of hot zones.


@Likely Guy...

The Sci-Fi Channel will be all over that.

Zombie Sharks vs. Killer Robots, starring Debbie Gibson


No one has mentioned I, Robot...

- robots shouldn't be given intelligence

- robots shouldn't have the power to destroy and kill

And stop all kinds of war, plain and simple.


If a man thinks that someone should die, then he should be fully prepared to carry out the task himself. Cowards dwelling in the art of war... no good can ever come of that.


Another human rights organization that has no idea what it's like to be on the front lines of anything. I will agree to banning autonomous weapons when they successfully ban IEDs.

Edited by atomk12


Facial recognition and weapon-system recognition work wonders on the battlefield, helping the allies determine who is a threat or who is a high-value target. These systems allow for quick identification and higher performance by special-ops squads, who are deployed long before the conventional army. The systems are sparsely used because they are very sensitive equipment that SEAL teams don't want falling into other countries' hands. Sometimes this type of equipment is fitted to cruise missiles to help them acquire their targets, allowing for much better accuracy and fewer casualties while our country is at war. This technology has greatly decreased the casualty rate in any current theater of war compared to our past conflicts. Being able to land in the battlefield or field of operations and immediately identify targets or threats is a great improvement; this technology saves lives.

As for drones, they too increase accuracy and decrease the casualty rate in the theater of war we are currently in. Of course, the decision to fire rests solely in the hands of a human, which means someone will be held responsible for mistakes or outright murder. In the worst-case scenario, enemies of our nation take control of the drones, which would be an extraordinary feat by today's standards because of the technology gap between the two countries. You can trick the drones, but you won't have full control of the drones being deployed by the United States of America and her allies in the current theater of war.

I don't see the army, air force, or navy heading in the direction of a self-controlled drone with the ability to choose and terminate any targets it sees fit. What I do see is the ability to take human pilots out of the equation and kill off targets at our own pace. Being able to do something at your own pace has its advantages: you can confirm and double-check the target before firing.

Even then, due to human error, we still make mistakes and kill civilians.

Edited by Uncle Sam


DARPA and Boston Dynamics have already made a few military robot prototypes. Just need to add some weapons to them and they'll be 'Killer Robots'.

http://youtu.be/mdYSStF4fqc

I think they are pretty amazing actually.


That is designed to be a pack mule...


Obviously.

I'm just saying it hypothetically. Strap a weapon on them and they could be considered a 'killer robot'.


I'm guessing this means I won't be getting a killer robot anytime soon?


I'm voting YES on robot soldiers, but NO on higher autonomous functions. A human needs to be in the loop to say, "Yes, that target needs to be hit."

Personally, I think it will eventually get much worse... maybe with robot-soldier operations outsourced to India, Brazil, or some such place. "Blam! Blam! Blam! .... Thank you, come again." Seriously though, the Romans eventually had to outsource their militaries, and eventually the US will too. Then we're totally screwed.

I saw an article the other day saying that the average... average... PC or console gamer is theoretically better at remote surgery than an actual, experienced surgeon, simply due to hand-eye training and familiarity with the medium. So I don't fear the US running out of RC-soldier operators anytime soon, but I do think that eventually... financially... it could happen.


I'm voting YES on robot soldiers, but NO on higher autonomous functions. A human needs to be in the loop to say, "Yes, that target needs to be hit."

Like ordering it to shoot all targets wearing black-and-green soldier uniforms? Oh wait, there are babies and women who wear black and green clothes too.

A malfunction can still be blamed, as when it is ordered to shoot target A but shoots B instead.


You must not be talking to me. I was clear in saying that HUMANS should be responsible for the trigger pull. If babies are being killed, then it is a soldier somewhere showing bad discretion, not a computer program.

This is why our boys in Afghanistan are always getting shot at, with few shots fired back. They can't shoot back unless they can see an enemy target. Only the more anti-gun people still believe our soldiers and government agencies just shoot at everything that moves.


Daleks, Cylons, anyone? We already have problems with trigger-happy video-game soldiers who think they are playing war games, killing innocent civilians around the world. This would only make the problem worse than it is. This program must be stopped while there is still time, for the sake of our descendants' safety.


This topic is substantially arguable.

Perhaps it's because of our lack of a complete understanding of cyberspace.


Daleks, Cylons, anyone? We already have problems with trigger-happy video-game soldiers who think they are playing war games, killing innocent civilians around the world. This would only make the problem worse than it is. This program must be stopped while there is still time, for the sake of our descendants' safety.

Daleks are actually octopus-like critters inside a trashcan-shaped suit of power armor.


You must not be talking to me. I was clear in saying that HUMANS should be responsible for the trigger pull. If babies are being killed, then it is a soldier somewhere showing bad discretion, not a computer program.

Good in theory but not in practice. A robot's field of view is limited; you could step on babies or children without knowing it. Humans have ears that can detect an attack from behind. If controlling a robot is just a screen, a speaker, and a mouse to shoot with, like in a video game, then there's nothing good about it except that the robot has armor and the soldier is not actually there. It's still too easy to kill a robot: once the enemies know the weak point, what's left of it is a piece of metal.

Because every robot looks the same, it's impossible to determine who's responsible for shooting an innocent. Holding a real gun is far different from waving a mouse. With a mouse, if you accidentally move your hand 1-2 cm, the shooting angle changes, and a person 50 m away from the target gets hit.

The angle of the head must perfectly match the angle of the guns. If there's a technical problem, like oil drying out or a movement getting stuck, the head may look straight ahead while the guns point behind or to the left or right, or the view angle differs; that's a massacre. It's not as simple as a video game, where the gun shoots wherever your mouse points.


That is a perfectly logical opinion and stance. Personally, I just value an American life more than I would even a civilian's in a nation that is at war with us. If the civilians of a nation at war with the US don't like casualties, they should rise up and remove their government.

Your points are only engineering issues, not true problems. Sensors can be improved. Reflexes can be improved. Accuracy and targeting systems can be improved.

Video-game players are more than trained enough to fight remotely with complete trust in their actions. It is the technology of field robots that needs to catch up. Once robots with faster reflexes than humans are developed, you'll no longer see accidents caused by the clumsiness of robots. It will eventually be EXACTLY like a video game.

Edited by DieChecker


C'mon guys, what have the poor defenseless robots ever done to you? They have feelings too, you know.

It would be interesting to see how they would perform; however, I'm not sure they would ever be used with full autonomy.

Yeah, but someday robots might stop being defenseless and might try to take control of the world (wanting to rule humans and everything on this planet). People have feelings too, but you see every day what is happening in the world. Robots must be used to help humanity, not to destroy it, and I'm afraid that might someday happen if they're given complete freedom. :w00t:

Edited by CuriousGreek


Making a robot autonomous is different from giving a robot free will. It is a robot with free will that would be dangerous, as it would make its own decisions based on experience and on the ethics and morals it develops from those experiences. An autonomous robot is just like a car-building robot that exactly follows software commands. There is no driver, but there are also no choices. A = red, which leads to firing a rocket. B = blue, which leads to locking the safeties on all ordnance. And so on. A robot with free will would decide on its own, based on thousands of variables... not compared against a database, but from experience, to determine who is enemy and who is friendly.

At least that is how I see it.

Adding a controller, a driver, makes the robot THAT much safer to use remotely.

