
Meet Norman, the psychopathic AI


Still Waters


Norman is an algorithm trained to understand pictures but, like its namesake, Hitchcock's Norman Bates, it does not have an optimistic view of the world.

When a "normal" algorithm generated by artificial intelligence is asked what it sees in an abstract shape it chooses something cheery: "A group of birds sitting on top of a tree branch."

Norman sees a man being electrocuted.

And where "normal" AI sees a couple of people standing next to each other, Norman sees a man jumping from a window.

The psychopathic algorithm was created by a team at the Massachusetts Institute of Technology, as part of an experiment to see what training AI on data from "the dark corners of the net" would do to its world view.

http://www.bbc.co.uk/news/technology-44040008
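The effect described in the article can be illustrated with a toy sketch (this is not the MIT team's actual setup, which used a neural image-captioning model; the "captioner" below is a deliberately crude stand-in that just parrots the most frequent caption in whatever corpus it was trained on):

```python
from collections import Counter

def train_captioner(captions):
    """Toy stand-in for an image-captioning model: it simply
    memorises caption frequencies in its training corpus."""
    counts = Counter(captions)

    def describe(_image):
        # Always emits the most frequent caption it has seen,
        # regardless of the input -- the training data is everything.
        return counts.most_common(1)[0][0]

    return describe

# Same "architecture", two different training corpora.
normal = train_captioner([
    "a group of birds sitting on top of a tree branch",
    "a group of birds sitting on top of a tree branch",
    "a vase of flowers on a table",
])
norman = train_captioner([
    "a man is electrocuted",
    "a man is electrocuted",
    "a man jumps from a window",
])

inkblot = object()  # the same ambiguous input shown to both models
print(normal(inkblot))  # a group of birds sitting on top of a tree branch
print(norman(inkblot))  # a man is electrocuted
```

The point of the sketch is that nothing in the code differs between the two models; only the data does, which is the whole finding of the experiment.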


NICE!.....Some knucklehead will think it would be "cute" to install in a military mech........


Well obviously if you teach it with nothing but violent images (which is what they did), it has nothing to pull from but violence. 

This is just common sense.


Aren't all AI psychopathic as they can't feel empathy? In fact they can't feel anything.


2 hours ago, danydandan said:

Aren't all AI psychopathic as they can't feel empathy? In fact they can't feel anything.

Absolutely, without human emotion AI is definitively psychopathic (it has to be). Even if the programmers 'showed' AI flowers and butterflies and super peaceful images...it would still be psychopathic. Images of sweet little puppies and kitties have no more intrinsic 'feeling' for AI than images of mass murder and torture have. 

IMO AI development is a big 'no no'...I'd pull the plug on all of it. 


12 minutes ago, Lilly said:

Absolutely, without human emotion AI is definitively psychopathic (it has to be). Even if the programmers 'showed' AI flowers and butterflies and super peaceful images...it would still be psychopathic. Images of sweet little puppies and kitties have no more intrinsic 'feeling' for AI than images of mass murder and torture have. 

IMO AI development is a big 'no no'...I'd pull the plug on all of it. 

I think specific task based AI is very important in different ways. From data mining to medical diagnosis.

Edited by danydandan

13 minutes ago, danydandan said:

I think specific task based AI is very important in different ways. From data mining to medical diagnosis.

Medical diagnosis also needs human common sense. 


30 minutes ago, danydandan said:

I think specific task based AI is very important in different ways. From data mining to medical diagnosis.

I'm still hesitant...in everything there exists a 'risk/benefit' element. I'm just not convinced that the development of machines that can 'think' is the way to go. 


56 minutes ago, Lilly said:

I'm still hesitant...in everything there exists a 'risk/benefit' element. I'm just not convinced that the development of machines that can 'think' is the way to go. 

AI and Machine Learning are two different concepts: AI doesn't think, it's bound by its code; a learning machine would be able to evolve its code.

Specifically for medical diagnoses of cancer that rely on visual evidence, an AI would be very beneficial. It would lead to fewer errors.
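The distinction being drawn here can be sketched roughly as follows (terminology varies, and machine learning is usually treated as a subfield of AI; the diameters and threshold below are invented purely for illustration, not real clinical values):

```python
# Fixed, rule-based system: behaviour is completely determined by its code.
def rule_based_flag(tumour_diameter_mm):
    return tumour_diameter_mm > 10.0  # hard-coded threshold, never changes

# Learning system: the decision threshold is derived from the data it sees.
def fit_threshold(samples):
    """samples: list of (diameter_mm, is_malignant) pairs.
    Picks the midpoint between the two class means -- a crude 1-D classifier."""
    pos = [d for d, y in samples if y]
    neg = [d for d, y in samples if not y]
    return (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

data = [(4.0, False), (6.0, False), (14.0, True), (18.0, True)]
threshold = fit_threshold(data)          # 10.5 -- learned, not hard-coded
learned_flag = lambda d: d > threshold

print(rule_based_flag(12.0))  # True
print(learned_flag(12.0))     # True
```

Both flag the same case here, but only the second one would change its behaviour if retrained on different data, which is the "evolving" property being discussed.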

Edited by danydandan

1 hour ago, danydandan said:

AI and Machine Learning are two different concepts: AI doesn't think, it's bound by its code; a learning machine would be able to evolve its code.

 

A very important distinction, thanks. If kept within the constraints of its code, then AI (especially when applied to medical diagnosis) seems alright. Guess I'm just the 'worrying type'. 


Did anybody actually read the article? The only reason the program outputs such horrible things is because that's all they trained it with. They fed it nothing but pictures of death, war, murder and violence, that's all it knows and that's the only database it has to pull from.

It was created to be psychopathic, it didn't "turn" psychopathic. You cannot judge the motivations of AI based on this case.

Edited by moonman

15 minutes ago, moonman said:

Did anybody actually read the article? The only reason the program outputs such horrible things is because that's all they trained it with. They fed it nothing but pictures of death, war, murder and violence, that's all it knows and that's the only database it has to pull from.

It was created to be psychopathic, it didn't "turn" psychopathic. You cannot judge the motivations of AI based on this case.

AIs don't have motivations or feelings; they are completely bound to their programmed functions.


36 minutes ago, danydandan said:

AIs don't have motivations or feelings; they are completely bound to their programmed functions.

Exactly.


Yeah, as if they really needed an experiment to tell us this.

They had me convinced way back at: "HAL open up the pod bay door". 


Just now, Lilly said:

Yeah, as if they really needed an experiment to tell us this.

They had me convinced way back at: "HAL open up the pod bay door". 

I know, right?

Next they might want to do an experiment on whether putting your hand on a hot stove top hurts....
 


1 minute ago, Lilly said:

Yeah, as if they really needed an experiment to tell us this.

They had me convinced way back at: "HAL open up the pod bay door". 

Yup, HAL's actions said it all to me. :tu:


Not psychopathic... just seeing the images it's been programmed to. Stupid how they try to embellish on that. 


If it’s not a high-speed asteroid, it will be AI that takes us out.  


Sure is funny to read all the comments here talking about AI doing what it's programmed to do. There is a reason it's called AI and not simply "software". AIs learn. AI can teach themselves. AI can program themselves. And what about biotechnological applications, AI infused with neural network? Do these things sound like things that do what they're programmed to do?


 "the culprit is often not the algorithm itself, but the biased data that was fed to it."

That's exactly how humans become bigoted jerks who post alt-right garbage all over the internet.


1 hour ago, Rolci said:

Sure is funny to read all the comments here talking about AI doing what it's programmed to do. There is a reason it's called AI and not simply "software". AIs learn. AI can teach themselves. AI can program themselves. And what about biotechnological applications, AI infused with neural network? Do these things sound like things that do what they're programmed to do?

AIs can't teach themselves or program themselves. You're confusing AI with learning machines.

Here are a couple of good articles on the topic:

https://www.forbes.com/sites/theyec/2018/06/12/artificial-intelligence-machine-learning-and-the-future-of-connection/#2f76cd3f6a69

https://medium.com/iotforall/the-difference-between-artificial-intelligence-machine-learning-and-deep-learning-3aa67bff5991

Edited by danydandan

23 hours ago, NicoletteS said:

Not psychopathic... just seeing the images it's been programmed to. Stupid how they try to embellish on that. 

Exactly. It's sad how many people read this bad article and think "OMG AI is evil" without understanding the basics of what is happening.

Garbage in, garbage out. It's that simple. They fed it nothing but garbage.

Edited by moonman

A program trained on nothing but a closed corpus of violent subject matter has only violence to base its output on. No surprises here; somebody just likes to pop balloons for the noise. A better experiment would be to now attempt to reform the model with pleasant images.

