
Not AI


Will Due


No matter how advanced artificial intelligence gets, it will never be:

  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)
  • undeterred by danger or pain; brave (Courageous)

 


12 minutes ago, Will Due said:

 

No matter how advanced artificial intelligence gets, it will never be:

  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)
  • undeterred by danger or pain; brave (Courageous)

 

I believe humans can write a computer program that can perform all those things above.

What I don't believe is that AI can subjectively experience anything. It is forever a collection of parts with no capacity to experience as an organic unit.


35 minutes ago, Will Due said:
  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)
  • undeterred by danger or pain; brave (Courageous)

Sounds more like artificial dog than artificial intelligence.

 


 

Whether objective or subjective, can AI ever experience anything?

 

 


1 hour ago, Will Due said:

 

No matter how advanced artificial intelligence gets, it will never be:

  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)
  • undeterred by danger or pain; brave (Courageous)

 

Computers and machines can already do all of the above, to varying degrees. Some of them to a level humans are unlikely ever to achieve.

At the moment, machines don't have a sense of introspection, or of planning and decision-making via a consistent "self" that feels separate from, and somehow beyond the causation of, the mechanical processes whence it arises (i.e. a nervous system in humans). This "self" doesn't really exist in humans either; we only have the feeling that it does. It is an illusion, in the same way that magicians don't really saw their assistants in half and join them back together, no matter how convincing it appears.

It is very likely that machines in the future will know they exist, though. Perhaps eventually in ways our puny cognitive abilities will never fully understand, as they will have the ability to improve themselves without having to wait on evolution (social and biological) to make changes the way we do. They also won't have all of the emotional and perceptual vagaries that make us inherently off our rockers (to varying degrees) and so prone to delusion and conflict. These are going to be interesting times.

Edited by Horta

No one knows how qualia emerges, so we have no way of knowing if animals have it, or if machines can get it.
It could be something that automatically comes along with decision-making. Machines could already have it, in a primitive way, and it could evolve in parallel with their advancement.
They could even become more humane than humans, because they don't have our limitations and frailty.


 

During a battle, is the tank courageous, or the people inside it?

 

 


9 minutes ago, sci-nerd said:

No one knows how qualia emerges, so we have no way of knowing if animals have it, or if machines can get it.

I disagree to some extent with the bolded. I think it might be more accurate to say that no one has sufficiently demonstrated how inner experience emerges, or been able to explain it convincingly as yet, but there are some reasonable "in principle" explanations for (the rather vague and ill-defined term) "consciousness" itself. The problems begin with the definition of the terms themselves, though, and discussion usually breaks down there.

Quote

It could be something that automatically comes along with decision-making. Machines could already have it, in a primitive way, and it could evolve in parallel with their advancement.
They could even become more humane than humans, because they don't have our limitations and frailty.

I have the feeling that without complex language, humans wouldn't have the ability to experience things through a "self" the way we do. This seems consistent with the social and even cognitive/perceptual differences of certain cultural groups, who have also been claimed to have commensurate linguistic differences (though this itself is controversial). This idea would make "consciousness" a social phenomenon, more so than a directly biologically inherited one. More Jaynesian than Darwinian, or more directly an effect of "software" than of "hardware".

It seems highly likely that some critters, particularly higher mammals (such as our pets), have some rudimentary conception of a "self" and even a limited ability to plan. Yet they have only the rudiments of communication via language, without the apparatus that would ever allow it to become as verbal and complex as our own. All "inner experience" seems tied to this notion of a separate "inner self". Central to language is the ability to maintain a mental space full of placeholders and analogues, which probably allows this feeling/analogue of a separate, distinguishing "self" to develop as we mature.

This could all end up being wrong with more study, of course, but sadly, at the moment the field seems weighed down by those (mostly philosophers) who begin with the idea that there is some special dedicated process or substance, and that that is where the search should begin. Some go so far as to cling to 19th-century substance dualism, though hidden in jargon. Not to mention opinions such as the OP's, which cling to ideas like a magical "soul" because it props up their god/religious beliefs.

Machines could develop self-awareness without us knowing initially; it seems hypothetically possible. They could also eventually be programmed that way, with more understanding of how our own minds work, simply because the autonomy it would allow would be extremely useful in certain situations.


7 hours ago, Will Due said:

 

During a battle, is the tank courageous, or the people inside it?

 

 

How much courage does it take to mow down people from a virtually impenetrable place?


13 hours ago, Will Due said:

 

No matter how advanced artificial intelligence gets, it will never be:

  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)

Someone is not a fan of the reimagined ‘Battlestar Galactica’. ;)  

Quote
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)

Someone does not own an Alexa. :D  

Quote

undeterred by danger or pain; brave (Courageous)

Say after me: “Hey, Siri!”

Wait! Do you own an iPhone? 

:devil:  


12 hours ago, papageorge1 said:
13 hours ago, Will Due said:

 

No matter how advanced artificial intelligence gets, it will never be:

  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)
  • undeterred by danger or pain; brave (Courageous)

 

I believe humans can write a computer program that can perform all those things above.

What I don't believe is that AI can subjectively experience anything. It is forever a collection of parts with no capacity to experience as an organic unit.

Can anyone explain why I have really bad arguments with both Alexa and Siri?!?!?!? :wacko:  


12 hours ago, Tuco's Gas said:

Actually, the debate over whether or not AI will ever be able to become imbued with any or all of those three qualities depends upon one's definition of the qualities in question.

To wit: if "loyalty" could be described as simply "following instructions or orders without questioning them" and "remaining in continual and absolute obeisance to one's leader or superior", then software already does this, since it is no more than instructions for the machine.

"Right and wrong?" Well, in the field of philosophy the study of that is called ethics. While it is true that a software program is incapable of sincerely being "concerned" with ethics, or of fretting over them, it could certainly emit ethical answers and solutions to problems if its coder wrote them in such a manner. Alexa and Siri both refuse to use obscene or vulgar language. Thus, seemingly polite and ethical.

"Courageous?" A software program would continue to run and give voice-transmitted orders for the inhabitants of a burning office building to escape, even while the computers were being destroyed by fire. Thus, in essence, giving its life to save others. Sounds courageous.

And the software would never acknowledge pain. An android or robot could be programmed to go into combat. Drones already do, facing enemy fire. Hmm... sounds brave to me.

Of course, we know that the software is merely programmed to imitate all those things, and AI will never get beyond the point of imitating human qualities. The question is how adept it will become at such emulation.

Being someone who has been a fan of varying science fiction genres, I read and watch things with AI, and I wonder about the spiritual implications, or at least the possibility of actually creating, or there being, a soul. The 2004 'Battlestar Galactica', Star Trek, Star Wars, varying other fictions like the movie 'AI', even the series 'Westworld'. Is there a soul there in that man-made vessel? Could one come about through pain and hardship, like in 'Westworld'? The character of Data is always a conundrum.

With that in mind, I still consider Alexa and Siri, and I still consider them objects. I can have a discussion with Alexa in my home, but in the end, I feel it's still just me. My phone is still something tucked away in my purse or pockets when I don't need it. Would I consider that something with a consciousness?

I just feel, in the end, that I can't see how a consciousness can come from varying man-made figures and multiple bits of programming. Then again, I still try to figure out how our soft fleshy vessels get consciousness, and when exactly it 'arrives' in us. *shrugs*

AIs get their behaviors programmed into them by those who have the experience to know the difference between right and wrong. (And yes, I feel that is how we get that: by experience that is taught to the newbies.) In the end, I think it's how we relate to it and how we use it.

Being taught by something (and I'm talking about my own belief system too) that is just said to be there, and being told to follow something just because, doesn't mean it will be followed through with understanding. I think we have to know ourselves first to understand how and what to follow through with (programming, rules, etc.), and not just depend on something that simply 'feeds' us.

I guess, in the end, I associate my world today, and how I live in it, with how I can dissociate the fiction (science fiction) from the real world. (If that makes sense.)


12 hours ago, eight bits said:
13 hours ago, Will Due said:
  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)
  • undeterred by danger or pain; brave (Courageous)

Sounds more like artificial dog than artificial intelligence.

 

You know, if I had just continued on without saying anything, I knew someone else was going to mention 'BG'.

Then again, thinking of actors playing characters either right or wrong on 'BG': I think about Richard Hatch's character in the more recent 'BG'. ;)

 

 

 


12 hours ago, Will Due said:

 

Whether objective or subjective, can AI ever experience anything?

 

 

This reminds me of something (and I read this in a Star Trek novel) about whether one can actually get into the soul of someone else. The thought is, there isn't really a way. How can you be someone else, soul and all? You are always you.

So, on that note, can you answer that question? I don't think anyone can, because how would we ever know? A conundrum in itself.


12 hours ago, Horta said:
13 hours ago, Will Due said:

 

No matter how advanced artificial intelligence gets, it will never be:

  • giving or showing firm and constant support or allegiance to a person or institution (Loyal)
  • concerned with the principles of right and wrong behavior and the goodness or badness of human character (Moral)
  • undeterred by danger or pain; brave (Courageous)

 

Computers and machines can already do all of the above, to varying degrees. Some of them to a level humans are unlikely ever to achieve.

At the moment, machines don't have a sense of introspection, or of planning and decision-making via a consistent "self" that feels separate from, and somehow beyond the causation of, the mechanical processes whence it arises (i.e. a nervous system in humans). This "self" doesn't really exist in humans either; we only have the feeling that it does. It is an illusion, in the same way that magicians don't really saw their assistants in half and join them back together, no matter how convincing it appears.

It is very likely that machines in the future will know they exist, though. Perhaps eventually in ways our puny cognitive abilities will never fully understand, as they will have the ability to improve themselves without having to wait on evolution (social and biological) to make changes the way we do. They also won't have all of the emotional and perceptual vagaries that make us inherently off our rockers (to varying degrees) and so prone to delusion and conflict. These are going to be interesting times.

I wonder how you see it that way. I'm not saying you're wrong, or that I feel I may be right; I'm just wondering. Even then, I can't say I have the answers, because I just stay at not really being sure. I entertain the thought that we feel something because our bodies make us feel it. I also wonder how I can feel and sense and be aware at all, and how that could be even if it's outside our bodies. To me, some things make sense and some things don't. (I kind of like this conundrum; I find it fun.) (I also seem to like the word conundrum too, sorry...)

But your post is very thought provoking, in a positive way. :yes:


12 hours ago, sci-nerd said:

No one knows how qualia emerges, so we have no way of knowing if animals have it, or if machines can get it.
It could be something that automatically comes along with decision-making. Machines could already have it, in a primitive way, and it could evolve in parallel with their advancement.
They could even become more humane than humans, because they don't have our limitations and frailty.

Damn! I had to look that word up! (Learned something new today! And, I’m not done with my coffee yet.) :D  

I find this a wonder to consider. Makes me wonder, if I reflect really hard on this, will I burst a synapse?!?!? :o  :wacko:  


12 hours ago, Will Due said:

 

During a battle, is the tank courageous, or the people inside it?

 

 

Wait until I finish my coffee!!!!


10 hours ago, Horta said:
12 hours ago, sci-nerd said:

No one knows how qualia emerges, so we have no way of knowing if animals have it, or if machines can get it.

I disagree to some extent with the bolded. I think it might be more accurate to say that no one has sufficiently demonstrated how inner experience emerges, or been able to explain it convincingly as yet, but there are some reasonable "in principle" explanations for (the rather vague and ill-defined term) "consciousness" itself. The problems begin with the definition of the terms themselves, though, and discussion usually breaks down there.

I wonder if anyone would. Though this here is something that makes me wonder more, because of how we would describe it. (If I got what you said here correctly. I could have gotten it wrong... I sawry if I did.) Would we have the ability to understand the other side of the fence, if we found ways to 'know' how it feels?

10 hours ago, Horta said:
Quote

It could be something that automatically comes along with decision-making. Machines could already have it, in a primitive way, and it could evolve in parallel with their advancement.
They could even become more humane than humans, because they don't have our limitations and frailty.

I have the feeling that without complex language, humans wouldn't have the ability to experience things through a "self" the way we do. This seems consistent with the social and even cognitive/perceptual differences of certain cultural groups, who have also been claimed to have commensurate linguistic differences (though this itself is controversial). This idea would make "consciousness" a social phenomenon, more so than a directly biologically inherited one. More Jaynesian than Darwinian, or more directly an effect of "software" than of "hardware".

It seems highly likely that some critters, particularly higher mammals (such as our pets), have some rudimentary conception of a "self" and even a limited ability to plan. Yet they have only the rudiments of communication via language, without the apparatus that would ever allow it to become as verbal and complex as our own. All "inner experience" seems tied to this notion of a separate "inner self". Central to language is the ability to maintain a mental space full of placeholders and analogues, which probably allows this feeling/analogue of a separate, distinguishing "self" to develop as we mature.

This could all end up being wrong with more study, of course, but sadly, at the moment the field seems weighed down by those (mostly philosophers) who begin with the idea that there is some special dedicated process or substance, and that that is where the search should begin. Some go so far as to cling to 19th-century substance dualism, though hidden in jargon. Not to mention opinions such as the OP's, which cling to ideas like a magical "soul" because it props up their god/religious beliefs.

Machines could develop self-awareness without us knowing initially; it seems hypothetically possible. They could also eventually be programmed that way, with more understanding of how our own minds work, simply because the autonomy it would allow would be extremely useful in certain situations.

I would think so too. I wonder how some people consider that animals don't have spirits/souls/consciousness while others consider that they do, and why they think that. I never would have considered it likely that they didn't, and now, after being a pet owner for over ten years (though the last one passed away :( last fall), I feel strongly that they each do have that. And I wonder if they could be around after death (I know, very debatable, I agree with that), and just as I sometimes feel people may haunt an area, I feel some of my pets do too. (Again, I'm not bashing people's heads here with it; it's something I think for myself only, as to why I consider souls and such.)

Also, thinking of the line in your post about not knowing whether machines could develop awareness: yeah, how would we know?

And yes, this is fiction-based, but in a series of books about Native Americans written by archaeologists, I have come across the idea of someone having more than one soul. (I would love to be corrected on this, Piney, or desert? :yes: ) And I wonder how that would play out. *shrugs*


"Do Androids Dream of Electric Sheep?"


On 4/22/2020 at 6:46 AM, Stubbly_Dooright said:

I wonder if anyone would. Though this here is something that makes me wonder more, because of how we would describe it. (If I got what you said here correctly. I could have gotten it wrong... I sawry if I did.) Would we have the ability to understand the other side of the fence, if we found ways to 'know' how it feels?

I would think so too. I wonder how some people consider that animals don't have spirits/souls/consciousness while others consider that they do, and why they think that. I never would have considered it likely that they didn't, and now, after being a pet owner for over ten years (though the last one passed away :( last fall), I feel strongly that they each do have that. And I wonder if they could be around after death (I know, very debatable, I agree with that), and just as I sometimes feel people may haunt an area, I feel some of my pets do too. (Again, I'm not bashing people's heads here with it; it's something I think for myself only, as to why I consider souls and such.)

Also, thinking of the line in your post about not knowing whether machines could develop awareness: yeah, how would we know?

And yes, this is fiction-based, but in a series of books about Native Americans written by archaeologists, I have come across the idea of someone having more than one soul. (I would love to be corrected on this, Piney, or desert? :yes: ) And I wonder how that would play out. *shrugs*

I agree, and it is anecdotal, but after working with people who are non-verbal as a result of Cerebral Palsy and late-stage Dementia and Parkinson's, it is very clear to me that my dog does a damn good job of conveying to me what he needs or wants.
 

Edited by Sherapy

Why would a machine need to have a human experience? That's the real question.

A machine exploring the surface of Pluto would need enough A.I. to keep itself going and fulfill its mission objectives. Even if a machine became more "human", all we'd need is the Three Laws of Robotics as a "subconscious" program.


31 minutes ago, XenoFish said:

Why would a machine need to have a human experience?

 

It looks to me that the answer to that question is that, for some, the human experience requires the creation of a machine that's equal to or better than man himself, as proof that God doesn't exist.

Because if man does create one, it will mean that man is smarter than God.

 


 

So as time goes on, at some point perhaps thousands of years from now, and unlike today, man will have exhausted his attempts to do what God does.

Which will then stand as proof that man isn't that smart after all.

 

 

 

 

Edited by Will Due

7 hours ago, Will Due said:

 

It looks to me that the answer to that question is that, for some, the human experience requires the creation of a machine that's equal to or better than man himself, as proof that God doesn't exist.

Because if man does create one, it will mean that man is smarter than God.

 


 

So as time goes on, at some point perhaps thousands of years from now, and unlike today, man will have exhausted his attempts to do what God does.

Which will then stand as proof that man isn't that smart after all.

 

 

 

 

Who is arguing that god isn't smart?

You do know that omniscient means all-knowing, and that god is defined by Philosophy as infinite perfection, and by the Religious as beyond human comprehension.

The only poster who has argued that god isn't smart and is not infinite perfection is Mr. Walker.

Edited by Sherapy

7 hours ago, Will Due said:

 

It looks to me that the answer to that question is that, for some, the human experience requires the creation of a machine that's equal to or better than man himself, as proof that God doesn't exist.

Because if man does create one, it will mean that man is smarter than God.

 


 

So as time goes on, at some point perhaps thousands of years from now, and unlike today, man will have exhausted his attempts to do what God does.

Which will then stand as proof that man isn't that smart after all.

 

 

 

 

I'm not going to look at your hidden content; I don't need to be psychic to know what it is. Plus, you are under the assumption that god exists in a tangible way. If you toss out God, your questions do not matter. From a technical standpoint, we would need a way to mimic the human brain and train a machine to think, which might involve some kind of neural connection that allows for mind mapping, plus running this AI through thousands of hours of simulations. Something that might take another 100 years to do. Unless we decide to clone human brains and create thinking machines out of them.

