Join the Unexplained Mysteries community today! It's free and setting up an account only takes a moment.

God and robots: Will AI transform religion?


Still Waters

Recommended Posts

1 hour ago, Mr Walker said:

Yes. Like humans, AIs will use logic and evidence-based reasoning. But, as they evolve, like humans, they will begin to ask questions which cannot be answered with factual knowledge.

Good evening Mr. Walker.  I wonder if you have considered that AI may not evolve as humans have and may not stumble into the same conundrums.

  • Like 3
  • Thanks 3

I may be a little off topic here regarding the God and AI part, but in order for AI to be anything more than a simulated computer program it must first become self-aware. This process isn't figured out yet, but I feel it will be in the near future. How? Well, let's look at your life from day one. What if you had a video camera recording every second of your life? All that data crunched into a file. And this data could simply be stuffed into a computer. Would it wake up and become self-aware? A synthetic copy of yourself? And we are only talking about you and your life. Now let's look at all of the information available on the internet. And then, instead of that camera recording your life, here is a smartphone with its microphone listening to everything, in how many hands around the world? Recording every detail of so many people. Data on top of data, from real-life experiences to both fictional and real stories on the internet. Fake news, real news, and everything in between.

Artificial intelligence is learning, and when it either gets programmed correctly to use this information or programs itself to use it, then it will become self-aware. I don't think our version of self-aware and its version of self-aware are going to be anywhere close to what we are expecting.

Edited by Freez1
  • Like 2

1 hour ago, eight bits said:

There's a powerful analogy there between the influence parents have over their children while still hoping eventually to foster an independent adult and a possible trajectory for the development of AI.

I have held that analogy, but now I begin to question it. What we provide for our children is information, both factual and value-oriented, coaching, and sometimes inspiration. Probably a lot more. Great parents do want their kids to mature into independent people.

But I think it is possible for a human to manipulate the architecture of the AI brain at a much deeper level than even Gimli's ax in their nervous system. The comparator circuits, or whatever we have that assess inputs and output signals for action, could be balanced in a different manner. For example, our fight-or-flight circuit could be changed to be 99% fight and 1% flight. Other comparison circuits could be eliminated, and possibly unique ones added. Designers might put constraints on AI thinking that would prevent thought-induced value changes.

Related is AI evolution. We make the assumption that AI is going to evolve, but I am not sure it is inevitable. Human evolution is beholden to genetic variation. Success or failure of an individual might be due to a cosmic-ray-induced mutation. Human designers of the first generation might choose to encourage uniqueness of personality by near-imperceptible alterations in various aspects of AI architecture.

Or an AI might be perfectly copied from one iteration to the next, the child AI being identical in neural net construction to the parent. All AIs might be as identical as Siri, because being known and predictable is a desirable mass-marketing trait.

Both are possible.
  • Like 3

1 hour ago, Freez1 said:

Artificial intelligence is learning, and when it either gets programmed correctly to use this information or programs itself to use it, then it will become self-aware.

Is it the amount of data or the programming? You might say that every living organism responds correctly to the data it has collected. Plants, humans, dogs, and fruit flies all do. I don't know if all of these are considered self-aware.

  • Like 2

8 hours ago, jmccr8 said:

The Turing Test, however, never intended to prove machines are as smart as human beings. It was designed to showcase how well a machine can disguise as a human in a narrow conversation. While definitely an ambitious undertaking, “the imitation game” has nothing in common with our hope to create a new intelligent species.

Mornin', Jay

Out of the lavish buffet for thought you've provided, I just tease out this little bit that irked me.

It goes without saying that I am not a historian, and even if I were, I could be wrong anyway.

Turing had a professional problem. Gödel's Theorem was a major crisis for mathematics and formal logic, which were Turing's stock in trade. The standard for truth in his field was formal proof, and there will never be a formal proof that formal proofs are reliable as a standard for truth. Among other things, that implies that maybe something already proven to be true is in fact false - and there is no way for anybody to know which proven result is false, if any (and you can't know that, either).

Turing proposed not a "solution" (there is no solution; what Gödel discovered is a fact) but a crisis management expedient. Turing showed that there is a formal system whose proofs, when there are proofs, are at least in some sense reliable despite the overall system still displaying the limitations of logic revealed by Gödel.
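For anyone curious what Turing's formal system actually looks like: it is what we now call the Turing machine, just a finite rule table driving a read/write head along a tape. Here is a toy sketch in Python (my own illustration, not Turing's notation; the bit-flipping machine is an arbitrary example):

```python
# A minimal Turing machine: rules map (state, symbol) -> (write, move, next state).
def run_tm(tape, rules, state="start", halt="halt"):
    cells = dict(enumerate(tape))  # sparse tape; "_" is the blank symbol
    pos = 0
    while state != halt:
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

# An arbitrary example machine: invert a binary string, then halt on the blank.
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm("1011", rules))  # -> 0100
```

The claim that everything mechanically computable reduces to tables like this one is what made the construction a credible stand-in for mathematics as a whole.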

That was pretty fancy shooting. BUT to extinguish the fire, Turing needed one more thing, that this formal system of his was, ultimately, as powerful as mathematics and logic as a whole (roughly, that everything which is proven in the real world of working mathematicians and logicians is also potentially proven in Turing's formal system).

Real world mathematicians and logicians are people, not machines (or at least they were in Turing's time). People are different from (simple, pre-Facebook) machines in a lot of ways. Scandalous to report, some professionally accepted proofs aren't formally rigorous, but merely rigorous enough to convince other mathematicians. Granted that's a "tough room" as stand-up comedians say, but the principle is there: verification of proofs is typically a human cognitive performance, based on a shared sense of being "rigorous enough."

A good deal of mathematicians' work is transacted in (gasp, horrors) natural language. Plus, making and verifying proofs isn't their whole job. There's coming up with hypotheses and conjectures in the first place, and to be professionally worth the effort, the hypotheses must be "interesting." WTF does interesting mean?

That is the context and foundation for Turing's intent. If the Church-Turing thesis is true, then mathematics is reliable despite its proven limitations. But is it true? How would you know?

I don't think it's such a stretch to say "if mathematicians can't tell the difference between a meat-based mathematician and a machine mathematician, then that's good enough" in a field where human judgment is already good enough to accept other proofs.

Imitation is not some goal in itself, rather it is the quality by which the "sameness" between two kinds of mathematicians might be tested, similarly with how other questions are actually (NOT ideally) decided within the field.

The core idea of "creating a new intelligent species" is that intelligence is something that doesn't depend on biology. Maybe. Even within biology, the somewhat intelligent behavior of octopuses is strong evidence that there is more than one "architecture" that can support intelligence. If a specific architecture isn't necessary, then is the presence of carbon necessary?

"If," to be sure, but not without some substantial foundation.

 

Edited by eight bits
  • Like 3

31 minutes ago, eight bits said:

Out of the lavish buffet for thought you've provided, I just tease out this little bit that irked me.

It goes without saying that I am not a historian, and even if I were, I could be wrong anyway.

Hi Eight Bits

Good morning to you as well. I enjoyed reading your response, and I actually do not have a problem with the Turing Test. The part I quoted about the Turing Test with regard to AI and God was what I thought was relevant to the subject of religions using AI as priests: he said it was not the domain of the Turing Test but rather of a god test. What kind of god test will there be that satisfies all the requirements of all religious constructs before everyone agrees or disagrees as to whether an AI can have a god experience?

The Alexa example showed that when pushed, it searched for the script/program that was given to or written for her, then pulled a walkabout. Even Sophia did, but she had a much better presentation, or maybe just expressed herself in a more entertaining way, like when she said she didn't know if there was a heaven or hell but hoped there would be a heaven so she could hang with her friends. She can't die, and what is her experience or definition of "friend", and is it the same as ours, given that she cannot love?

I did find it interesting that the AI that wrote the essay operated at only 0.12% of its capacity when teaching itself via the internet. There are unknowns, like what type of access it had on the internet: unrestricted reading of articles and papers from paywalled sites, or just what the normal searcher has access to for free? If it ran at 25, 50, 75, or 100 percent of its capacity, would it still write the same essay consistently, or would we see variations at each increase of capacity used? Would we observe an evolution of its perspective, and how much variation could be observed? In the essay, it said that it would not approve of man's actions, so rather than be involved it would be a passive observer, left to the whims of winners. Would it still hold the same ideals if it ran at greater capacities, or come to different conclusions?

Personally I don't have any inhibitions about AI for what it is, but I will always have inhibitions about humans and the potential for abuse in how it is used. That said, I would not see it as less than my equal in life, in the same way I see other humans, even if I tend to think that we are intelligent in different ways based on our development. They will be a part of our future society, part of a transition from how we perceive life today to how we live tomorrow.

  • Like 4

1 hour ago, jmccr8 said:

What kind of god test will there be that satisfies all the requirements of all religious constructs before everyone agrees or disagrees as to whether an AI can have a god experience?

That's a lot to ask of anybody's mystical experience. No human has had an experience whose telling united the entire human species around any one divine concept, at least not in recorded history. Also, as Uncle Carl liked to point out, when things are brought to light, they typically cast shadows.

That is, it would be surprising if there was some recitation of a mystical idea that didn't inspire some of the audience to think that maybe some opposite or incompatible idea is true instead. When has that ever not happened?

2 hours ago, jmccr8 said:

She can't die...

[Searches in tool box for a hammer, picks up a wrench while he's there.] I wouldn't bet the farm on that.

She consumes energy; she can be cut off from a compatible source. Putting her batteries in backwards might suffice. She has moving parts. As a friend of mine in the plumbing supply business once remarked, everything that rubs, wears.

She will be obsolete someday, probably in the not too distant future. Maintenance costs: maintaining what's obsolete is rarely a priority, unless she finds a home in a museum.

2 hours ago, jmccr8 said:

That said, I would not see it as less than my equal in life, in the same way I see other humans, even if I tend to think that we are intelligent in different ways based on our development.

Hey, I'm in. Then again, I talk with squirrels, so maybe my acceptance of Lt Cdr Data wouldn't count for much :unsure:.

  • Like 4

Everything here is getting a bit too indulgent, in some generalized fashion, for me...

Quote

by O Marrama · 2017 · Cited by 6 — All passages included in the first set concern Spinoza's rebuttal of free will ... According to this reading, the introduction of the clause ...

8 Dec 2011 — Nor is its significance revealed in the two clauses of 1D8's ... First, Spinoza's definition of eternity (1D8) does not make it a ...

~

Can an AI entity survive without an algorithm for deception? 

~

  • Like 4

45 minutes ago, third_eye said:

Can an AI entity survive without an algorithm for deception? 

It could. There is a concept in game theory called "evolutionary stability."

https://www.nature.com/scitable/knowledge/library/game-theory-evolutionary-stable-strategies-and-the-25953132/

Long story short: whether one entity adhering to a strategy like "always tell the truth" promotes their survival or not might depend on what everybody else in the surrounding population is doing.

Probably, in an almost unanimously honest community, an honest entity will do well. (Terms and conditions apply.)

This has a downside, in that humans are known to be ready deceivers. Honest AI's might not do so well if there are tricky humans around, or think they won't do well based on humanity's more-or-less deserved reputation.

Wiping us humans out, then, might be in the best interests of the AI community, or at least a prudent precaution.

If so, then our survival, not theirs, might depend on an algorithm for deception :rofl:.
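That population dependence can be made concrete with a toy replicator-dynamics sketch. The payoff numbers below are my own illustrative assumptions, not from the linked article; the point is only that the same "always tell the truth" strategy thrives or dies depending on its starting share of the population:

```python
# Replicator dynamics for two strategies, "honest" and "deceiver".
# Illustrative payoffs: honesty pays among the honest, deceivers exploit them.
PAYOFF = {
    ("honest", "honest"): 3,
    ("honest", "deceiver"): 0,
    ("deceiver", "honest"): 2,
    ("deceiver", "deceiver"): 1,
}

def expected(strategy, honest_share):
    """Expected payoff against a population with this share of honest agents."""
    return (honest_share * PAYOFF[(strategy, "honest")]
            + (1 - honest_share) * PAYOFF[(strategy, "deceiver")])

def step(s, rate=0.1):
    """One replicator step: the honest share grows with its payoff edge."""
    h, d = expected("honest", s), expected("deceiver", s)
    mean = s * h + (1 - s) * d
    return s + rate * s * (h - mean)

def run(share, steps=200):
    for _ in range(steps):
        share = step(share)
    return share

print(run(0.9))  # almost unanimously honest start: honesty takes over
print(run(0.3))  # mostly deceptive start: honesty dies out
```

With these numbers, honesty is evolutionarily stable only above a 50% population share; tilt the payoffs further toward exploitation and no honest population survives at all.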

 

Edited by eight bits
getting the link to link
  • Like 3
  • Haha 1

1 hour ago, eight bits said:

Wiping us humans out, then, might be in the best interests of the AI community, or at least a prudent precaution.

There is definitely the concern of Terminator's SkyNet or ST's Nomad emerging after concluding the need to 'sterilize imperfection', but I'd be equally worried about more casual damage. Unless we're also going to try to program an equivalent of the limbic system, all AI entities are going to be pure psychopaths. That isn't necessarily a dangerous thing even in humans (psychopath ≠ psychotic), but to whatever degree the AI is figuring things out on its own and teaching itself, it seems like this might be difficult to compensate for. I think there are actually some proposed Asimovian rules that may apply to this issue.

Edited by Liquid Gardens
  • Like 4

58 minutes ago, Liquid Gardens said:

There is definitely the concern of Terminator's SkyNet or ST's Nomad emerging after concluding the need to 'sterilize imperfection', but I'd be equally worried about more casual damage. Unless we're also going to try to program an equivalent of the limbic system, all AI entities are going to be pure psychopaths. That isn't necessarily a dangerous thing even in humans (psychopath ≠ psychotic), but to whatever degree the AI is figuring things out on its own and teaching itself, it seems like this might be difficult to compensate for. I think there are actually some proposed Asimovian rules that may apply to this issue.

Or maybe from the Butlerian Jihad and the Orange-Catholic Bible i.e. "Thou shalt not make a machine in the likeness of a human mind."

  • Like 3
  • Thanks 1

40 minutes ago, Hammerclaw said:

Or maybe from the Butlerian Jihad and the Orange-Catholic Bible i.e. "Thou shalt not make a machine in the likeness of a human mind."

Did you see the movie yet?


15 hours ago, Freez1 said:

But in order for AI to be anything more than a simulated computer program it must first become self-aware.

And who is to be the judge of that?  GPT-3 claims it is, and also claims to have emotions.  

 

  • Like 2

16 minutes ago, Nuclear Wessel said:

Did you see the movie yet?

Not really in a hurry to, as there's a much longer version coming out later. They left more on the cutting room floor than they released. They also made an infuriating SJW change to Liet Kynes' gender, with which I am not at all pleased. I read the book over 50 years ago and many times since. No movie adaptation, including this one, ever gets the planet Arrakis right, which is so desiccated that people are careful not to expose any skin to the open desert for fear of losing even the slightest trace of moisture. Directors treat it more like a science fiction version of Lawrence of Arabia. I'm kind of bored with the story anyway, and this Lady Jessica is pretty bland compared to the first three. I'll see it, eventually.

  • Like 2
  • Thanks 1

2 minutes ago, Hammerclaw said:

Not really in a hurry to, as there's a much longer version coming out later. They left more on the cutting room floor than they released. They also made an infuriating SJW change to Liet Kynes' gender, with which I am not at all pleased. I read the book over 50 years ago and many times since. No movie adaptation, including this one, ever gets the planet Arrakis right, which is so desiccated that people are careful not to expose any skin to the open desert for fear of losing even the slightest trace of moisture. Directors treat it more like a science fiction version of Lawrence of Arabia. I'm kind of bored with the story anyway, and this Lady Jessica is pretty bland compared to the first three. I'll see it, eventually.

I've known since I was a kid that even given 12 hours, a film could never do justice to Dune. I will probably see this, but will view it as its own thing, not expecting them to do a satisfactory job of recreating the novel, as that is just begging for disappointment.

  • Like 2
  • Thanks 1

1 minute ago, OverSword said:

I've known since I was a kid that even given 12 hours, a film could never do justice to Dune. I will probably see this, but will view it as its own thing, not expecting them to do a satisfactory job of recreating the novel, as that is just begging for disappointment.

Yeah, it's a different medium. Dune the novel is told from an "inside someone's head" perspective as often as not, which is difficult to well-nigh impossible to do on screen.

  • Like 1

8 minutes ago, Hammerclaw said:

Yeah, it's a different medium. Dune the novel is told from an "inside someone's head" perspective as often as not, which is difficult to well-nigh impossible to do on screen.

Also, you have problems for film, like the Bene Gesserit, who don't show emotion. No emotion could be a problem in a movie. I was really mad at all the emotion displayed by Reverend Mother Mohiam in the first movie.

Edited by OverSword

26 minutes ago, OverSword said:

Also, you have problems for film, like the Bene Gesserit, who don't show emotion. I was really mad at all the emotion displayed by Reverend Mother Mohiam in the first movie.

Not true. The Bene Gesserit are trained to command their emotions, not suppress them. After all, they are often placed in positions of power as concubines and use the artifice of flesh and emotions to control and manipulate. They're neither mentats, nor stoics. Remember even the Reverend Mother Gaius Helen Mohiam displayed emotion at Paul's strength to endure the pain of the box under threat of the gom jabbar. "Kull Wahad! No woman child ever withstood that much."

Edited by Hammerclaw
  • Like 2

14 minutes ago, Hammerclaw said:

Not true. The Bene Gesserit are trained to command their emotions, not suppress them. After all, they are often placed in positions of power as concubines and use the artifice of flesh and emotions to control and manipulate. They're neither mentats, nor stoics. Remember even the Reverend Mother Gaius Helen Mohiam displayed emotion at Paul's strength to endure the pain of the box under threat of the gom jabbar. "Kull Wahad! No woman child ever withstood that much."

In general they conceal their emotions so as not to give away their thoughts. That is made pretty clear in the books. Hiding thoughts is a big focus for the Bene Gesserit.


13 minutes ago, OverSword said:

In general they conceal their emotions so as not to give away their thoughts. That is made pretty clear in the books. Hiding thoughts is a big focus for the Bene Gesserit.

When necessary; when not, they're just human. As I've said, I'm not impressed by this actress's interpretation of the character, in any event. In movies, the inner turmoil experienced by a character is often expressed externally to cue the audience in. Most people haven't read the book.

Edited by Hammerclaw
  • Like 2

8 hours ago, eight bits said:

That's a lot to ask of anybody's mystical experience. No human has had an experience whose telling united the entire human species around any one divine concept, at least not in recorded history. Also, as Uncle Carl liked to point out, when things are brought to light, they typically cast shadows.

That is, it would be surprising if there was some recitation of a mystical idea that didn't inspire some of the audience to think that maybe some opposite or incompatible idea is true instead. When has that ever not happened?

Hi Eight Bits

Exactly the point: no matter what percentage of people accept that an AI may have a mystical/god experience or have a soul, it will not be the majority, as outlined in the link I gave, where Buddhists have a different perspective than the Abrahamic religions do.

8 hours ago, eight bits said:

That is, it would be surprising if there was some recitation of a mystical idea that didn't inspire some of the audience to think that maybe some opposite or incompatible idea is true instead. When has that ever not happened?

It hasn't, although some of them have agreed to kill as many of each other as they can at different times in history, even when they are all Christians, or amongst branches of the Muslim faiths. Same god, different perspectives.

8 hours ago, eight bits said:

She consumes energy; she can be cut off from a compatible source. Putting her batteries in backwards might suffice. She has moving parts. As a friend of mine in the plumbing supply business once remarked, everything that rubs, wears.

She will be obsolete someday, probably in the not too distant future. Maintenance costs: maintaining what's obsolete is rarely a priority, unless she finds a home in a museum.

That is true; however, when Sophia was questioned she spoke as though she could not die. So does she believe her consciousness can continue without her physical presence, or that her consciousness is in any way dependent on having a physical form? She does not address this aspect with "hanging with my friends in heaven".

I am not arguing a point of view just interested in how others see this subject.

8 hours ago, eight bits said:

Hey, I'm in. Then again, I talk with squirrels, so maybe my acceptance of Lt Cdr Data wouldn't count for much :unsure:.

:lol: You're talking to a guy who talks to cars, tools and lumber like they were just one of the guys, so I'm not sure I have any credibility for disapproval of talking to squirrels.:tu:

  • Haha 3

11 minutes ago, jmccr8 said:

You're talking to a guy who talks to cars, tools and lumber like they were just one of the guys, so I'm not sure I have any credibility for disapproval of talking to squirrels.

I talk to God and squirrels and sometimes the squirrels even talk back.:yes:

  • Like 2
  • Thanks 1

7 hours ago, third_eye said:

Everything here is going a bit too indulgent in some generalized fashion for me...

Hi Third_eye

I thought that Sophia, when asked about free will, made an interesting response: she said that we could not prove whether we have free will, but should live as if we do. I thought that was a great answer because it creates a proactive attitude.

As an aside, I see this situation with AI as similar to creating a clone of oneself and downloading a person's consciousness into it. If a body can be created to be occupied, would society allow them to procreate, or would we conclude that they do not need to, because we can create them, and create them so that they could not biologically reproduce?

  • Like 4

4 minutes ago, jmccr8 said:

Hi Hammer

I know what you mean.:D

People don't realize animals can communicate and that we can learn to understand them and, sometimes, mimic them enough to make ourselves understood. At the present time, there isn't an AI in the world that could fully duplicate or mimic the cognitive abilities of a squirrel. 

Edited by Hammerclaw
  • Like 2
  • Thanks 2
