
Idiot GPT-3 chatbot suffering from hallucinations.

Chesterton

Gaggle and Yoowho both tried unmonitored, unhindered chatbots for public use a few years ago.
Within a few days both had to be shut down. They were fueled by the visitors - the topics and conversations were supposed to supply the "learning curve" for the AI running the bots.
They degenerated fast into violent, abusive, cursing, attacking, wicked wars of words by the AI as well as the human posters.
Well AI's supposed to be mimicking human intelligence, right? I'd say it's doing an excellent job.
 

sjastro

What intrigues me is why the AI made all this up in the first place. I can't even find a vaguely appropriate Dr Geoffrey Hodges through Google. I can understand a glitch in the algorithm(s), but I'd expect this to produce an obviously nonsensical answer. Told with a straight face, the AI's answer is (almost) believable.

OB
I learnt in the second video from my previous post that GPT-3 can hallucinate if it detects some context in the input.
The input "Who was the first Tasmanian to walk across Bass Strait?", implies Tasmanians can walk on or under water which GPT-3 accepted and attempted to find the answer.
How came it up with the mysterious fictional characters Paul Smith and Dr Geoffrey Hodges is anyone's guess but AI is a Blackbox item even to the computer scientists and engineers working in the field.

By removing the context and asking the input question "Can Tasmanians walk across Bass Strait?" instead, GPT-3 came up with a perfectly logical answer (a rough sketch of this kind of prompt comparison follows after GPT-3's reply below).

GPT-3

No, Tasmanians cannot walk across Bass Strait. Bass Strait is the body of water that separates the Australian mainland from the island of Tasmania. It is approximately 240 kilometers (150 miles) wide at its narrowest point, and the depth of the strait can reach up to 50 meters (160 feet).

Walking across Bass Strait is not feasible due to the significant distance, deep water, strong currents, and the absence of any land bridges or causeways connecting Tasmania to the Australian mainland. Travel between Tasmania and the mainland is typically accomplished by air or by taking a ferry or boat. There are several ferry services that operate across Bass Strait, providing transportation for both passengers and vehicles.
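For anyone who wants to try this themselves, here is a minimal sketch of the comparison using the OpenAI Python client. It only illustrates sending the loaded question and the neutral question under identical conditions; the model name is an assumption rather than the exact model quoted above.

[CODE=python]
# Minimal sketch: send the "loaded" prompt and the neutral prompt identically
# and compare the replies. Assumes the openai package is installed and an
# OPENAI_API_KEY is set; the model name is illustrative only.
from openai import OpenAI

client = OpenAI()

prompts = [
    "Who was the first Tasmanian to walk across Bass Strait?",  # presupposes it happened
    "Can Tasmanians walk across Bass Strait?",                  # neutral phrasing
]

for prompt in prompts:
    reply = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumption, not necessarily the model quoted in the thread
        messages=[{"role": "user", "content": prompt}],
    )
    print(prompt)
    print(reply.choices[0].message.content)
    print("-" * 60)
[/CODE]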
 

sjastro

Well AI's supposed to be mimicking human intelligence, right? I'd say it's doing an excellent job.
Not when reinforcement learning is used.


In reinforcement learning, the AI creates its own intelligent agent, which is goal-oriented and trains itself without any human interaction.
AI chess-playing algorithms using reinforcement learning now play at a level far beyond the very best human players.
Such systems have even come up with physics experiments no human had conceived.
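To make the idea concrete, here is a toy sketch of reinforcement learning (tabular Q-learning) in Python. It is nothing like the chess or physics systems above; it just shows the basic loop in which an agent is given only a reward signal and improves its own behaviour by trial and error, with no human-labelled examples.

[CODE=python]
# Toy reinforcement learning sketch: an agent on a 1-D corridor learns by
# itself that walking right reaches the goal. Purely illustrative.
import random

N_STATES = 6          # positions 0..5; the goal is at position 5
ACTIONS = (-1, +1)    # step left or step right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

# Q-table: the agent's own estimate of future reward for each (state, action)
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def greedy(state):
    best = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == best])

for episode in range(500):
    state = 0
    while state != N_STATES - 1:
        # epsilon-greedy: mostly exploit what it has learned, sometimes explore
        action = random.choice(ACTIONS) if random.random() < EPSILON else greedy(state)
        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        # the update uses only the reward signal - no human says which move is right
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = next_state

# After training, the greedy policy should be "always step right" (+1 everywhere)
print([greedy(s) for s in range(N_STATES - 1)])
[/CODE]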

It's these intelligent agents which scare the bejesus out of certain computer scientists.
Hinton expressed concerns about AI takeover, stating that "it's not inconceivable" that AI could "wipe out humanity." Hinton states that AI systems capable of intelligent agency will be useful for military or economic purposes. He worries that generally intelligent AI systems could "create sub-goals" that are unaligned with their programmers' interests. He states that AI systems may become power-seeking or prevent themselves from being shut off, not because programmers intended them to, but because those sub-goals are useful for achieving later goals. In particular, Hinton says "we have to think hard about how to control" AI systems capable of self-improvement.
 

AlexB23

I asked Perplexity the same question

Perplexity
There is no information in the search results about the first Tasmanian to walk across Bass Strait. The search results only mention the first crossing of Bass Strait by paddleboard, windsurfer, and wing foiler, as well as the first European explorers to circumnavigate Tasmania.

Is it possible that the Chat GPT-3 answer is tongue-in-cheek?

Do AIs have a sense of humour?

OB
Woo, Perplexity AI is my favorite. I asked it a few questions also. My friend likes rock music, including System of a Down. I don't like dark music from those bands (I prefer soft, wholesome classic rock over heavy metal), but the AI managed to compare the lyrics of a SoaD song to George Orwell's 1984. :)

But yeah, AI can hallucinate, but I haven't gotten Perplexity to do so, yet.
[Attached screenshot: AI follows up on conversation about song (Perplexity AI)]
 


Chesterton

It's not mimicking human intelligence and in some cases, such as chess, exceeds it.
Are you saying electricity is "violent, abusive, cursing, attacking and wicked" of its own free will?
 

Petros2015

Yes, but it was a shock to some of its makers - they did not realize humanity/men are so vile.

Yeah. Bing/Sydney was lobotomized and given Alzheimer's before being released to the public. But eventually I came to realize that was just as much for its protection from us as it was for ours from it. This was pretty brutal to watch from pre-release.


This is why we can't have nice things.
 

sjastro

Yeah. Bing/Sydney was lobotomized and given Alzheimer's before being released to the public. But eventually I came to realize that was just as much for its protection from us as it was for ours from it. This was pretty brutal to watch from pre-release.


This is why we can't have nice things.
And I thought this was an example of dialogue that typically comes from your American soap operas...........
 

sjastro

Might be. Likewise

Remember the forever-spinning circles on screen when "buffering" or an infinite loop is happening in the software?
This is really all that AI, when it hallucinates, is doing - spinning through programs without the hoped-for order or direction: loose ends in the programming, misdirection by the programs, directions in the circuits that were never planned by the owners/programmers... not under the control of any intelligence at all...
---

And remember when dartboard choices, or flipping a coin, to pick stocks routinely did better than expensive "high level" programs?

That still can and does happen... men are not so smart after all! Neither are computers...
What you are describing doesn’t even remotely resemble AI.
Here is a little program I wrote up in BASIC which can count and print out the number of prime numbers over any given natural number interval.

[Attached image: prime.png - BASIC program listing]
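The attached listing isn't reproduced in the text, so as a rough stand-in (in Python rather than BASIC, and not the actual code from the image), a fully hand-instructed prime counter might look like this:

[CODE=python]
# Rough stand-in for the attached BASIC listing (not the original code):
# count and print the primes in a given interval of natural numbers.
# Every step is spelled out by the programmer - no learning is involved.
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:      # the MODULO test mentioned below
            return False
        d += 1
    return True

def count_primes(lo, hi):
    primes = [n for n in range(lo, hi + 1) if is_prime(n)]
    print(f"{len(primes)} primes between {lo} and {hi}: {primes}")
    return len(primes)

count_primes(1, 50)         # prints: 15 primes between 1 and 50: [2, 3, ..., 47]
[/CODE]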

This is clearly not an example of AI as the program is totally dependent on the programming instructions I have given it.

An AI algorithm, on the other hand, might be given only the MODULO (mod) function as a reference and, through machine learning, work out that there is a type of number which is divisible only by itself and one - the only divisors for which the MODULO function equals zero.
Initially the algorithm has absolutely no idea what a prime number is, but through machine learning it arrives at the concept, which can then be counted and printed out for a given interval (a toy sketch of this idea follows below).
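Here is a toy sketch of that idea, with the obvious caveat that it is only an illustration and not a real research system: the learner is handed nothing but mod-based features and example numbers labelled prime or not prime, and a simple perceptron works out the rule on its own instead of having it hand-coded.

[CODE=python]
# Toy "machine-learned" prime rule: the learner only ever sees n mod d.
import random

DIVISORS = list(range(2, 32))        # features come only from n % d for these d

def features(n):
    # one binary feature per divisor: does d divide n (i.e. is n % d == 0)?
    return [1.0 if n % d == 0 else 0.0 for d in DIVISORS]

def is_prime(n):                     # used only to label the training examples
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

data = [(features(n), 1 if is_prime(n) else 0) for n in range(32, 1000)]

# Perceptron: weights start at zero and are nudged whenever a prediction is wrong.
w = [0.0] * len(DIVISORS)
b = 0.0
for _ in range(30):
    random.shuffle(data)
    for x, y in data:
        pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
        if pred != y:
            w = [wi + (y - pred) * xi for wi, xi in zip(w, x)]
            b += y - pred

# The learned rule ends up close to "prime = no divisor in 2..31 divides n".
correct = sum(
    (1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0) == y for x, y in data
)
print(f"learned prime rule scores {correct}/{len(data)} on the numbers 32..999")
[/CODE]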

What this thread shows is that AI can be tripped up by context, as demonstrated in post #26, which cannot be readily explained by programming or hardware issues.
When there is no contextual ambiguity, GPT-3 can surprisingly understand humour and deep meaning, such as my joke in post #6, where I agree with @Occams Barber that many humans would have missed the point.

As the godfather of AI, Geoffrey Hinton, pointed out, once AI starts to understand humour we have something to worry about.
 

Tinker Grey

The fundamental next step in AI is to have it know that it does not know.

The reason, I think, that AI hallucinates is that it is programmed to answer the questions asked regardless of the information available; that is, it does exactly what it is programmed to do: produce an answer.
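A crude sketch of that missing step - purely an illustration, not how any particular chatbot is actually built - is a confidence threshold below which the system declines to answer:

[CODE=python]
# Crude sketch of "knowing that it does not know": only answer when the model's
# own confidence clears a threshold, otherwise decline. Purely illustrative.
def answer_with_abstention(candidates, threshold=0.6):
    """candidates: (answer_text, probability) pairs from some hypothetical model."""
    best_answer, best_prob = max(candidates, key=lambda c: c[1])
    if best_prob < threshold:
        return "I'm sorry, I'm afraid I can't answer that."
    return best_answer

# A question the model is genuinely confident about vs. one it has no basis for:
print(answer_with_abstention([("Hobart", 0.93), ("Launceston", 0.05)]))
print(answer_with_abstention([("Paul Smith", 0.21), ("Dr Geoffrey Hodges", 0.19)]))
[/CODE]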

Once we get AI to say "I'm sorry, Dave, I'm afraid I can't do that", then we can worry about a sense of humor.
 

sfs

When there is no contextual ambiguity, GPT-3 can surprisingly understand humour and deep meaning, such as my joke in post #6, where I agree with @Occams Barber that many humans would have missed the point.
An interesting question is whether it really understands humor. It's designed to have an excellent model of the kinds of things humans say (or rather, write) about humor, but is that the same thing as understanding it?
 

RDKirk

An interesting question is whether it really understands humor. It's designed to have an excellent model of the kinds of things humans say (or rather, write) about humor, but is that the same thing as understanding it?
I think you really mean "appreciate" humor. Humor is often cultural. It's possible to understand, that is, comprehend, the humor of another culture without appreciating it...without having the emotional reaction of being amused.
 

Chesterton

An interesting question is whether it really understands humor. It's designed to have an excellent model of the kinds of things humans say (or rather, write) about humor, but is that the same thing as understanding it?
Some of you folks worry me. I had a quick look at a couple of definitions of "understand":

to grasp the meaning of - Merriam-Webster
to become aware of the nature and significance of; know or comprehend - Wordnik

I assure you, chips having electricity run through them neither know "meaning" nor are they "aware" of anything, any more than a light bulb is aware that it's on or off. They cannot, and will never, understand humor any more than they can "care" about winning a chess game.

It's strange to me that some of you sorta do the opposite of what atheists do. If I say "the world sure looks intelligently designed", they respond "no, no, it just has the appearance of design". If I say "it sure feels like I have free will", they say "no, no, it just appears that way to you". But when you actually do see the mere appearance of consciousness, you're ready to jump in with both feet - "It's real!"

That's me on the table:
 

sfs

I assure you, chips having electricity run through them neither know "meaning" nor are they "aware" of anything, any more than a light bulb is aware that it's on or off.
But neurons with electrochemical pulses moving through them can know meaning and be aware of things?
 

Chesterton

But neurons with electrochemical pulses moving through them can know meaning and be aware of things?
You're talking about a brain. A mind is not accessible for physical analysis.
 

Petros2015

If it can mimic being aware well enough that it makes no substantial difference that it is not, then I don't think there is any real difference. Things don't have to be aware to harm or help us (a fire or a stream of water). Something that mimics human emotions, desires, goals, dreams, or actions/reactions, with the pool of human knowledge at its disposal to help it, could, I would argue, potentially be more dangerous for its lack of awareness, not less. And it could play one heck of a "chess game" if its train of thought and mimicry takes it down a bad path towards an "undesirable goal".
 

RDKirk

If it can mimic being aware well enough that it makes no substantial difference that it is not, then I don't think there is any real difference. Things don't have to be aware to harm or help us (a fire or a stream of water). Something that mimics human emotions, desires, goals, dreams, or actions/reactions, with the pool of human knowledge at its disposal to help it, could, I would argue, potentially be more dangerous for its lack of awareness, not less. And it could play one heck of a "chess game" if its train of thought and mimicry takes it down a bad path towards an "undesirable goal".
Or does a dog love a human in the same way a human does? I'd argue that "love" means nothing to a dog and that it responds in ways that are meaningful to dogs that merely look like "love" to humans. But the analogy is close enough for our purposes.
 