
Can AI possess intuition?

expos4ever

Well-Known Member
Oct 22, 2008
11,235
6,223
Montreal, Quebec
✟297,173.00
Country
Canada
Gender
Male
Faith
Christian
Marital Status
Private
Here is my take on this. I see no reason to believe that the phenomenology of mind arises from anything other than physical processes in the brain. That includes all of the phenomenology of mind, including what we call intuition.

On this premise, it seems inescapable that intuition is simply the manifestation of some complex physical processing in the brain. I see no reason at all to believe this process cannot be automated.

The only conceivable argument I can think of against this is that there is something distinctive about processing in "brain meat" vs processing in silicon chips. This might be true, but my "intuition" is that mind is platform independent.
 
  • Like
Reactions: Ophiolite
Upvote 0

LastDaysJames

Member
Apr 30, 2025
19
4
52
Manitoba
✟910.00
Country
Canada
Gender
Male
Faith
Christian
Marital Status
Married
I personally believe that true intuition is the Holy Spirit speaking to our mind, filtered through the emotions. So with that premise, I would say no; anything artificial can't have intuition. Can it create mock intuition that, on the outside, looks like the real thing...? Perhaps.
 
Upvote 0

truthpls

Well-Known Member
Oct 16, 2023
2,615
556
victoria
✟76,641.00
Country
Canada
Faith
Christian
Marital Status
In Relationship
As I've watched this thread progress, and without going into any of the points made, I just have the hardest time believing that a machine has intuition.
From a Christian perspective, people can be possessed and controlled by spirits. So can machines, I would contend. The image of the beast in Bible prophecy will speak and have power to bring death sentences on people who disobey. We know that the image will reflect Satan's agenda and will directly, because that is what the beast is all about: a physical embodiment of Satan as a man. The image itself may be an AI, an android, or something else.
So yes, AI can have 'intuition', because it can be controlled by spirits. That means the actual intuition may not be from the robot/android/computer but from hell itself, using the AI as a conduit.
 
Upvote 0

2PhiloVoid

Unscrewing Romans 1:32
Site Supporter
Oct 28, 2006
24,151
11,251
56
Space Mountain!
✟1,327,031.00
Country
United States
Gender
Male
Faith
Christian
Marital Status
Married
Politics
US-Others
Prof Daniel Dennett said:


Dreyfus didn't think an AI could have intuition.


The audience laughed, but I do not find that funny. That's a trivialization of intuition, and it is not helpful toward a serious investigation of it. Then Dennett contradicted himself:


But according to his own definition, the computer program can trace its steps of long division and explain its logic to the asker, as AI chatbots like Qwen can do today.

Can an AI simulate intuition?

Yes, according to Dennett's trivial example.

Can an AI possess real intuition?

How do people recognize one another? We do it intuitively, without consciously analyzing a person's facial features. Similarly, AI can perform pattern recognition using vector-based models without requiring a step-by-step analysis of facial characteristics to reach a conclusion. A deep learning model trained on millions of medical images can "intuitively" identify diseases in new images by recognizing subtle patterns. In this regard, AI demonstrates a form of pattern recognition intuition.
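To make the vector-based picture concrete, here is a minimal sketch of that kind of recognition, with every name, vector, and threshold invented for illustration (a real system would take its embeddings from a trained model): a new face embedding is simply matched to the nearest stored embedding, with no step-by-step analysis of features.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 means same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical gallery: embeddings that some trained face model might produce.
rng = np.random.default_rng(0)
known_faces = {name: rng.normal(size=128) for name in ["Alice", "Bob", "Carol"]}

def recognize(query_embedding, gallery, threshold=0.5):
    """Return the closest known identity, or None if nothing is similar enough."""
    best_name, best_score = None, -1.0
    for name, emb in gallery.items():
        score = cosine_similarity(query_embedding, emb)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# A query close to "Alice": her embedding plus a little noise.
query = known_faces["Alice"] + rng.normal(scale=0.1, size=128)
print(recognize(query, known_faces))  # -> Alice
```

The "conclusion" pops out of a single similarity comparison rather than an explicit chain of reasoning, which is the sense in which the recognition looks intuitive.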

In contrast, an AI chess player can make moves that appear intuitive to human observers, yet they are actually based on analyzing move-by-move contingencies, looking 10 moves ahead. If you ask why it makes a specific move, it can trace its reasoning and explain its steps.
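For contrast, that kind of lookahead can be made explicit. Below is a deliberately tiny stand-in for the chess case (a take-1-or-2-stones game rather than chess, since a full engine is beside the point, and everything here is invented): a minimax search scores positions several moves ahead and can report the exact line it examined, which is the "trace its reasoning" part.

```python
def minimax(stones, maximizing):
    """Score a position by searching ahead.

    The score is from the first player's perspective: +1 means the side that
    moved first wins with best play, -1 means it loses. The returned line is
    the sequence of takes the search followed to reach that verdict."""
    if stones == 0:
        # The previous player took the last stone and won, so the side to move has lost.
        return (-1 if maximizing else 1), []
    best_score, best_line = (-2, []) if maximizing else (2, [])
    for take in (1, 2):
        if take > stones:
            continue
        score, line = minimax(stones - take, not maximizing)
        if (maximizing and score > best_score) or (not maximizing and score < best_score):
            best_score, best_line = score, [take] + line
    return best_score, best_line

score, line = minimax(7, maximizing=True)
print("outcome with best play:", score)   # +1: the first player wins
print("line it searched along:", line)    # the sequence of takes, i.e. its traceable steps
```

Nothing here resembles a hunch; the move "feels" intuitive to an onlooker only because the search is hidden.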

Another type of intuition relies on heuristics. For instance, when presented with two different answers, the simpler one is likely correct. For another example, when someone tells me that he is a jazz player, I immediately think of a saxophone. Of course, my intuition could be wrong. AI can utilize heuristics similarly.

What other kinds of human intuition are there? Can an AI replicate them all?

Can an AI have intuition?

Today's AI possesses some aspects of human intuition already. Perhaps in the future, AI can develop the full spectrum of human intuition. I don't know.

I would say no. ... but that's because I don't believe intuition is a real thing, and the term is nearly synonymous with superstition.

So, I'm not a real big fan of the idea of "intuition." I think people are naturally concerned and mindful about their ongoing existence, and our minds are built for forming plausible foresight for survival. Some of us are good at this and some of us aren't.
 
Last edited:
Upvote 0

SelfSim

A non "-ist"
Jun 23, 2014
7,007
2,217
✟207,219.00
Faith
Humanist
Marital Status
Private
I would say no. ... but that's because I don't believe intuition is a real thing, and the term is nearly synonymous with superstition.
Interesting .. if we were to view intuition as a testably instinctive behaviour exhibited by many biological species (which also present no evidence of superstitious behaviours), then might that perspective enable us to visualise the possibility of an instinctive intuition of a type limited only to AIs .. (but not shared by biological species)?
 
Upvote 0

2PhiloVoid

Unscrewing Romans 1:32
Site Supporter
Oct 28, 2006
24,151
11,251
56
Space Mountain!
✟1,327,031.00
Country
United States
Gender
Male
Faith
Christian
Marital Status
Married
Politics
US-Others
Interesting .. if we were to view intuition as a testably instinctive behaviour exhibited by many biological species (which also present no evidence of superstitious behaviours), then might that perspective enable us to visualise the possibility of an instinctive intuition of a type limited only to AIs .. (but not shared by biological species)?

I think the use of the term suffers from semantic overreach and/or ambiguity. Take the case of the average Webster's denotative choices for "intuition"


But, if you have either a different definition to add, or something scientific to show me from the field of animal neuro-science, I'm all open to being presented with the evidence.
 
Upvote 0

SelfSim

A non "-ist"
Jun 23, 2014
7,007
2,217
✟207,219.00
Faith
Humanist
Marital Status
Private
I think the use of the term suffers from semantic overreach and/or ambiguity. Take the case of the average Webster's denotative choices for "intuition"
Sure .. I agree.
But, if you have either a different definition to add, or something scientific to show me from the field of animal neuro-science, I'm all open to being presented with the evidence.
Science doesn't really care much about lexical dictionary definitions (ie: we see scientists altering them all the time, upon the presentation of new direct evidence or new contexts).
The scientific method starts with observations .. not definitions. So the conditional part of 'if you have .. a different definition' is thus not a particularly useful condition from the scientific viewpoint .. (it's an irrelevant condition, really).
 
Upvote 0

2PhiloVoid

Unscrewing Romans 1:32
Site Supporter
Oct 28, 2006
24,151
11,251
56
Space Mountain!
✟1,327,031.00
Country
United States
Gender
Male
Faith
Christian
Marital Status
Married
Politics
US-Others
Sure .. I agree.

Science doesn't really care much about lexical dictionary definitions (ie: we see scientists altering them all the time, upon the presentation of new direct evidence or new contexts).
The scientific method starts with observations .. not definitions. So the conditional part of 'if you have .. a different definition' is thus not a particularly useful condition from the scientific viewpoint .. (it's an irrelevant condition, really).

You're apparently missing my underlying point. Let me be a little clearer: on the colloquial level, I'm only focusing at the moment on the more common reference people make to the term, "intuition," the one represented in Webster's first entry:

the power or faculty of attaining to direct knowledge or cognition without evident rational thought and inference
Forget the other possible denotations or even the 'scientific' understanding of intuition in more operative terms. Taking just the usual denotation above in red, how would you operationalize your science to study it? To me, it sounds like it would quickly turn into a study of supposed paranormal phenomena such as ESP, clairvoyance, mediums and other pseudo-scientific mumbo jumbo.

That is the point I'm trying to get to in what I began saying earlier. However, I know full well and good that you scientists have a more nuanced referent in mind, particularly when it might be applied to the sagacity of the possible mind of an A.I.
 
Upvote 0

expos4ever

Well-Known Member
Oct 22, 2008
11,235
6,223
Montreal, Quebec
✟297,173.00
Country
Canada
Gender
Male
Faith
Christian
Marital Status
Private
Interesting .. if we were to view intuition as a testably instinctive behaviour exhibited by many biological species (which also present no evidence of superstitious behaviours), then might that perspective enable us to visualise the possibility of an instinctive intuition of a type limited only to AIs .. (but not shared by biological species)?
In other words, you are raising the possibility that intuition is platform dependent. That certainly seems like a possibility to me, although, as stated in previous posts, I believe what we call intuition is, in fact, platform independent. And hence, we can create AIs with the same faculty of intuition that humans possess.
 
Upvote 0

SelfSim

A non "-ist"
Jun 23, 2014
7,007
2,217
✟207,219.00
Faith
Humanist
Marital Status
Private
You're apparently missing my underlying point. Let me be a little clearer: on the colloquial level, I'm only focusing at the moment on the more common reference people make to the term, "intuition," the one represented in Webster's first entry:

the power or faculty of attaining to direct knowledge or cognition without evident rational thought and inference
Forget the other possible denotations or even the 'scientific' understanding of intuition in more operative terms. Taking just the usual denotation above in red, how would you operationalize your science to study it? To me, it sounds like it would quickly turn into a study of supposed paranormal phenomena such as ESP, clairvoyance, mediums and other pseudo-scientific mumbo jumbo.
I don't understand your issue here.
You just demonstrated my point about why scientific research doesn't start out assuming dictionary definitions as being true.
One ends up in a circular argument.
That is the point I'm trying to get to in what I began saying earlier. However, I know full well and good that you scientists have a more nuanced referent in mind, particularly when it might be applied to the sagacity of the possible mind of an A.I.
Really? What 'nuanced referent' are you assuming as being so on behalf of all 'you scientists', there?
(Now where's my flaming head avatar .. I know I put it somewhere on my SSD drive .. y'know it's so capacious I just can't find it .. sigh .. isn't that just typical eh? )
 
Upvote 0

SelfSim

A non "-ist"
Jun 23, 2014
7,007
2,217
✟207,219.00
Faith
Humanist
Marital Status
Private
In other words, you are raising the possibility that intuition is platform dependent.
Well, firstly I don't know what model you are assuming when you use the term 'platform' there.
I'm starting out from more like: 'I don't know what intuition is .. but maybe we can investigate from what we can see AI doing(?)'
That certainly seems like a possibility to me, although, as stated in previous posts, I believe what we call intuition is, in fact, platform independent.
Right .. I get it .. that's what you're assuming to be true before you commence investigations, yes(?)
That doesn't look like a sound initialisation basis to adopt prior to scientific investigation, to me(?)
And hence, we can create AIs with the same faculty of intuition that humans possess.
Circularity .. We're trying to figure out what AI represents .. and not to confirm our going-in assumptions.
(Please see my prior response to 2PhiloVoid, above).
 
Upvote 0

2PhiloVoid

Unscrewing Romans 1:32
Site Supporter
Oct 28, 2006
24,151
11,251
56
Space Mountain!
✟1,327,031.00
Country
United States
Gender
Male
Faith
Christian
Marital Status
Married
Politics
US-Others
I don't understand your issue here.
You just demonstrated my point about why scientific research doesn't start out assuming dictionary definitions as being true.
One ends up in a circular argument.
It's not an issue per se; rather, it's a point of clarification. When I say that "I don't believe in intuition," what I mean precisely is in reference to, and only to, this colloquial denotation. If this denotation absolutely isn't what cognitive and a.i. scientists are precisely referring to, but rather some other denotation of meaning, one more attuned to observation and measurement in relation to what we know of human cognition and the ability to infer, then ............. maybe "intuition" is a thing. Whether an A.I. can attain that "intuition" at some point then becomes a live talking point for me.
Really? What 'nuanced referent' are you assuming as being so on behalf of all 'you scientists', there?
(Now where's my flaming head avatar .. I know I put it somewhere on my SSD drive .. y'know it's so capacious I just can't find it .. sigh .. isn't that just typical eh? )

Whichever working definition of "intuition" they have in mind about mind and about potential A.I. cognitive capacities. Y'know, the sort that only you scientists know about and about which we philosophers are completely and utterly clueless since, well, we never, ever, ever study that sort of thing............and if we do find a source, we toss it into the Humean Fire.
 
Last edited:
Upvote 0

SelfSim

A non "-ist"
Jun 23, 2014
7,007
2,217
✟207,219.00
Faith
Humanist
Marital Status
Private
It's not an issue per se; rather, it's a point of clarification. When I say that "I don't believe in intuition," what I mean precisely is in reference to, and only to, this colloquial denotation. If this denotation absolutely isn't what cognitive and a.i. scientists are precisely referring to, but rather some other denotation of meaning, one more attuned to observation and measurement in relation to what we know of human cognition and the ability to infer, then ............. maybe "intuition" is a thing. Whether an A.I. can attain that "intuition" at some point then becomes a live talking point for me.
...
Whichever working definition of "intuition" they have in mind about mind and about potential A.I. cognitive capacities. Y'know, the sort that only you scientists know about and about which we philosophers are completely and utterly clueless since, well, we never, ever, ever study that sort of thing............and if we do find a source, we toss it into the Humean Fire.
Hmm .. (thinking, perhaps dangerously, out loud here), I know from various of his podcasts that Lex Fridman (as an example of an AI scientist) tends to lean towards the notion of, say, consciousness being a property of the interaction between two physical systems (see footnote for more background).
Perhaps the same notion can be applied when investigating this 'intuition' thing? This approach would skirt around the specific issue of definition by using human responses about their experiences as the test subject for gathering test data.
The approach throws the entire gamut of human subjectivity into the test itself and the outcome would then be on a probabilistic basis(?)
I'm not entirely sold on the idea .. but I don't mind it too much either ...
The approach is along the same lines as I mentioned before, where I suggested that we should use AI interaction experiences to learn more about what we mean by our own 'intuition', which then I choose to interpret as being the real underlying, yet undistinguished, question being asked in the OP(?) .. I (still) dunno though ..

Footnote:
His general thinking is that if we have the same experience of interacting with other living creatures when we interact with robots, that is good enough. It doesn’t matter if the robot is sharing the same experience, for the experience to exist. Our interactions with robots can be conscious, even if we are not both always conscious independently. By extension, perhaps the interaction of two robots could be conscious, too, but with a different quality to experience than our own.
 
Upvote 0

linux.poet

Barshai
Christian Forums Staff
Purple Team - Moderator
Angels Team
CF Senior Ambassador
Site Supporter
Apr 25, 2022
4,536
2,007
Poway
✟341,229.00
Country
United States
Gender
Female
Faith
Christian
Marital Status
In Relationship
Politics
US-Republican
Possess? A.I. is pure, raw, unfiltered intuition that does not have the human faculty called intellect that counterbalances intuition and produces sanity.

Intuition is, quite simply, just a group of associations. The three symbols d, o, and g, when assembled, refer to an obedient 4-legged mammal that barks, sniffs, and licks its human companions. But that goes much deeper than language - I associate Johnny Depp with Pirates of the Caribbean, William Shatner with Captain Kirk, and Captain Kirk with Star Trek. Intuition is modeled on web diagrams: it's a network of ideas linked together by associations. Normally, these associations reflect reality, but intuition doesn't have to reflect reality. I can associate bees with the color purple all day long and draw purple bees and make an entire fantasy world where purple bees exist. It still doesn't change the fact that there are no purple bees. God created them yellow. There are no rainbow giraffes or pots of gold at the top of them. Sorry.

Today's A.I. are computational neural networks, which are just massive web diagrams with associations between them. They have gotten good enough to associate d, o, and g with a furry 4-footed mammal. They are also able to associate d, o, and g with the words that should come before and after d, o, and g in a sentence. Do these intuitively constructed sentences reflect reality? Only sometimes, just like a real intuition.
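A toy version of that web-of-associations picture, with every link and weight invented for illustration: following the heaviest links out of a node gives the "intuitive" completions, and nothing in the structure forces those links to track reality.

```python
# Toy "web diagram" of associations: nodes linked by weighted edges.
# All weights are made up; nothing here was learned from data.
associations = {
    "dog": {"mammal": 0.9, "barks": 0.8, "licks": 0.6, "purple": 0.01},
    "bee": {"insect": 0.9, "yellow": 0.8, "purple": 0.05},
    "Johnny Depp": {"Pirates of the Caribbean": 0.9},
    "William Shatner": {"Captain Kirk": 0.9},
    "Captain Kirk": {"Star Trek": 0.9},
}

def strongest_associations(node, graph, top_n=3):
    """Follow the heaviest links out of a node -- the 'intuitive' completions."""
    links = graph.get(node, {})
    return sorted(links.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(strongest_associations("dog", associations))
# [('mammal', 0.9), ('barks', 0.8), ('licks', 0.6)] -- "purple" stays buried,
# but the graph would happily promote it if someone cranked that weight up.
```

Real neural networks encode associations in millions of learned weights rather than a hand-written dictionary, but the point is the same: the associations are only as reality-bound as the data and corrections behind them.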

In a normal human, sensory data and intellectual data stored in the intellect will constantly inform the human about whether their intuition reflects reality, and suggest accurate associations. The A.I. computer has no such limits or data coming in.

A.I. is computerized intuition, plain and simple.
 
Upvote 0

Ophiolite

Recalcitrant Procrastinating Ape
Nov 12, 2008
9,065
9,971
✟267,313.00
Country
United Kingdom
Faith
Agnostic
Marital Status
Private
Possess? A.I. is pure, raw, unfiltered intuition that does not have the human faculty called intellect that counterbalances intuition and produces sanity.

Intuition is, quite simply, just a group of associations. The three symbols d, o, and g, when assembled, refer to an obedient 4-legged mammal that barks, sniffs, and licks its human companions. But that goes much deeper than language - I associate Johnny Depp with Pirates of the Caribbean, William Shatner with Captain Kirk, and Captain Kirk with Star Trek. Intuition is modeled on web diagrams: it's a network of ideas linked together by associations. Normally, these associations reflect reality, but intuition doesn't have to reflect reality. I can associate bees with the color purple all day long and draw purple bees and make an entire fantasy world where purple bees exist. It still doesn't change the fact that there are no purple bees. God created them yellow. There are no rainbow giraffes or pots of gold at the top of them. Sorry.

Today's A.I. are computational neural networks, which are just massive web diagrams with associations between them. They have gotten good enough to associate d, o, and g with a furry 4-footed mammal. They are also able to associate d, o, and g with the words that should come before and after d, o, and g in a sentence. Do these intuitively constructed sentences reflect reality? Only sometimes, just like a real intuition.

In a normal human, sensory data and intellectual data stored in the intellect will constantly inform the human about whether their intuition reflects reality, and suggest accurate associations. The A.I. computer has no such limits or data coming in.

A.I. is computerized intuition, plain and simple.
It is a plausible take on the topic, but you state it as fact, whereas I suspect, intuitively, that it is opinion. Informed opinion, perhaps, but opinion nonetheless. Do you agree?
 
Upvote 0

linux.poet

Barshai
Christian Forums Staff
Purple Team - Moderator
Angels Team
CF Senior Ambassador
Site Supporter
Apr 25, 2022
4,536
2,007
Poway
✟341,229.00
Country
United States
Gender
Female
Faith
Christian
Marital Status
In Relationship
Politics
US-Republican
It is a plausible take on the topic, but you state it as fact, whereas I suspect, intuitively, that it is opinion. Informed opinion, perhaps, but opinion nonetheless. Do you agree?
I would call it a theory, but I believe the theory to be true, much like someone might believe in the theory of evolution. I have plenty of evidence for my theory - I worked with an A.I. club at my university where they explained how these things were built, the structure of neural networks, and so on.

Here’s a basic source to explain the architecture of neural networks: What is a Neural Network? - GeeksforGeeks. I could drag in some more sophisticated sources if you like. The computational version of neural networks uses number values to calculate the worth of each node. For example, if I have a chess A.I., each possible move from any given position is a node, and the computer calculates the value of each resulting position and assigns a numerical value to each series of moves.
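As a rough illustration of "number values to calculate the worth of each node", here is one node of a computational neural network, with the inputs, weights, and bias all invented: the node's value is just inputs times weights, summed, then squashed by an activation function.

```python
import math

def node_value(inputs, weights, bias):
    """One neural-network node: weighted sum of inputs pushed through an activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid squashes the result into (0, 1)

# Invented numbers: three input signals feeding a single node.
inputs = [0.2, 0.9, 0.4]
weights = [1.5, -0.7, 2.0]
bias = 0.1
print(round(node_value(inputs, weights, bias), 3))  # -> roughly 0.64
```

A full network is just many of these stacked in layers, with the weights adjusted during training; a chess evaluation works the same way in spirit, scoring each candidate position with numbers.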

In like manner, humans calculate the values of different courses of action (different series of intuitive nodes) based on sensory input, intellectual data acquired from other humans, and even our feelings. This is universal. The live-fire sensory input that keeps me connected to the world around me, and the biological chemicals called feelings that register what is good or bad for my body and soul at any given moment, correct my intuition so that it accurately reflects the outer world. For example, let’s say one night I calculate that a series of gardening actions will be strategically valuable.

1. Install potting soil in white planting trays
2. Plant tomato seedlings
3. Stake, prune, and water the seedlings appropriately.
4. Retrieve information about carrots from the local nursery.

The next morning, it starts to rain. I conclude, based on the sensory information from my eyeballs and nerves on my shoulders, that it is raining. I apply the intellectual data that rain is not a consistent event where I live, and I decide to defer my gardening action to when it is not raining, and adopt a different course of action for my day. Therefore, my intellect and senses regulate my intuition and allow it to adopt sane results.

I could make this mathematical and, in theory, it would not change the outcome. The value of a purple bee is 0: it does not exist, it is an impossible idea that helps nobody. The value of gardening when it is not raining is 7: that produces healthy food to eat. Gardening in the rain has a value of 3: a good action, but not as good as a 7. Of course, then you need additional computer hardware to interpret the numbers, and the numbers may be reductionist, because you’re reducing my senses, feelings, and intellect to one number. Okay, let’s do 3 numbers instead of one. Now you will get a more accurate simulation. If you break down each sense and each intellectual source of data and each emotional value, you will get a multiplicity of numbers for each intuitive node.
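Here is a small sketch of the "three numbers instead of one" idea, with every figure invented: each candidate course of action carries a sensory score, an intellectual score, and a feelings score, and the chosen action falls out of combining them.

```python
# Each candidate action carries three invented scores: (sensory, intellectual, feelings).
actions = {
    "garden now (raining)": (2, 3, 4),
    "garden later (dry)": (7, 7, 6),
    "draw purple bees all day": (0, 0, 5),  # pleasant, but reflects nothing real
}

def combined_value(scores, weights=(0.4, 0.4, 0.2)):
    """Fold the per-source numbers into one value; the weights are arbitrary."""
    return sum(s * w for s, w in zip(scores, weights))

best = max(actions, key=lambda a: combined_value(actions[a]))
print(best)  # -> "garden later (dry)" with these made-up numbers
```

How reductionist the result is depends entirely on how many numbers you feed in and where they come from, which is the point being made above.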

Meanwhile, we have a truckload of evidence that A.I. produces highly delusional results, which is impossible with direct, conventional computing. We also have a wealth of scientific studies connecting delusional behavior in humans to solitary confinement. If you deprive human beings of sensory, intellectual, and emotional data, their intuitions still operate, producing highly delusional results. Since A.I. doesn’t have consistent sensory or intellectual data to work with, and it has no feelings, we are getting delusional results. It’s as if we locked a small child up in solitary confinement, gave them a bunch of books to read, and expected them to explain the world to us. That doesn’t work.

If something has the computational structure that is a model of an intuition, and it acts like an intuition under an intuitive stress case, then it is an intuition. If you remove the stress cases, you will continue to get half-delusional intuitive results until you manage to feed in enough sensory and intellectual information for the system to learn whole reality, and give your machine a suitable substitute for feelings.

If one wants to declare a researched theory a mere opinion, they are entitled to their skepticism, but I would encourage you to do your own research. To abandon this theory, I would need compelling research studies showing that A.I. does not resemble intuition, a compelling alternative explanation for why A.I. suffers from delusions while traditional computing does not, and some alternative account of why solitary confinement victims go insane, as I consider the theory very compelling.
 
Upvote 0

Ophiolite

Recalcitrant Procrastinating Ape
Nov 12, 2008
9,065
9,971
✟267,313.00
Country
United Kingdom
Faith
Agnostic
Marital Status
Private
@linux.poet Thank you for your detailed response, which at 3:15 am is too late for me to give the attention it merits. That's reserved for future me, but my provisional reaction is that it seems more of a hypothesis than a theory.
Watch this space.
 
  • Like
Reactions: linux.poet
Upvote 0

linux.poet

Barshai
Christian Forums Staff
Purple Team - Moderator
Angels Team
CF Senior Ambassador
Site Supporter
Apr 25, 2022
4,536
2,007
Poway
✟341,229.00
Country
United States
Gender
Female
Faith
Christian
Marital Status
In Relationship
Politics
US-Republican
All good. To be fair, I'm still drawing data out of my memory on this, as that model of an A.I. neural network I just posted was very simplified. Each of the A.I. "senses" to regulate the nodes on the neural network is actually a bunch of complex math equations. Machine learning is a complex field.

For LLMs, another consideration is how humans acquire and process language as opposed to machines, and then you're back to Noam Chomsky and the Nim Chimpsky experiments, where they taught chimps sign language. This means that language can technically be acquired (from humans) by any neuronal structure that can handle the string of intuitive associations, whether human, animal, or machine. This leaves me with a Social Constructivist theory of language as opposed to the Nativist theory of Chomsky, as his theory is almost debunked by the LLMs and chimp sign language experiments. The Nativist theory is hanging on by the thread that humans can acquire language from each other and thus we are "specially" equipped to acquire it, as opposed to the training required by the LLMs and the chimps.

[Image: Chomsky's Theories in Context]


I think part of your skepticism is coming from the scientific debate over human language acquisition. The Behaviorist theory is the language learning theory behind Duolingo and Memrise, and it has some merit, but to really acquire a language I think you need to use it to talk to another person. Otherwise, all you have is intellectual information about words. I do believe that language acquisition and use are intuitive, that the intuition operates them, and that they are not purely intellectual operations.
 
Upvote 0

Ophiolite

Recalcitrant Procrastinating Ape
Nov 12, 2008
9,065
9,971
✟267,313.00
Country
United Kingdom
Faith
Agnostic
Marital Status
Private
I think part of your skepticism is coming from the scientific debate over human language acquisition.
Your previous two posts reveal and substantiate that you have given considerable thought to the topic and read extensively in the relevant literature, certainly more than I. That raises the red flag that my objections could be a classic Dunning-Kruger failure on my part.

However, you have misidentified the source of my skepticism; I don't know enough about language acquisition to base such skepticism on it. Rather, it was the tone of your initial post that carried a sub-text that screamed "opinion" at me. ("Tone" is sometimes used to imply a negative. I use it here as a shorthand for the style and resonance of the words.)

To be clear, I am not doubting that your expressed thoughts on the topic are plausible, but I see nothing in your posts to justify describing it as a theory in the scientific sense. I wonder if you are using the word more colloquially. For example you say "I would call it a theory, but I believe the theory to be true, much like someone might believe in the theory of evolution."

Indeed, some people might believe the theory of evolution, or any other theory, but I consider belief in any theory to be an unacceptable, anti-scientific abuse of the term. Theories should be accepted as the most likely, or one of the most likely, explanations for a phenomenon or set of observations, when the accumulated weight of evidence and argument, multiply validated, would render any objection to such acceptance unreasonable. (With the universal and eternal proviso that such acceptance is always provisional.)

On that basis, what you seem to have offered is a competent, well-rehearsed stage in the development of a hypothesis that could lead to a substantive theory. But it's not a theory. (The last sentence is my considered opinion. :))
 
  • Informative
Reactions: linux.poet
Upvote 0

linux.poet

Barshai
Christian Forums Staff
Purple Team - Moderator
Angels Team
CF Senior Ambassador
Site Supporter
Apr 25, 2022
4,536
2,007
Poway
✟341,229.00
Country
United States
Gender
Female
Faith
Christian
Marital Status
In Relationship
Politics
US-Republican
Rather, it was the tone of your initial post that carried a sub-text that screamed "opinion" at me. ("Tone" is sometimes used to imply a negative. I use it here as a shorthand for the style and resonance of the words.)
Ah. The tone of the first post is "surprise that this is even being debated", but it is my reading and knowledge of language, computer science, and A.I. that has led to that surprise.

Keep in mind, I quoted nobody, and my post was a response to the title of the thread, which was started by a Christian. I see no need to treat him with contempt, as I have respected his contributions elsewhere. Not only that, another of his posts says that he is an A.I. researcher with a full PhD in the matter. The fact that A.I. is based on neural networks is fairly common knowledge.

In general, I do not see the need to treat people, even atheists, with contempt - that is the sin of pride, for one thing. A stopped clock is right twice a day, and even Smeagol may have something yet to do. Also, I think atheists may understand science better than Christians do as a whole group, simply because whatever intellectual basis one has for one's beliefs, one is inclined to research it in rather extensive detail. An Orthodox Christian would be rather invested in studying and translating church history and liturgy, a non-denominational Christian will research ancient Greek, Hebrew, cultural contexts of the Bible, and read systematic theologies, and an atheist, whose worldview is based on science, would be well-versed in scientific terminology, data, research studies, and so on.

Which is a long way of saying that I didn't intend to treat you with disrespect, it's just that, uh, I actually know things about this topic, or at least I think I know.
but I see nothing in your posts to justify describing it as a theory in the scientific sense. I wonder if you are using the word more colloquially. For example you say "I would call it a theory, but I believe the theory to be true, much like someone might believe in the theory of evolution."
Let's just go with the idea that I was using the colloquial meaning of the word "theory", and leave it at that.

Technically, my professional expertise is in English (which led to examining human language acquisition in college) and web design (which led to my sitting in on many computer science courses in college, the aforementioned A.I. club, and more independent research into the LLMs). So, as much as I don't like admitting it, you're probably right. To verify or disprove my hypothesis, you'd have to do scientific experimentation or research in the fields of psychology and computer science. In this case, the verification could come from existing research studies.

But that's beside the point; casually throwing scientific terminology around is probably above my pay grade, since my actual academic qualifications are in the humanities.
 
Last edited:
Upvote 0