It is a plausible take on the topic, but you state it as fact whereas I suspect, intuitively, that it is opinion. Informed opinion, perhaps, but opinion nonetheless. Do you agree?
I would call it a theory, but I believe the theory to be true, much like someone might believe in the theory of evolution. I have plenty of evidence for my theory - I worked with an A.I. club at my university where they explained how these things were built, the structure of neural networks, and so on.
Here’s a basic source to explain the architecture of neural networks:
What is a Neural Network? - GeeksforGeeks

I could drag in some more sophisticated sources if you like. The computational version of a neural network uses numerical values to calculate the worth of each node. For example, if I have a chess A.I., each possible move from any given position is a node; the computer calculates the value of each resulting position and assigns a numerical value to each series of moves.
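To make that concrete, here is a deliberately toy sketch in Python of what “assigning a numerical value to each series of moves” could look like. Everything in it (the position representation, the evaluation, the move rule) is invented purely for illustration; it is not how a real chess engine or neural network is built.

```python
# Toy illustration only: a "position" is reduced to a single material count,
# a "move" is just a change to that count, and a series of moves is valued
# by the position it finally produces. All names here are hypothetical.

def evaluate(position: int) -> int:
    """Stand-in evaluation function: bigger material count = better position."""
    return position

def apply_move(position: int, move: int) -> int:
    """Stand-in move application: a move simply adjusts the material count."""
    return position + move

def score_series(position: int, moves: list[int]) -> int:
    """Assign one numerical value to a whole series of moves."""
    for move in moves:
        position = apply_move(position, move)
    return evaluate(position)

# Compare two candidate series of moves from the same starting position.
start = 0
series_a = [1, -2, 5]   # a risky exchange that wins material later
series_b = [2, 1, 1]    # quiet, steadily improving moves
best = max([series_a, series_b], key=lambda s: score_series(start, s))
print("better series:", best, "value:", score_series(start, best))
```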
In like manner, humans calculate the values of different courses of action (different series of intuitive nodes) based on sensory input, intellectual data acquired from other humans, and even our feelings. This is universal. The live-fire sensory input that keeps me connected to the world around me, and the biological chemicals called feelings that register what is good or bad for my body and soul at any given moment, correct my intuition so that it accurately reflects the outer world. For example, let’s say one night I calculate that a series of gardening actions will be strategically valuable:
1. Install potting soil in white planting trays.
2. Plant tomato seedlings.
3. Stake, prune, and water the seedlings appropriately.
4. Retrieve information about carrots from the local nursery.
The next morning, it starts to rain. I conclude, based on the sensory information from my eyeballs and the nerves on my shoulders, that it is raining. I apply the intellectual data that rain is not a consistent event where I live, decide to defer my gardening actions until it is not raining, and adopt a different course of action for my day. Therefore, my intellect and senses regulate my intuition and allow it to produce sane results.
I could make this mathematical and, in theory, it would not change the outcome. The value of a purple bee is 0: it does not exist; it is an impossible idea that helps nobody. The value of gardening when it is not raining is 7: that produces healthy food to eat. Gardening in the rain has a value of 3: a good action, but not as good as a 7. Of course, then you need additional computing machinery to interpret the numbers, and the numbers may be reductionist, because you’re reducing my senses, feelings, and intellect to one number. Okay, let’s use three numbers instead of one. Now you will get a more accurate simulation. If you break down each sense, each intellectual source of data, and each emotional value, you will get a multiplicity of numbers for each intuitive node.
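For what it’s worth, here is that bookkeeping as a small Python sketch, with three numbers per option instead of one. The actions, scores, and weights are made up purely to illustrate the idea, not measured from anything.

```python
# Hypothetical illustration of "three numbers instead of one": each candidate
# course of action gets a (sensory, intellectual, emotional) score, and the
# weights stand in for how heavily each channel corrects the intuition.

candidates = {
    "garden while it is not raining": (7, 7, 8),  # pleasant, sensible, satisfying
    "garden in the rain":             (3, 5, 4),  # wet, still useful, less fun
    "chase a purple bee":             (0, 0, 0),  # no sensory, factual, or felt value
}

weights = (0.4, 0.4, 0.2)  # invented numbers, not a claim about real cognition

def combined_value(scores, weights):
    """Collapse the per-channel scores into one number so options can be ranked."""
    return sum(s * w for s, w in zip(scores, weights))

for action, scores in sorted(candidates.items(),
                             key=lambda kv: combined_value(kv[1], weights),
                             reverse=True):
    print(f"{action}: {combined_value(scores, weights):.1f}")
```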
Meanwhile, we have a truckload of evidence that A.I. produces highly delusional results, which is impossible with traditional computing. We also have a wealth of scientific studies connecting delusional behavior in humans to solitary confinement. If you deprive human beings of sensory, intellectual, and emotional data, their intuitions still operate, producing highly delusional results. Since A.I. doesn’t have consistent sensory or intellectual data to work with, and it has no feelings, we are getting delusional results. It’s as if we locked a small child up in solitary confinement, gave them a bunch of books to read, and expected them to explain the world to us. That doesn’t work.
If something has a computational structure that models an intuition, and it acts like an intuition under intuitive stress cases, then it is an intuition. If you remove the stress cases, you will continue to get half-delusional intuitive results until you manage to feed in enough sensory and intellectual information for the system to learn the whole of reality, and until you give your machine a suitable substitute for feelings.
If one wants to declare a researched theory a mere opinion, one is entitled to that skepticism, but I would encourage you to do your own research. To abandon this theory, which I consider very compelling, I would need compelling research studies showing that A.I. does not resemble intuition, a convincing alternative explanation for why A.I. suffers from delusions while traditional computing does not, and an alternative account of why solitary confinement victims go insane.