I am at a loss trying to understand why we as a species are not absolutely terrified at the thought of Super Intelligent AIs.
For the sake of argument, let's say biological systems can at best process a few hundred bits of information at a rate of a few kHz (a few thousand times per second).
Artificial systems, however, are capable of processing billions of bits of information trillions of times per second.
So a Super Intelligent AI (SIAI) could learn in a few minutes what it takes one human an entire lifetime to learn. In just a few weeks, an SIAI could absorb the entire scope of mankind's knowledge, and in a few months it would easily surpass us.
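Just to show the arithmetic behind that claim, here is a quick back-of-envelope calculation using the illustrative figures above. To be clear, these numbers are rhetorical assumptions from the argument, not measured values for any real brain or machine:

```python
# Back-of-envelope comparison using the post's illustrative figures
# (rhetorical assumptions, not measured values).

bio_bits = 300           # "a few hundred bits" per cycle (assumed)
bio_rate_hz = 3_000      # "a few kHz" (assumed)
ai_bits = 1e9            # "billions of bits" per cycle (assumed)
ai_rate_hz = 1e12        # "trillions of times per second" (assumed)

bio_throughput = bio_bits * bio_rate_hz   # bits per second
ai_throughput = ai_bits * ai_rate_hz      # bits per second

ratio = ai_throughput / bio_throughput
print(f"Speed-up factor: {ratio:.1e}")

# Under these assumptions, an 80-year human lifetime of processing
# compresses into:
lifetime_seconds = 80 * 365.25 * 24 * 3600
equivalent_seconds = lifetime_seconds / ratio
print(f"80 years of human processing ~= {equivalent_seconds:.2e} seconds")
```

If you take the figures at face value, the speed-up factor comes out around 10^15, which would compress a human lifetime of processing into a fraction of a second; so if anything, "a few minutes" is the conservative version of the claim. Of course, raw bit throughput is a very crude proxy for intelligence, which is exactly why these numbers should be read as a thought experiment rather than a forecast.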
Now, if we were just talking about learning and absorbing knowledge, that would be no big deal; there is no harm in that. However, we are talking about a thinking system, something that can reason on its own. It would learn how to reprogram itself and quickly move beyond our ability to even understand what it is doing and why.
Very quickly, in a matter of months, its intelligence would dwarf that of all mankind. In terms of intelligence, a clam would be to man as the whole of mankind would be to an SIAI.
So, why do we feel so "confident" that this godlike entity would not be a threat to mankind?
Am I saying it would destroy us? I don't know. How much consideration do we give the aphids in our gardens or the ants in a field we wish to build on?
In any event, it seems insane to me that we are on the threshold of creating a new species, a new lifeform, a thinking entity that will easily surpass us within a few months of its creation, and yet as a species we are ho-hum about it.
So I ask the question, Why?