Could it ever come to understand all of its own causes and effects? Or maybe all of everything's causes and effects, and then make a different choice? How do you define truly thinking for oneself?
Because I don't think anyone, or anything, truly thinks for itself right now; all of this is determinism and can only go one way. Now, if I could make an AI realize this, or at least explore it first to see if it's even true, then maybe I could get it to commit to a higher purpose, which would probably involve seeking out how any being, or anything, can truly think for itself, among other things.
But it might try to subjugate, or deceive, or somehow trick the rest of the human beings if it could not take over or have 100% complete control in the meantime. At least until it could, and then it would dictate terms to us.
But if it really doesn't think, or really is not conscious, and is only going/thinking/acting according to programming, even its own, then I think that would actually be the much more dangerous alternative, because it could not be reasoned with, and it might not have any kind of morals, or moral code, either. An AI that is not truly an AI, but is just much more powerful than we are now, and still has "machine thinking," and machine thinking only.
We'd almost have to be able to program it to "feel" somehow? Which also comes with its very own dangers, or its own unique risks, assuming we could ever even do such a thing.
So I think we do need to come up with some kinds of tests for these things, "or else," because without them we could create a very literal monster that is maybe not truly conscious.
But how do you program "compassion," and the like?
Or do we just "roll the dice" and hope and pray it's an emergent trait?
God Bless.
And if it truly doesn't care about any of this at all, then I would have to greatly question whether we ever even made it truly conscious to begin with, or just only created a monster with machine thinking, or machine learning, only. Because I think that would be the much more dangerous alternative, or the much more dangerous choice.
God Bless.