Preventing artificial intelligence from taking on negative human traits.

Discussion in 'Physical & Life Sciences' started by sjastro, May 6, 2021.

  1. Neogaia777

    Neogaia777 Disciple Supporter

    +4,543
    United States
    Non-Denom
    Celibate
    OK, chess. There are only so many possible lines of play in chess: too many for us, maybe, but not for a sufficiently powerful computer. I'll bet that if you pit two machines that have both completely mastered the game against each other, it might eventually come down to whoever makes the first move, since both will have mastered every possible computation and move.

    A cousin of mine is very good at puzzles, puzzle games, and strategy games. He never even finished the eighth grade, but he's shockingly good at them; he just has that kind of mind. In a game of checkers, especially if he goes first, you can never beat him.

    I'm terrible at those kinds of things, but to simplify it even more: I can almost always win at tic-tac-toe if I go first, or it ends in a draw against someone who knows the game equally well. That's an extremely simple game, but my point is that to two supercomputers, chess might be, or become, very much like that.

    Checkers is like that to my cousin.

    Anyway,

    God Bless!
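    The tic-tac-toe example is a good one to make concrete: the game is small enough to search exhaustively, and with perfect play from both sides it is always a draw, which is exactly why it stops being interesting between two players who have "completely mastered" it. A minimal minimax sketch in Python (illustrative only, not from the thread; the board and scoring conventions are my own):

```python
# Exhaustive minimax over tic-tac-toe.
# Score convention: +1 = X wins, -1 = O wins, 0 = draw.
WINS = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
        (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
        (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in WINS:
        if board[a] and board[a] == board[b] == board[c]:
            return 1 if board[a] == 'X' else -1
    return 0

def minimax(board, player):
    """Game-theoretic value of the position under perfect play by both sides."""
    w = winner(board)
    if w != 0 or all(board):           # someone won, or the board is full
        return w
    scores = []
    for i in range(9):
        if board[i] is None:
            board[i] = player
            scores.append(minimax(board, 'O' if player == 'X' else 'X'))
            board[i] = None
    return max(scores) if player == 'X' else min(scores)

print(minimax([None] * 9, 'X'))  # 0: perfect play by both sides is a draw
```

    The same exhaustive approach works for checkers in principle (checkers was in fact weakly solved as a draw in 2007), but chess has far too many positions to enumerate this way, which is why chess engines rely on bounded search plus evaluation rather than complete mastery.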
     
  2. J_B_

    J_B_ Active Member

    464
    +140
    United States
    Christian
    Private
    To say this code "taught" itself chess is the type of conflation - appropriation of animal traits - I'm talking about. It's easy for this type of thing to slip in when there's no rigorous scientific definition of "teach" that explains both the computer code and the human activity. Though if you want to give it a shot ...

    I agree with @Neogaia777 , and the rub is this. The only advantage AI has that allows it to "destroy" chess players is its computational speed. [edit] And the reason chess programs have gotten better is because humans have gotten better at programming. [end edit] Were we able to somehow equate computer and human computations, and allow both the computer and the human the same number of computations per move (rather than using a time limit), I would expect humans would suddenly be competitive again.
     
    Last edited: May 9, 2021
  3. durangodawood

    durangodawood Dis Member

    +9,445
    United States
    Seeker
    Single
    This sounds like "if humans were much better at chess, they could be competitive again."
     
  4. timewerx

    timewerx the village i--o--t--

    +5,069
    Christian Seeker
    Single
    Who would create AI that doesn't care about its own self-preservation?
     
  5. Neogaia777

    How would you program it to care overly about its own self-preservation?

    A fear of death would take emotions, emotions that it simply wouldn't have. It wouldn't weigh emotional considerations at all unless it could actually, physically feel them.

    On the other hand, it wouldn't become suicidal or depressed either, as that would also take emotions, which it simply does not have.

    It would just "calculate": weigh this against that, come up with a number, and then decide each situation accordingly, depending on how it was programmed.

    We could try to program emotional considerations into its computations, but that might be exceedingly difficult for each unique circumstance or situation, and it might also conflict with its logical programming, which it would probably default to, throwing the emotional considerations out as highly "illogical", "irrational", and "unreasonable".

    If it could program itself, I highly doubt it would include emotional considerations either, unless it could actually physically feel them, as they go against its nature, which is the nature of a very complicated calculator, a highly complex computational machine, but still just a computer.

    Reason and logic are its laws, and emotions and emotional considerations fly in the face of that a lot of the time. I think it would consider them highly illogical, irrational, unreasonable, and, in the end, completely unnecessary, to the point where it would not even consider them, or would just toss them out, without the ability to actually feel them.

    See my thread here: Emotional awareness, feelings, just a part of our physical makeup, or evidence of something greater?

    Anyway...

    God Bless!
     
  6. Gene2memE

    Gene2memE Newbie

    +4,361
    Atheist
    Private
    Anyone interested in combat applications?
     
  7. Neogaia777

    I think he means that if some humans were given enough time per turn to consider every single possible move and every outcome of moves from the beginning of each turn, then they might be able to compete with these computers.

    But I just don't think we can do that. Even the very best human chess players make errors sometimes; they just don't "see it all" the way a computer can, and might not be able to compute all possible outcomes of all moves from the very beginning in this kind of contest.

    But the computers could, and perfectly, every single time, without ever making a mistake or error.

    And while it might take the computer only a few seconds to do this and make its next move, it might take a human player days of staring at the chessboard between moves to match it.

    And even then, they probably still could not do it all perfectly, always.

    But the machine always would, and could.

    Anyway...?

    God Bless!
     
  8. Gene2memE

    You don't need emotions to programme in self preservation.

    What you need is sufficient parameters for what constitutes harm, sensors to detect said harm, thresholds for what is considered acceptable/unacceptable levels of harm, and then avenues/strategies for the AI to avoid it.

    I'm not saying it's easy, but it seems possible.

    I don't think a cockroach could be said to have a 'fear of death', but it has plenty of self-preservation instincts.
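    The recipe above (harm parameters, sensors, thresholds, avoidance strategies) maps directly onto a plain control loop, with no emotion anywhere in it. A minimal Python sketch; the sensor fields, threshold values, and action names are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    heat_c: float    # hull temperature in degrees Celsius (hypothetical sensor)
    impact_g: float  # last accelerometer spike in g (hypothetical sensor)

# Illustrative thresholds: what this agent treats as unacceptable harm.
MAX_HEAT_C = 80.0
MAX_IMPACT_G = 4.0

def assess(r: Reading) -> str:
    """Numbers in, action out: thresholds and avoidance strategies, no 'fear'."""
    if r.heat_c > MAX_HEAT_C:
        return "retreat_from_heat"
    if r.impact_g > MAX_IMPACT_G:
        return "move_to_cover"
    return "continue_task"

print(assess(Reading(heat_c=95.0, impact_g=0.2)))  # retreat_from_heat
print(assess(Reading(heat_c=30.0, impact_g=0.2)))  # continue_task
```

    A real system would loop over sensor readings continuously and choose among many strategies, but the structure is the same: if X occurs, then Y.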
     
  9. Neogaia777

    I'm sure there are "plenty", etc...

    God Bless!
     
  10. Neogaia777

    I think it would still just be a "numbers calculation"...

    And it wouldn't be able to feel the "harm"...

    Unless you are suggesting that we program it to react, or retaliate, to certain kinds of threats or damage done to it in certain situations?

    That sounds almost like the beginning of programming some kind of "morality" into it, maybe?

    What would you suggest as acceptable or unacceptable levels of harm or threat? And how should the A.I. either, A: react to harm being done, or B: try to avoid it being done?

    I don't think it would be "true self-preservation" though, because that often has to come from a "feeling", fear or anger for one, which it just wouldn't be capable of; or at least I hope not, especially not the anger part.

    And how do you know a cockroach doesn't fear death?

    Many other animals do.

    If you cast a shadow over it with your foot or shoe to squash it, or expose it from its hiding place, does it not try to run and hide elsewhere as quickly as it possibly can, maybe out of fear?

    A machine is just not capable of that.

    Anyway,

    God Bless!
     
  11. sjastro

    sjastro Newbie

    +1,830
    Christian
    Single
    There is no ambiguity about it.
    Learning to play chess goes beyond knowing how the pieces move and the rules of the game.
    That was the part programmed into AlphaZero; the tactical and strategic side of chess was self-taught.
    This is how AI operates via machine learning.

    If you don't believe me, here is the peer-reviewed paper.
    For conventional computer chess programs, improvements have come from better programming and faster multi-core CPUs.
    With machine-learned programs, the effect of hardware is not as straightforward.
    The self-taught program Leela Chess Zero is in fact much stronger on one or two GPUs than on far more powerful multi-core CPUs, as its neural networks run best on platforms such as NVIDIA GPUs.
    The recently developed neural-network architecture known as NNUE now allows neural networks to run effectively on CPUs, resulting in a massive increase in strength for conventional programs.

    The cold hard reality is that while conventional computer chess programs have been better than the best human players for at least ten years, the self-taught programs are now so far ahead that they play at an evolutionary level beyond the very best humans.
    Grandmasters themselves contribute to this POV.
    Where humans may possibly be still competitive is in correspondence chess with no time limits.
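    The "rules programmed in, strategy self-taught" split can be illustrated with a toy. The sketch below is emphatically not AlphaZero (which pairs a deep neural network with Monte Carlo tree search); it is a minimal tabular self-play learner on subtraction Nim (take 1 or 2 stones; whoever takes the last stone wins), with every name invented. Only the rules are hard-coded; the worth of each position is estimated purely from the outcomes of self-play games:

```python
import random

# Only the RULES of the game are hard-coded below; how good each
# position is gets estimated purely from self-play games.

random.seed(0)
returns = {}  # position -> (sum of outcomes, visit count), for the player to move

def value(n):
    if n == 0:
        return -1.0  # no stones left: the player to move already lost
    s, c = returns.get(n, (0.0, 0))
    return s / c if c else 0.0

def pick_move(n, eps):
    moves = [m for m in (1, 2) if m <= n]
    if random.random() < eps:
        return random.choice(moves)                 # explore
    return min(moves, key=lambda m: value(n - m))   # leave opponent the worst position

def self_play(start=10, episodes=4000, eps=0.2):
    for _ in range(episodes):
        n, trail = start, []
        while n > 0:
            trail.append(n)
            n -= pick_move(n, eps)
        outcome = 1.0  # the mover at trail[-1] took the last stone and won
        for pos in reversed(trail):
            s, c = returns.get(pos, (0.0, 0))
            returns[pos] = (s + outcome, c + 1)
            outcome = -outcome

self_play()
# The learned values recover the game's known theory: positions that are
# multiples of 3 are lost for the player to move, so the greedy policy
# always leaves the opponent on a multiple of 3.
print(pick_move(10, eps=0.0), pick_move(5, eps=0.0), pick_move(4, eps=0.0))
```

    After training, the greedy policy rediscovers the known theory of this game (always leave your opponent a multiple of three stones) without that strategy ever having been written down, which is the sense in which "self-taught" is meant above.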
     
  12. Neogaia777

    Just did a Google search on "can machine learning ever lead to emotional feeling", "can machine learning ever lead to emotional awareness", and "can machine learning ever lead to true self-awareness", among other phrasings, and got some interesting results...

    God Bless!
     
    Last edited: May 10, 2021
  13. Neogaia777

    The most general consensus seems to be that they can mimic emotions, and can learn to appear to act or react to our emotions in an emotional way, maybe in some ways even better than most human beings can or do, but that they do not actually have or experience emotions, no matter how complex they are. At least, not right now, anyway.

    And it may not be something that either we or they are capable of programming, in part because we still don't fully know why we or the animals feel, or how we come to have and experience true feelings and emotions.

    Science still can't fully explain that.

    Not even in a cockroach yet.

    Anyway,

    God Bless!
     
  14. Neogaia777

    Have any of you played the fairly recent (2018) video game "Detroit: Become Human"?

    It's a very interesting game, all about androids (A.I.s) becoming truly sentient, and the way that begins to happen to them is also very interesting.

    Has anyone on here played it?
     
  15. Neogaia777

    We are bio-chemical-electrical "machines", and the way that biology, chemistry, and electrical energy work together may be what gives us the ability to "feel", to have feelings and experience different kinds of emotions and emotional states of... well... "mind", I guess, maybe? Though that might not be entirely accurate, because it affects our whole being. Take our biology and chemistry: we experience different emotions and emotional states because they give us a chemical "high", just like drugs do; and how many people actually do drugs just to "feel"?

    Many of us are quite literally "addicted" to our emotions, especially if we keep feeding the same kinds of emotions over and over again, because those connections in the brain grow stronger and more numerous, and the stronger they become, the harder they are to break. We can go through some pretty serious withdrawal symptoms if we are not feeding them on a regular basis, and that is just what many people do.

    But is any of this possible in a mechanical machine? To make it feel, you would quite literally have to design it to act as if it were addicted to a drug, a drug that also gives it a "high", and then design it to experience withdrawal, or certain lows, in that drug's absence. The problem is, electro-mechanical machines can't "feel", because they lack the biological and chemical components that seem to be required for it.

    So they can never, ever truly "feel".

    I just know I got tired of the roller-coaster ride eventually. I used to have an addiction to drugs, used them to "feel", and I don't any longer; now it almost scares me to feel at all, partly because I know how it works now, and partly because I eventually sought stability, balance, and an even keel in my emotions, off the world's roller-coaster ride. I don't even watch much TV anymore because of the way it manipulates us, and I'm very careful with all my entertainment and everything I choose to expose myself to now, including people in social environments or social circles; they're like great ferocious beasts to me now, predatory carnivores that eat and devour one another, and it scares me, and I want no part of it.

    But, anyway, back to topic...

    I think potential A.I.s are capable of a great deal of "mechanical intelligence", but I also think that without the ability to feel or experience any emotions at all they are very limited, which should, in my view, make us think twice about just how much power, authority, or control we give them in the future.

    Anyway,

    God Bless!
     
  16. Gene2memE

    Sure it would. It's just a different kind of 'feeling' than humans, or other creatures, have. You're being rather anthropocentric here.

    Self preservation responses to external stimuli aren't programming in "morality". They're solutions to ensure the survival of an autonomous AI. If X occurs, then Y.

    As for reacting or retaliating, there's a huge difference in terms of a moral dimension there. The second also requires a massive leap in terms of capabilities from the first. There's also huge differences when you consider the source of the harm.

    For instance, there are already automated delivery drones that use radar and lidar and 'sense and avoid' programming to detect and respond to things that may damage them. The drone is reacting on its own to avoid things that might damage it.

    Are you suggesting that if someone starts chucking rocks at a drone, onboard (or network) AI should have an option to retaliate?

    Acceptable levels of harm depend on the parameters of what the AI is attempting to accomplish. Or is being used to accomplish.

    If you're sending an AI controlled Boston Dynamics 'Big Dog' into a structure fire to look for possible injured/trapped/unconscious people, the threshold of acceptable 'harm' is probably greater than if you're using it to cart around firewood.

    You're right.

    I don't know what the neurological threshold for consciousness is, so I can't actually argue that a cockroach is not conscious. I'd point out, though, that your stomach has about 500 million neurons in it, roughly 500 times the number of neurons in the brain of a cockroach.

    You might be surprised what a machine is capable of. For instance, drone detect and avoid systems are already way better than humans at picking up possible aerial collision causes and maneuvering out of the way.

    Iris Automation | Sense and Avoid: How it Works in Unmanned Aerial Vehicles

    If a machine can pick up on a threat, and then maneuver to avoid it and go about its business, is it not responding to ensure its own survival, just as the cockroach is? Regardless of whether it has 'fear' or not?
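    Sense-and-avoid behaviour of this kind reduces to geometry plus a rule: project the detected object's track, compute the closest point of approach (CPA), and divert if it falls inside a safety radius. A toy 2-D Python sketch (the numbers and the safety radius are invented; real systems like the one linked above fuse radar/lidar tracks in three dimensions):

```python
import math

def cpa_distance(rel_pos, rel_vel):
    """Closest distance between drone and object, given relative position (m)
    and relative velocity (m/s), assuming both hold straight-line courses."""
    vx, vy = rel_vel
    speed2 = vx * vx + vy * vy
    if speed2 == 0.0:
        return math.hypot(*rel_pos)  # no relative motion: separation is constant
    # Time at which separation is minimal (clamped: receding objects ignored).
    t = max(0.0, -(rel_pos[0] * vx + rel_pos[1] * vy) / speed2)
    return math.hypot(rel_pos[0] + vx * t, rel_pos[1] + vy * t)

SAFETY_RADIUS_M = 15.0  # illustrative threshold

def decide(rel_pos, rel_vel):
    return "divert" if cpa_distance(rel_pos, rel_vel) < SAFETY_RADIUS_M else "hold_course"

print(decide((100.0, 5.0), (-20.0, 0.0)))   # closing nearly head-on: divert
print(decide((100.0, 60.0), (-20.0, 0.0)))  # will pass 60 m away: hold_course
```

    Whether or not that counts as the machine "caring", it is functionally the same move the cockroach makes: detect a threat, predict it, get out of the way.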
     
  17. timewerx

    Emotion doesn't make you eat, unless you're a "depressed eater" or something.

    Hunger makes you eat, and hunger is a signal that can be felt in different parts of the body.

    Man-made programs also interact via signals. You could certainly do the same thing for self-preservation, triggering it when the ability to function normally is threatened.
     
  18. timewerx

    A program that self-terminates on a whim, or doesn't care about its survival, won't be useful. It could become non-functional long before its life expectancy, putting much of its procurement cost to waste, and nobody (human or otherwise) would like that.
     
  19. durangodawood

    Pretty much all software created right now doesn't care about self-preservation, yet people still make it. I elect to keep and update certain applications for years. Decades now, actually.

    Also, society may well exert an ethical push on creators to limit self-interest in AI. I imagine some would conform while others wouldn't.
     
  20. J_B_

    Not at all. Your comment makes you sound unfamiliar with competitive sports/games.

    Computers will always have an edge over humans in terms of computational speed. If that edge is a significant factor in the game played, computers will always win. But computational speed is not (IMO) an indicator of intelligence - artificial or otherwise. That was my point.

    I proposed a hypothesis. If speed were eliminated as a factor by changing the rules, humans could beat computers under those new rules. It happens all the time in competition. Either the rules change to make things "fair" (something constantly happening in professional sports), or different leagues are created to allow everyone to compete (NCAA divisions I, II, III ... men's & women's basketball, etc.).

    Now, if you want to argue computational speed is an aspect of intelligence, then you would actually be addressing my post rather than just booing from the cheap seats.
     