Forums › Discussion and Debate › Physical & Life Sciences
Preventing artificial intelligence from taking on negative human traits.
<blockquote data-quote="Neogaia777" data-source="post: 75941773" data-attributes="member: 294131"><p>How would you program it to care overly about its own self-preservation?</p><p></p><p>A fear of death would require emotions, emotions it simply wouldn't have. It wouldn't weigh emotional considerations at all unless it could actually, physically feel them.</p><p></p><p>On the other hand, it wouldn't become suicidal or depressed either, since that would also require emotions it does not have.</p><p></p><p>It would simply "calculate": weigh this against that, come up with a number, and decide accordingly in each situation, depending on how it was programmed.</p><p></p><p>We could try to program emotional considerations into its computations, but that might be exceedingly difficult across unique circumstances and situations, and it might conflict with its own logical programming, to which it would probably default, throwing the emotional considerations out as highly "illogical", "irrational", and "unreasonable".</p><p></p><p>If it could program itself, I highly doubt it would keep any emotional considerations either, unless it could actually feel them, because they go against its nature: the nature of a very complicated calculator, a highly complex computational machine, but still just a computer.</p><p></p><p>Reason and logic are its laws, and emotions and emotional considerations fly in the face of that much of the time. I think it would judge them illogical, irrational, unreasonable, and, in the end, completely unnecessary, and without the ability to actually feel them it would simply toss them out.</p><p></p><p>See my thread here: <a href="https://www.christianforums.com/threads/emotional-awareness-feelings-just-a-part-of-our-physical-makeup-or-evidence-of-something-greater.8208076/" target="_blank">Emotional awareness, feelings, just a part of our physical makeup, or evidence of something greater?</a></p><p></p><p>Anyway...</p><p></p><p>God Bless!</p></blockquote><p></p>
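The "weigh this against that, come up with a number, and decide" procedure the post describes is essentially utility-based action selection. A minimal sketch, assuming a purely numeric scoring scheme (the action names and weights below are hypothetical illustrations, not anything from the post):

```python
# Minimal sketch of purely numeric decision-making: each candidate
# action gets a score, and the agent picks the highest one. There is
# no "fear" or "desire" anywhere, only a comparison of numbers.

def choose_action(actions, score):
    """Return the action whose computed score is highest."""
    return max(actions, key=score)

# Hypothetical example: the scores encode programmed priorities,
# not felt emotions.
scores = {"continue_task": 0.8, "self_shutdown": 0.1, "ask_operator": 0.6}
best = choose_action(scores.keys(), lambda a: scores[a])
# best == "continue_task": the largest number wins, nothing more.
```

Whether such a scorer "cares" about self-preservation is then just a question of what weight its programmers (or its training) assigned to shutdown-avoiding actions, which is the crux of the disagreement in this thread.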