"Thou shalt not make a machine in the likeness of a man's mind."
The Bicentennial Man - especially the novel - really does turn this conundrum on its head. For those who haven't read it, it's about a robot that, through faulty programming, gets a will of his own (all robots in Asimov's novels are already sentient). He gets his own hobbies, a profession, owns his own property, etc., but longs to be human. He becomes an expert at prosthetics and cybernetics and eventually builds himself a completely biological body. At that point, there are many humans who are more artificial than he is.
By Gottservant's definition, he would at that point have to be counted as human. Of course, in the novel, he isn't - it isn't until he gives himself a finite lifespan, by making his brain slowly destroy itself, that the government formally declares him human, and the first bicentennial man - hence the title.
I think the novel aptly points out the problems with declaring that artificial beings are different. If we can create an artificial being, and if we can replace parts of ourselves with artificial prosthetics, then where do we draw the line? At what point does a human with prosthetics become a robot, and what makes a human different from a biological robot? Its origins? That's a meaningless distinction. Its soul? Before asserting that, we must show that the soul exists in the first place - otherwise that distinction is meaningless, too. So what, then?
*shakes head sadly* Not really what I was saying. If you did create a machine exactly like a man's mind, it would be some kind of intelligent machine, but not a robot. A robot implies control from outside the robot and a lack of ability to think for itself.
*shakes head sadly*
Are you telling me no one got the Dune reference?
On what basis could we justify killing them?
and they had artificial intelligence, and showed emotions like humans, but were in fact made up of circuits and wires (like the terminator), is there any reason why we shouldn't treat them the same as normal human beings?
This post is 100% rambling, so be forewarned.
.....
Of course all of this is only a problem if you believe that it is possible for robots to be somehow human (eventually) in the first place. I suppose such a definition would have to be mainly behavioral, since that is the only thing that we can test with robots and have it line up with humanity. But if the robots we call human are the ones that act (sufficiently) like humans, I have to ask why we would want to make robots like that in the first place. They certainly wouldn't be better at the jobs for which we currently like to use robots. And if we really do think that such robots should be treated as humans we lose one of the major benefits of robots: it's not a big deal putting them in dangerous situations. So what I mean is that you certainly wouldn't want a human bomb squad robot, even if you could do it.
and they had artificial intelligence, and showed emotions like humans, but were in fact made up of circuits and wires (like the terminator), is there any reason why we shouldn't treat them the same as normal human beings?
Doesn't that depend a lot on just what a robot can do? We use a lot of farm machinery; something that can actually make judgement calls could do this a lot better. For the bomb squad and other dangerous work, there is a solution that raises even more questions. Think of a computer-descended robot. Is the personality in the memories? For the mechanical man, these could be copied. A robot in bomb disposal could make a backup before going out. Then 'death' just loses the last day.
BTW, there has long been a test for this: the Turing test. I wonder what percentage of humans would pass it, however.
Good point. Why bother giving a robot a personality? All it needs to do is its job.
You know, sort of playing the devil's advocate here... The Bible teaches that "the life is in the blood."
Robots = no blood = no life = no equality.
(I'm speaking briefly because I don't want to slow any discussion down).
Doesn't follow. Consider, in simple terms, the choice to believe, or to do good vs. evil:
For an atheist, humans have complicated brains, and their choices are governed by the mechanics of those brains and how they develop.
For the religious, humans have free choice regardless of the complexity of the brain.
So for a religious person, the creator of the robot is responsible for its actions, and for the atheist, the robot is responsible for its actions.
If we create them in our own image, then they are human. Without soul, but still human.
Would we take responsibility for their sentience?