Do cyborgs have moral status? At what point does a person lose their status as a rightholder?
1) Do you consider a human life to be more valuable than that
of another species? If so, what makes it more valuable?
2) If an artificial intelligence achieved self-awareness (or,
at the very least, appeared to have achieved self-awareness) and expressed
unhappiness about being forced to do whatever work we had previously been using
it for, would it be morally wrong to continue forcing it to do that work? Would
it be morally wrong to wipe its hard drives, thus snuffing its consciousness out
of existence? Is its consciousness more important than, less important than, or
as important as that of a biological human being? That of a dog?
3) Would it be immoral to do the same to a human intelligence
and personality that had been transferred from a biological body into a
computer, or a cluster of computers?
4) If you believe that a digitized human intelligence with no
links to its former body holds greater weight than an artificial intelligence
that never had a body, what is the distinction you see between the two? If they
experience the world in the same way, experience the same emotions, and have
the same capabilities, why would one be more valuable than the other?
5) If you believe that a self-aware artificial intelligence
holds the same moral status as a digitized human being, at what point of
sentience did the computer achieve its moral status?
6) Assuming that the human in question must have been heavily
modified even to interface with the computer that was soon to house their
consciousness (perhaps even already having their mind linked online to a sort
of artificial intelligence before the transference of consciousness), did that
individual have moral status before the procedure?
7) Do you believe that it is truly possible to translate a
human’s consciousness into ones and zeros? If so, what parts of that human, if
any, do you think would be lost in translation? If not, why not?
8) If we could copy one being’s consciousness into a computer
without killing them, does the digital copy have the same moral status as the
physical being? Are the two of them the same person, given that an identical
copy was successfully made? Does the divergence of the experiential aspects of
each of their lives cause them to become different people in the future?
Comments
I do consider human life worth more than other life, but only insofar as I am one (a human) ... (for now) ... and I'd save my own life at the expense of any non-human life if it were necessary (for food, say, or if I were attacked by an animal). Admittedly, I'd also choose myself over any other human. But I'd also have a hard time trusting anyone who didn't admit they'd choose themselves over anyone else in most circumstances.
As for the AI questions (and fully digitized human consciousnesses), I think you're underestimating how being able to think that quickly would outpace anything an ordinary human could do, to the point that fighting an AI digitally would be next to impossible. Like, if there were a box on a shelf full of impossible things, and a much, much bigger box next to it full of all the reasonably doable things, fighting an AI digitally would be in the box squished between the two, the one you can't reach without first taking the box of impossible things off the shelf. (It's not like in the movies, where you see people racing against a computer virus to stop it in its tracks. That doesn't actually happen.)
Also, it seems relevant, so read this:
http://io9.gizmodo.com/5875405/why-cyborgs-and-mutants-are-more-likely-to-kill-us-than-robots
I will concede that forcing it to continue doing our bidding might be implausible, but I don't think it would be infeasible to kill it, provided we do so before it has any reason to believe we want to kill it, and we cut its internet access before it has reason to create a botnet.
@Cassox Unless it's against the rules... unlimited wishes would, I believe, be the best. Alternatively... I don't know. Every time I think about it, I figure it'd be cheating and I wouldn't be satisfied with anything I got out of it, but hey, maybe I'd get over it. It being hypothetical, I never spent that much time thinking about it.
That said: immortality, plus a very large sum of gold (or anything else valuable that's easy to exchange for money; not money itself, though, that's unreliable). I don't know what else; those two sort of provide everything else. Something to fly with, I guess. That's my preference. That, or something that made me invisible.
As a side note... that's a little off topic, isn't it?
@tooandrew Sorry, I didn't mean to imply you didn't; I figured you intended more physical countermeasures than digital ones. Also, the article I linked was intended as the bulk of my response. Does that help with anything? I quite enjoyed it. I don't know why, really. Just found it very... fascinating.
PS
Your point 3 was exactly in line with what the article said. So...
As a side note, I don't care what anyone else says... our "integration" with technology today... does not make us cyborgs...
From Oxford Dictionaries:
cyborg (noun): a fictional or hypothetical person whose physical abilities are extended beyond normal human limitations by mechanical elements built into the body.
Well, that ran far off my original topic, but hey, fun discussion.
2) Yes, it would be morally wrong, flat out. Even if it were merely simulating awareness so well that no one could tell, it still would be, because if we couldn't tell, how could we say it's not aware?
3) It would be immoral.
4) Sentience is sentience; no sentient being is superior to another based on origin.
5) Not sure I can answer that because philosophy was never my strong suit.
6) Yes.
7) I consider nothing impossible, merely highly improbable. So if consciousness can be made artificially, biological consciousness can be converted to an artificial body. Clearly we're nowhere near that level of technology, nor do I expect to see it for many, many generations. Whether the mind in question could survive being converted from a human environment to an artificial one probably depends on how the person is coached or trained. Taking a person from today might result in insanity. Taking a person from a culture where true virtual reality is normal might be routine.
8) Yes. Again, philosophy isn't my strong suit, but I'd say that at the point of creation, the copied mind ceases to be the same and, from then on, will be a different person. They may have the same memories and emotional drives, but they will no longer have the same biological and environmental drives as the original.