Do cyborgs have moral status? At what point does a person lose their status as a rightholder?
1) Do you consider a human life to be more valuable than that
of another species? If so, what makes it more valuable?
2) If an artificial intelligence achieved self-awareness (or,
at the very least, appeared to have achieved it), and expressed
unhappiness about being forced to do whatever work we had previously been using
it for, would it be morally wrong to continue forcing it to do that work? Would
it be morally wrong to wipe its hard drives, thus snuffing its consciousness out
of existence? Is its consciousness more important than, less important than, or
as important as that of a biological human being? That of a dog?
3) Would it be immoral to do the same to a human intelligence
and personality that had been transferred from a biological body into a
computer, or a cluster of computers?
4) If you believe that a digitized human intelligence with no
links to its former body carries greater moral weight than an artificial intelligence
that never had a body, what is the distinction you see between the two? If they
experience the world in the same way, experience the same emotions, and have
the same capabilities, why would one be more valuable than the other?
5) If you believe that a self-aware artificial intelligence
holds the same moral status as a digitized human being, at what point in its
development of sentience did the computer acquire that status?
6) Assuming that the human in question must have been heavily
modified in order even to interface with the computer that was soon to house
their consciousness – perhaps already having their mind linked online
to a sort of artificial intelligence before the transference of consciousness –
did that individual have moral status before the procedure?
7) Do you believe that it is truly possible to translate a
human’s consciousness into ones and zeros? If so, what parts of that human, if
any, do you think would be lost in translation? If not, why not?
8) If we could copy one being's consciousness into a computer
without killing them, does the digital copy have the same moral status as the
physical original? Are the two of them the same person, given that an identical
copy was successfully made? Does the subsequent divergence of their experiences
cause them to become different people in the future?