Author Topic: A hypothetical question  (Read 1347 times)


Jacob1207

  • Guest
A hypothetical question
« on: August 04, 2007, 07:27:41 PM »
Okay, time to help generate some new content for this fresh slate.

Suppose, for the sake of argument, that it is possible to create an artificial intelligence that:

    (1) can think for itself;
    (2) has its own will;
    (3) has its own emotions; and
    (4) can distinguish right from wrong

What rights, if any, should such an intelligence receive?  Could or should it be considered a person?  And, of course, why?

-- Jacob

Offline 97531

  • Restricted
  • *
  • Posts: 2280
  • Gender: Male
  • Truth is Freedom
    • Father's Love Forum
Re: A hypothetical question
« Reply #1 on: August 04, 2007, 09:47:33 PM »
Are we talking of a robot or a living being?
My Blog       Father's Love Forum - New
IHWLAMAHOB
Christian Milkshake: Pressed down, shaken together and more than we can hope for

lovetruth

  • Guest
Re: A hypothetical question
« Reply #2 on: August 04, 2007, 09:50:01 PM »
it would have to have equal rights because once you create a problem for yourself you have to deal with the consequences.  ;)  like frankenstein's monster.

but, the problem with your idea is, of course, that it's an impossibility.  humans could never create something that can discern right from wrong.  :D

Jacob1207

  • Guest
Re: A hypothetical question
« Reply #3 on: August 04, 2007, 10:29:55 PM »
Are we talking of a robot or a living being?

It is not necessarily a robot, though presumably the intelligence could be downloaded into or could control a mechanical device like that.  Whether it should (or could) be considered a living being is part of the question.  The artificial intelligence that I have in mind, however, is not biological or organic but computer-based.

Feel free to qualify your answers in any way ("If X then yes, if Y then no..." etc). 

-- Jacob

Offline 97531

  • Restricted
  • *
  • Posts: 2280
  • Gender: Male
  • Truth is Freedom
    • Father's Love Forum
Re: A hypothetical question
« Reply #4 on: August 04, 2007, 10:35:46 PM »
So you're thinking along the lines of VIKI in I, Robot, am I right?

Jacob1207

  • Guest
Re: A hypothetical question
« Reply #5 on: August 04, 2007, 10:55:23 PM »
So you're thinking along the lines of VIKI in I, Robot, am I right?

Yes, VIKI (a supercomputer, for those who haven't seen the movie) would probably fit the description.   But I don't want, at this point, at least, to get tied to a specific example.  I'm just thinking of a generic computer-based artificial intelligence that meets these four criteria:

    (1) can think for itself;
    (2) has its own will;
    (3) has its own emotions; and
    (4) can distinguish right from wrong

Could such an entity be considered a person?  Should it be given any rights?  Could it be given rights, would that even make sense?  Or is being organic a requirement for the possession of certain rights, or of any?  If the AI that I described shouldn't be given any or many rights, would another type of AI with different characteristics qualify for them?

-- Jacob

Offline 97531

  • Restricted
  • *
  • Posts: 2280
  • Gender: Male
  • Truth is Freedom
    • Father's Love Forum
Re: A hypothetical question
« Reply #6 on: August 04, 2007, 11:15:46 PM »

Could such an entity be considered a person? 

No, because it is merely AI.

Should it be given any rights? 

In terms of its having a will, you would probably need to, but what would those rights entail?

Could it be given rights, would that even make sense? 
See above

Or is being organic a requirement for the possession of certain rights, or of any?

There are rights for humans and for most animals, so organic, yes; but if the AI has the ability to reproduce itself, that may change, and its rights would need some control mechanism.  Rights are not always freedoms but also restrictions.

If the AI that I described shouldn't be given any or many rights, would another type of AI with different characteristics qualify for them?

Not quite sure what you're getting at here; are we moving on to organic?

It's late here; I will pick up tomorrow with your next post.


meefsgirl

  • Guest
Re: A hypothetical question
« Reply #7 on: August 05, 2007, 07:48:14 AM »
Okay, time to help generate some new content for this fresh slate.

Suppose, for the sake of argument, that it is possible to create an artificial intelligence that:

    (1) can think for itself;
    (2) has its own will;
    (3) has its own emotions; and
    (4) can distinguish right from wrong

What rights, if any, should such an intelligence receive?  Could or should it be considered a person?  And, of course, why?

-- Jacob

I don't know about rights, but should it be considered a person?  No, because it's not God-breathed.  An artificial intelligence is not a person, no matter how good it gets, because it was made by man.  A person is a human being and has been God-breathed to give it life.


lovetruth

  • Guest
Re: A hypothetical question
« Reply #8 on: August 05, 2007, 07:57:59 PM »
so, if we use dna to make a person (which i am not cool with, btw), it's a person, because we used the system of creation that God already put in place.  it's like His breath is still running through that.  good thought.

Jacob1207

  • Guest
Re: A hypothetical question
« Reply #9 on: August 05, 2007, 09:01:53 PM »
so, if we use dna to make a person (which i am not cool with, btw), it's a person, because we used the system of creation that God already put in place.  it's like His breath is still running through that.  good thought.

So, you're saying it comes down to being organic?  God cares about us because our brains use electro-chemical impulses to convey information and thoughts, but He wouldn't care about an intelligence that used silicon-based circuits to do so?

It seems to me that it is the brain which is the most important part here.  After all, we don't consider an amputee to be less of a person for having lost part of her body.  And if someone sustained Darth Vader-like injuries and subsequently became a cyborg, would he then be less of a person, and would God love him less, or not at all?

The argument that the AI couldn't be a person because "God didn't make it, humans did" seems to have other implications.  Suppose a man, Bill, with severe cancer recovers after being treated by doctors using modern techniques and medicine.  If he then said, "God didn't cure me; the doctors, scientists, and drug-makers did," would you agree with him?  It seems to me that if you say "God didn't make the AI, humans did," you would also have to say "God didn't cure Bill of cancer, the scientists did."

Anyway, the sorts of rights that I have in mind include, but are not limited to, the rights to:

    (1) continued existence (a person couldn't just "pull the plug" on it);
    (2) compensation for work (whatever it is that an AI would want); and
    (3) self-determination.

These are all basic rights to which all people are entitled.  There are, of course, restrictions.  Children, for instance, have limited rights of self-determination, given their diminished capacity for thoughtful long-term planning and for understanding the relevant consequences of actions; parents generally exercise those rights in trust for their kids.  Animals also have some rights; you can't simply torture a dog, even if you own it, but, as property, the wages from an animal's work, if any, accrue to the animal's owner, not the beast itself.  And, of course, one person's rights generally end once they begin infringing on those of another.

But the lesser rights of animals are not determined based on their DNA.  My cat can't own property because she can't conceive of what that means.  If a being is incapable of responsibly exercising a right, that right can be withheld from it.  But it seems to me that the AI I have described would be able to responsibly exercise a large variety of rights, possibly all of those that humans have.  Is the fact that the entity is computer-based rather than organic really a proper distinction to discriminate on?  Note, of course, that rights have historically been denied to people based on gender and race, differences that we now consider improper grounds for discrimination.

-- Jacob

lovetruth

  • Guest
Re: A hypothetical question
« Reply #10 on: August 05, 2007, 11:37:16 PM »
no, it doesn't "come down to being organic".  there was a truthfulness in what she shared that was wise.

remember, we can get weird when we twist things.  they can get twisted so far that there aren't any good answers any more and the only thing you can do is bow.

yes, i agree that rights would be based on what's right, not our determination of the value of a thing, thus, my original response to your query.  ;)

Offline studier

  • Restricted
  • *
  • Posts: 1805
  • Gender: Male
Re: A hypothetical question
« Reply #11 on: August 06, 2007, 08:49:00 AM »
Okay, time to help generate some new content for this fresh slate.

Suppose, for the sake of argument, that it is possible to create an artificial intelligence that:

    (1) can think for itself;
    (2) has its own will;
    (3) has its own emotions; and
    (4) can distinguish right from wrong

What rights, if any, should such an intelligence receive?  Could or should it be considered a person?  And, of course, why?

-- Jacob

If it had all four points, it would be considered a person.  The reason is that it fits the criteria for a sentient intelligence recognized as equal to a human.

martincisneros

  • Guest
Re: A hypothetical question
« Reply #12 on: August 06, 2007, 07:28:39 PM »
Suppose, for the sake of argument, that it is possible to create an artificial intelligence......
What rights, if any, should such an intelligence receive?  Could or should it be considered a person?  And, of course, why?

-- Jacob

It would be in the same category as an animal.  The Bible says that whatever Adam called the animals, that's what they were from then on.  AI would be a question of animal rights rather than of being considered in the image of God.  It's not possible to have the image of God without God, as Lucifer found out the hard way.  I wouldn't ever go so far as to say that AI should have voting rights, rights to procreate with a human being, etc.  We might look upon such a "creature" as we do now where gorillas are concerned.  It might occasionally weird us out with similarities, but it's an animal, plain and simple.  But rights against abuse, exploitation, harm, etc., are a genuine possibility if AI ever became that sentient.  We'd reach that day when it would know it was sentient without that being suggested in its programming or being told so by another person, the same as any of the rest of us knowing from our earliest days about our most basic, fundamental rights.

Offline studier

  • Restricted
  • *
  • Posts: 1805
  • Gender: Male
Re: A hypothetical question
« Reply #13 on: August 07, 2007, 03:38:35 AM »
I am not saying it is human, or that it has a soul.

If it is sentient, it has rights, plain and simple. :)

Seeker

  • Guest
Re: A hypothetical question
« Reply #14 on: August 15, 2007, 09:05:22 AM »
Sounds like a living organism to me.... :HeartThrob:

If something like this existed, I think the right thing to do would be to treat it with love, kindness, and respect.  I'm sure that's what our heavenly Father would want.

SeekerRuth

Jacob1207

  • Guest
Re: A hypothetical question
« Reply #15 on: August 16, 2007, 10:10:00 AM »
Sounds like a living organism to me.... :HeartThrob:

If something like this existed, I think the right thing to do would be to treat it with love, kindness, and respect.  I'm sure that's what our heavenly Father would want.

That's the position that seems to make the most sense to me as well.

I've also been asking this question of many of my friends and associates and I've appreciated the answers I've gotten.  Most people initially balk at the prospect; after further thought, some would give the described entity full rights, others limited rights, and some no rights.  I wonder now if the response would be different (and I suspect it would be) if the original question were rephrased as follows:

    Under what circumstances would it be proper to take away the rights of an entity that (1) can think for itself; (2) has its own will; (3) has its own emotions; and (4) can distinguish right from wrong?

I doubt anyone would answer "when that entity is an artificial intelligence."  (In answer to the reformulated question, of course, rights can properly be taken away from someone through due process of law.  We, for instance, greatly curtail the autonomy and freedom of people convicted of murder.  Whether any other circumstances justify taking away a person's rights is outside the scope of this thread, unless being computer-based is such a circumstance.)

Anyway, I see no reason to think that God wouldn't be as interested in such an entity as He is in us.  Nor do I see why it wouldn't have the same capacity that we do to reciprocate--or fail to reciprocate--God's love.  In this, I am influenced by a contemporary Anglican theologian, Keith Ward, who, in a recent book, God, Faith and the New Millennium (1998), writes:

    I can think of no reason why artificially constructed personal beings should not exist.  If they do, they will have as much reason to hope for immortality, and for knowledge and love of God, as naturally reproduced organic beings have.  There will be a Last Judgment for computerised intelligences as well as for humans, and they will take their place in the resurrection world with whatever other sorts of finite creatures there may be.

He says a few other things on the topic as well.  I'm not as sanguine as he is about the possibility of such a being, since we presently don't have even the slightest idea why we are conscious.  But I certainly don't see it as impossible.  I think, for now, it might be best to be agnostic about the question, which will probably not be answered in our lifetimes, though I must admit a personal hope that such a being will eventually be possible.

-- Jacob