Re: computer pain

From: Brent Meeker <meekerdb.domain.name.hidden>
Date: Fri, 29 Dec 2006 11:32:08 -0800

Stathis Papaioannou wrote:
>
> Brent Meeker writes:
>
>> > Do you not think it is possible to exercise judgement with just a
>> > hierarchy of motivation?
>> Yes and no. It is possible given arbitrarily long time and other
>> resources to work out the consequences, or at least a best estimate of
>> the consequences, of actions. But in real situations the resources
>> are limited (e.g. my brain power) and so decisions have to be made
>> under uncertainty and tradeoffs of uncertain risks are necessary:
>> should I keep researching or does that risk being too late with my
>> decision? So it is at this level that we encounter conflicting
>> values. If we could work everything out to our own satisfaction maybe
>> we could be satisfied with whatever decision we reached - but life is
>> short and calculation is long.
>
> You don't need to figure out the consequences of everything. You can
> replace the emotions/values with a positive or negative number (or some
> more complex formula where the numbers vary according to the situation,
> new learning, a bit of randomness thrown in to make it all more
> interesting, etc.) and come up with the same behaviour with the only
> motivation being to maximise the one variable.

I think you're taking "behavior" in a crude, coarse-grained sense. But I thought when you wrote "with just a hierarchy of motivation" you meant without emotions like "regret", "worry", etc. I think those emotions arise because in the course of estimating the value of different courses of action there is uncertainty and there is a horizon problem. They may not show up in the choice of immediate action, but they will be in memory and may well show up in subsequent behavior.
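
(Purely as an illustration, and not anything from your post: the "maximise one variable" scheme might be sketched in a few lines of Python. The action names, the scores, and the learned adjustment are invented, and note that it only covers the immediate choice of action, not the memory that a regret-like state would leave behind.)

    import random

    def score(action, situation):
        # One number stands in for the emotions/values: a base value per
        # action, a learned adjustment from the situation, and a bit of
        # randomness to make it interesting.
        base = {"approach": 1.0, "ignore": 0.0, "flee": -0.5}[action]
        return base + situation.get(action, 0.0) + random.uniform(-0.1, 0.1)

    def choose(situation):
        # The only "motivation" is to pick the action with the highest score.
        return max(["approach", "ignore", "flee"],
                   key=lambda a: score(a, situation))

    print(choose({"flee": 2.0}))   # a learned aversion makes "flee" win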

>
>> > Alternatively, do you think a hierarchy of motivation will
>> > automatically result in emotions?
>> I think motivations are emotions.
>>
>> > For example, would something that the AI is strongly motivated to
>> > avoid necessarily cause it a negative emotion,
>> Generally contemplating something you are motivated to avoid - like
>> your own death - is accompanied by negative feelings. The exception
>> is when you contemplate your narrow escape. That is a real high!
>>
>> > and if so what would determine if that negative emotion is pain,
>> > disgust, loathing or something completely different that no
>> > biological organism has ever experienced?
>>
>> I'd assess them according to their function in analogy with biological
>> system experiences. Pain = experience of injury, loss of function.
>> Disgust = the assessment of extremely negative value to some event,
>> but without fear. Loathing = the external signaling of disgust.
>> Would this assessment be accurate? I dunno and I suspect that's a
>> meaningless question.
>
> That you can describe these emotions in terms of their function implies
> that you could program a computer to behave in a similar way without
> actually experiencing the emotions - unless you are saying that a
> computer so programmed would ipso facto experience the emotions.

That's what I'm saying. But note that I'm conceiving "behave in a similar way" to include more than just gross, immediate bodily motion. I include forming memories, getting an adrenaline rush, etc. You seem to be taking "function" in very crude terms, as though moving your hand out of the fire were the whole of the behavior. A paramecium moves away from some chemical stimuli, but it doesn't form a memory associating negative feelings with the immediately preceding actions and environment. That's the difference between behavior, as I meant it, and a mere reaction.

> Consider a simple robot with photoreceptors, a central processor, and a
> means of locomotion which is designed to run away from bright lights:
> the brighter the light, the faster and further it runs. Is it avoiding
> the light because it doesn't like it, because it hurts its eyes, or
> simply because it feels inexplicably (from its point of view) compelled
> to do so? What would you have to do to it so that it feels the light
> hurts its eyes?

Create negative associations in memory with the circumstances, such that stimulating those associations would cause the robot to take avoiding action.
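
(Again just a sketch with invented names and thresholds, not a claim about how it must be done: when the robot is hurt it records the surrounding circumstances, and later merely recognising those circumstances is enough to trigger avoiding action.)

    class Robot:
        PAIN_THRESHOLD = 0.8   # brightness above this counts as injury

        def __init__(self):
            self.associations = {}   # circumstance -> accumulated negative value

        def step(self, circumstance, brightness):
            if brightness > self.PAIN_THRESHOLD:
                # Form (or strengthen) a negative association with the
                # circumstances that accompanied the damaging input.
                self.associations[circumstance] = (
                    self.associations.get(circumstance, 0.0) - brightness)
                return "flee"
            if self.associations.get(circumstance, 0.0) < 0:
                # The stimulus itself is harmless; the remembered
                # association alone causes the avoiding action.
                return "avoid"
            return "explore"

    r = Robot()
    print(r.step("red room", 0.9))   # flee, and remember "red room"
    print(r.step("red room", 0.1))   # avoid, purely from the association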

> Once you have figured out the answer to that question,
> would it be possible to disconnect the processor and torture it by
> inputting certain values corresponding to a high voltage from the
> photoreceptors? Would it be possible to run an emulation of the
> processor on a PC and torture it with appropriate data values?

I think so. Have you read "The Cyberiad" by Stanislaw Lem?

> Would it
> be possible to cause it pain beyond the imagination of any biological
> organism by inputting megavolt quantities, since in a simulation there
> are no actual sensory receptors to saturate or burn out?

Pain is limited on both ends: on the input side by damage to the physical circuitry, and on the output side by the possible range of responses.

Brent Meeker

Received on Fri Dec 29 2006 - 14:40:33 PST
