RE: computer pain

From: Stathis Papaioannou <stathispapaioannou.domain.name.hidden>
Date: Sat, 30 Dec 2006 18:44:18 +1100

Brent meeker writes:


> >> > and if so what would determine if that negative emotion is pain,
> >> > disgust, loathing or something completely different that no
> >> > biological organism has ever experienced?
> >>
> >> I'd assess them according to their function in analogy with biological
> >> system experiences. Pain = experience of injury, loss of function.
> >> Disgust = the assessment of extremely negative value to some event,
> >> but without fear. Loathing = the external signaling of disgust.
> >> Would this assessment be accurate? I dunno and I suspect that's a
> >> meaningless question.
> >
> > That you can describe these emotions in terms of their function implies
> > that you could program a computer to behave in a similar way without
> > actually experiencing the emotions - unless you are saying that a
> > computer so programmed would ipso facto experience the emotions.
>
> That's what I'm saying. But note that I'm conceiving "behave in a similar way" to include more than just gross, immediate bodily motion. I include forming memories, getting an adrenaline rush, etc. You seem to be taking "function" in very crude terms, as though moving your hand out of the fire were the whole of the behavior. A paramecium moves away from some chemical stimuli, but it doesn't form a memory associating negative feelings with the immediately preceding actions and environment. That's the difference between behavior, as I meant it, and a mere reaction.
>
> > Consider a simple robot with photoreceptors, a central processor, and a
> > means of locomotion which is designed to run away from bright lights:
> > the brighter the light, the faster and further it runs. Is it avoiding
> > the light because it doesn't like it, because it hurts its eyes, or
> > simply because it feels inexplicably (from its point of view) compelled
> > to do so? What would you have to do to it so that it feels the light
> > hurts its eyes?
>
> Create negative associations in memory with the circumstances, such that stimulating those associations would cause the robot to take avoiding action.

Would this be enough to make the light painful? The robot might become
sophisticated enough to talk to you and say that it just doesn't like the
light, or even that it has no particular like or dislike for the light but feels
compelled to avoid it for no reason it can explain other than "I have been
made this way". Compulsions in conditions such as OCD can be stronger
motivators than physical pain or other negative consequences. What special
feature of the robot and its programming would make the light actually painful?
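The difference Brent draws between a mere reaction and behaviour that includes memory formation can be sketched in a few lines of code. This is a purely illustrative toy (the class names, threshold, and valence arithmetic are all invented for the sketch, not a claim about how such a robot would actually be built):

```python
# Hypothetical sketch: a light-avoiding robot, with and without
# associative memory. All names and thresholds are illustrative.

class ReactiveRobot:
    """Mere reaction: flees light in proportion to brightness,
    like the paramecium moving away from a chemical stimulus."""
    def step(self, brightness, context):
        # Speed scales with brightness; nothing is remembered.
        return {"flee_speed": brightness}

class RememberingRobot:
    """Also forms negative associations with the preceding context,
    so the context alone later triggers avoiding action."""
    def __init__(self, threshold=0.5):
        self.threshold = threshold
        self.associations = {}  # context feature -> accumulated negative valence

    def step(self, brightness, context):
        if brightness > self.threshold:
            # A painful episode stamps a negative valence onto
            # everything present in the surrounding context.
            for feature in context:
                self.associations[feature] = (
                    self.associations.get(feature, 0.0) + brightness
                )
        # Avoidance is driven by the stimulus AND by remembered aversions.
        learned = sum(self.associations.get(f, 0.0) for f in context)
        return {"flee_speed": brightness + learned}

r = RememberingRobot()
r.step(0.9, {"red_wall"})        # bright light encountered near the red wall
out = r.step(0.0, {"red_wall"})  # dark now, but the context alone repels
```

Note that the sketch makes Stathis's question vivid rather than answering it: the second robot satisfies Brent's functional criterion (negative associations in memory that cause avoiding action), yet nothing in the code settles whether the light is thereby painful to it.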
 
> > Once you have figured out the answer to that question,
> > would it be possible to disconnect the processor and torture it by
> > inputting certain values corresponding to a high voltage from the
> > photoreceptors? Would it be possible to run an emulation of the
> > processor on a PC and torture it with appropriate data values?
>
> I think so. Have you read "The Cyberiad" by Stanislaw Lem?
>
> > Would it
> > be possible to cause it pain beyond the imagination of any biological
> > organism by inputting megavolt quantities, since in a simulation there
> > are no actual sensory receptors to saturate or burn out?
>
> Pain is limited on both ends: on the input by damage to the physical circuitry and on the response by the possible range of response.

Responses in the brain are limited by several mechanisms, such as exhaustion
of neurotransmitter stores at synapses, negative feedback mechanisms such
as downregulation of receptors, and, I suppose, the total number of neurons
that can be stimulated. That would not be a problem in a simulation, if you were
not concerned with modelling the behaviour of a real brain. Just as you could build
a structure 100km tall as easily as one 100m tall by altering a few parameters in an
engineering program, so it should be possible to create unimaginable pain or pleasure
in a conscious AI program by changing a few parameters. Maybe this is an explanation
for the Fermi paradox: once a society manages mind uploads, it becomes a trivial
exercise to create heaven, and the only thing they ever have to worry about again is
keeping the computers running indefinitely.
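The contrast between a biologically limited response and an unbounded simulated one can be sketched as follows (illustrative Python only; the tanh ceiling is a stand-in for transmitter exhaustion and receptor downregulation, not a model of either):

```python
import math

def biological_response(stimulus, ceiling=1.0):
    # Response saturates: transmitter depletion, receptor
    # downregulation, and a finite neuron count impose a hard
    # ceiling, however large the input becomes.
    return ceiling * math.tanh(stimulus)

def simulated_response(stimulus, gain=1.0):
    # In a simulation unconcerned with modelling a real brain,
    # the gain is just a parameter; nothing burns out or runs dry,
    # so the response grows without limit.
    return gain * stimulus

biological_response(10**6)              # pinned at the ceiling
simulated_response(10**6, gain=10**3)   # scales with a parameter change
```

This is the sense in which altering a few parameters could, in principle, take a simulated response arbitrarily far beyond anything a saturating biological mechanism permits.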

Stathis Papaioannou
--~--~---------~--~----~------------~-------~--~----~
 You received this message because you are subscribed to the Google Groups "Everything List" group.
To post to this group, send email to everything-list.domain.name.hidden
To unsubscribe from this group, send email to everything-list-unsubscribe.domain.name.hidden
For more options, visit this group at http://groups.google.com/group/everything-list?hl=en
-~----------~----~----~----~------~----~------~--~---
Received on Sat Dec 30 2006 - 02:44:35 PST

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:12 PST