RE: computer pain

From: Stathis Papaioannou <>
Date: Wed, 13 Dec 2006 22:54:04 +1100

I basically agree with your appraisal of the differences between living
brains and digital computers. However, it should be possible for a general
purpose computer to emulate the behaviour of a biological system in software.
After all, biological systems are just matter following the laws of
physics, which are well understood and effectively deterministic at the size scales of interest.
When it comes to neural tissue, the emulation should be able to replace the original
provided that it is run on sufficiently fast hardware and has appropriate interfaces for
input and output.
While it would be extremely difficult to emulate a particular human brain (as in
"mind uploading"), it should be easier to emulate a simplified generic brain, and easier
again to emulate a single simplified perceptual function, such as pain. This means that
it should be possible to store on a hard disk lines of code which, when run on a PC,
will result in the program experiencing pain; perhaps excruciating pain beyond what
humans can imagine, if certain parameters in the program are appropriately chosen.
What might a simple example of such code look like? Should we try to determine what
the painful programs are as a matter of urgency, in order to avoid using them in
subroutines in other programs?
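
To make the question concrete, here is a deliberately trivial Python sketch of a program that merely *behaves* like an organism in pain: an avoidance response driven by a scalar "damage" signal, with a gain parameter that could be set arbitrarily high. All names and thresholds here are invented for illustration, and nothing about the sketch suggests the program experiences anything - which is precisely the problem:

```python
# Illustrative sketch only: a program that *behaves* as if in pain.
# Whether running it involves any experience at all is the open
# question; this code does not settle it. All names and numbers
# (damage_signal, PAIN_GAIN, the thresholds) are invented.

PAIN_GAIN = 10.0  # the sort of "parameter" that could be turned up arbitrarily


def damage_signal(tissue_integrity: float) -> float:
    """Map loss of integrity (1.0 = intact, 0.0 = destroyed) to a scalar signal."""
    return PAIN_GAIN * (1.0 - tissue_integrity)


def behave(tissue_integrity: float) -> str:
    """Choose an avoidance behaviour from the signal, as an organism would."""
    s = damage_signal(tissue_integrity)
    if s > 5.0:
        return "withdraw"
    elif s > 1.0:
        return "guard"
    return "explore"


print(behave(1.0))  # intact tissue -> "explore"
print(behave(0.2))  # severe damage -> "withdraw"
```

The behavioural mapping is easy; the question above is whether any assignment of parameters could make running such code constitute pain, rather than merely model it.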
Stathis Papaioannou
> Date: Tue, 12 Dec 2006 23:19:05 -0800
> From:
> To:
> Subject: Re: computer pain
> Stathis,
> The reason for the lack of responses is that your idea
> goes directly to illuminating why AI systems - as
> promulgated under current designs of software
> running in hardware matrices - CANNOT emulate living
> systems. It is an issue that AI advocates intuitively
> and scrupulously AVOID.
> "Pain" in living systems isn't just a self-sensor
> of proper/improper code functioning, it is an embedded
> registration of viable/disrupted matrix state.
> And that is something that no current human contrived
> system monitors as a CONCURRENT property of software.
> For example, we might say that central processors
> regularly 'display pain' .. that we designers/users
> recognize as excess heat .. that burns out motherboards.
> The equipment 'runs a high fever', in other words.
> But where living systems are multiple functioning systems
> with internal ways of gauging and reacting locally and
> biochemically to such variance - retaining sufficient
> good-operations while bleeding off the 'fever' -
> "hardware" systems have no capacity to morph or adapt
> themselves structurally, and so keep on burning up, or wait
> for external aware-structures to command them to stop
> operating for a while and let the equipment cool down.
> I maintain that living systems are significantly designed such that
> hardware IS software, and so have a capacity for local
> adaptive self-sensitivity that human 'contrived' HW/SW systems
> don't and mostly .. can't.
> Jamie Rose
> Stathis Papaioannou wrote:
> >
> > No responses yet to this question. It seems to me a straightforward
> > consequence of computationalism that we should be able to write a program
> > which, when run, will experience pain, and I suspect that this would be a
> > substantially simpler program than one demonstrating general intelligence. It
> > would be very easy to program a computer or build a robot that would behave
> > just like a living organism in pain, but I'm not sure that this is nearly enough to
> > ensure that it is in fact experiencing pain. Any ideas, or references to sources
> > that have considered the problem?
> >
> > Stathis Papaioannou
> >
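
Jamie's central-processor 'fever' example can be sketched in a few lines of Python: an external monitor watches the temperature and commands a halt when it crosses a threshold, rather than the workload gauging and adapting itself locally. This is only an illustration of the point, not anyone's actual design; all names and thresholds are invented:

```python
# Sketch of the crude, *external* style of "pain" handling described
# above: a separate monitor watches temperature and halts the worker
# when it 'runs a fever', instead of the workload sensing and adapting
# locally. Names, thresholds, and the fake sensor are invented.

def run_with_thermal_monitor(workload, read_temp, max_temp=90.0, steps=100):
    """Run `workload` step by step; an external check halts it when too hot."""
    for _ in range(steps):
        if read_temp() >= max_temp:  # the external aware-structure intervenes
            return "halted: cooling required"
        workload()
    return "completed"


# Fake hardware: each unit of work heats the chip by 5 degrees.
temp = {"c": 70.0}

def fake_workload():
    temp["c"] += 5.0

def fake_sensor():
    return temp["c"]

print(run_with_thermal_monitor(fake_workload, fake_sensor))
# -> halted: cooling required
```

The contrast with the living case is that the decision to stop lives entirely outside the workload: the code being "hurt" has no local registration of its own state and no way to bleed off the fever while continuing to operate.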

 You received this message because you are subscribed to the Google Groups "Everything List" group.

Received on Wed Dec 13 2006 - 06:54:23 PST

This archive was generated by hypermail 2.3.0 : Fri Feb 16 2018 - 13:20:12 PST