Hi Colin,
I thought you'd react in this way. It is a prediction of computationalism that
running certain lines of code should generate pain (and every other type of
experience). I realise it seems absurd when put like this, but there you have it.
I very much doubt that a superficial or top-down copy of an organism in pain
(which would be very easy to build) would actually experience pain. But a
bottom-up copy, an emulation of the individual neurons which resulted in
behaviour similar to the original organism's, is another matter: I find it as
difficult to imagine such a being not being conscious as I do a fellow organic
human not being conscious. But I certainly don't expect the "pain code" for
such a being to be anything like what you have indicated below.
If you believe that computer emulation of neural tissue behaviour will fail, at which
step do you think it will fail? The action potential, the cytoskeleton, the effect of the
neurotransmitters at the synapses, or where?
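To make the question concrete: the action potential, at least, is routinely computed. Below is a minimal sketch of the kind of bottom-up emulation at issue, a leaky integrate-and-fire neuron stepped with Euler integration. The model, parameter values, and function names are illustrative assumptions, not anyone's actual proposal; a serious emulation would use something like Hodgkin-Huxley dynamics and real measured parameters.

```python
# Hypothetical sketch: leaky integrate-and-fire neuron, the simplest
# stand-in for "emulating the action potential". Membrane voltage obeys
#   dV/dt = (-(V - V_rest) + R*I) / tau
# with a spike-and-reset rule when V crosses threshold.

V_REST, V_THRESH, V_RESET = -65.0, -50.0, -65.0  # membrane voltages (mV)
TAU, R = 10.0, 10.0   # time constant (ms), input resistance (assumed units)
DT = 0.1              # Euler integration step (ms)

def simulate(current, steps):
    """Integrate the membrane equation; return the list of spike times (ms)."""
    v, spikes = V_REST, []
    for step in range(steps):
        dv = (-(v - V_REST) + R * current) / TAU
        v += dv * DT
        if v >= V_THRESH:        # "action potential": record spike, reset
            spikes.append(step * DT)
            v = V_RESET
    return spikes

# A strong constant input drives regular spiking; a weak one stays
# subthreshold, so the emulated neuron stays silent.
print(len(simulate(2.0, 1000)) > 0)   # True
print(len(simulate(1.0, 1000)) == 0)  # True
```

Nothing in this loop is physically mysterious, which is why the question of exactly which step of such an emulation is supposed to fail seems worth pressing.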
Also, you never explained if Marvin + machine is a zombie or not a zombie.
Stathis Papaioannou
----------------------------------------
> Date: Thu, 14 Dec 2006 07:35:05 +1100
> From: c.hales.domain.name.hidden
> Subject: RE: computer pain
> To: everything-list.domain.name.hidden
>
>
> Hi Stathis/Jamie et al.
> I've been busy elsewhere in self-preservation mode ... deleting emails
> madly ... frustrating, with so many threads left hanging ... oh well ... but
> I couldn't go past this particular dialogue.
>
> I am having trouble believing that you actually hold the below to be the
> case! Lines of code that experience pain? Upon what law of physics is that
> based?
>
> Which one hurts more:
>
> if (INPUT A) >= '1' then {
> OUTPUT "OUCH!"
> }
> or
> if (INPUT A) >= '1' then {
> OUTPUT "OUCH!""OUCH!""OUCH!""OUCH!"
> }
> or
> if (INPUT A) >= '10' then {
> OUTPUT "OUCH!""OUCH!""OUCH!""OUCH!"
> }
>
> Also: in a distributed application ... if I put the program on Earth, the
> input on Mars and the CPU on the Moon, which bit actually does the hurting,
> and when? It's still a program, still running - functionally the same
> (time delays, I know - not quite the same ... but you get the idea).
>
> The idea is predicated on the proven non-existence of a physical mechanism
> for experience - that it somehow equates with the manipulation of abstract
> symbols as information rather than the fabric of reality as information, and
> that pretending to be a neuron necessarily results in everything that a
> neuron participates in as a chunk of matter.
>
> It also completely ignores the ROLE of the experiences. There's a reason
> for them. Unless you know the role you cannot assume that the software
> model will inherit that role. With no role, why bother with it? I don't
> have to put "OUCH!""OUCH!""OUCH!" in the above.
>
> What you are talking about is 'strong AI'; its functionalist
> assumptions need to be carefully considered.
>
> Another issue: If a life-like artefact visibly behaves like it is in agony
> the only thing actually getting hurt are the humans watching it, who have
> real experiences and empathy based on real qualia. It might be OK if it
> were play. But otherwise? hmmmm.
>
> cheers,
>
> colin
>
> >
> >
> > Jamie,
> >
> > I basically agree with your appraisal of the differences
> > between living brains and digital computers. However, it
> > should be possible for a general purpose computer to
> > emulate the behaviour of a biological system in
> > software. After all, biological systems are just
> > comprised of matter following the laws of
> > physics, which are well understood and deterministic
> > at the size scales of interest.
>
> > When it comes to neural tissue, the emulation should be
> > able to replace the original provided that it is run on
> > sufficiently fast hardware and has appropriate
> > interfaces for input and output.
> >
> > While it would be extremely difficult to emulate a
> > particular human brain (as in "mind uploading"), it should
> > be easier to emulate a simplified generic brain, and easier
> > again to emulate a single simplified perceptual function,
> > such as pain. This means that it should be possible to store
> > on a hard disk lines of code which, when
> > run on a PC, will result in the program experiencing pain;
> > perhaps excruciating pain beyond what
> > humans can imagine, if certain parameters in the program
> > are appropriately chosen. What might a simple example of
> > such code look like? Should we try to determine what
> > the painful programs are as a matter of urgency,
> > in order to avoid using them in
> > subroutines in other programs?
> >
> > Stathis Papaioannou
> >
You received this message because you are subscribed to the Google Groups "Everything List" group.
Received on Thu Dec 14 2006 - 08:02:52 PST