Re: MGA 2

From: Kory Heath <kory.domain.name.hidden>
Date: Thu, 27 Nov 2008 13:36:08 -0800

On Nov 26, 2008, at 5:29 AM, Stathis Papaioannou wrote:
> Yes. Suppose one of the components in my computer is defective but,
> with incredible luck, is outputting the appropriate signals due to
> thermal noise. Would it then make sense to say that the computer isn't
> "really" running Firefox, but only pretending to do so, reproducing
> the Firefox behaviour but lacking the special Firefox
> qualia-equivalent?

It seems to me that this reasoning creates just as serious a problem
for your perspective as it does for mine. Suppose we physically remove
the defective component from the computer, but, with incredible luck,
the surrounding components continue to act as though they were
receiving the signals they would have received. Your experience of
using Firefox remains the same, so (by your argument above) it
shouldn't make sense to say that the computer isn't "really" running
Firefox. But we can keep removing components until all that's left is
a monitor that, with incredible luck due to thermal noise, is
displaying the pixels that would have been displayed if your computer
were actually functioning, complete with a mouse-pointer that (very
improbably!) happens to move when you move your mouse, etc.

This is, of course, just a recapitulation of the argument we've
already been considering - the slide from Fully-Functional Alice to
Lucky Alice to Empty-Headed Alice. I have an intuition that causality
(or its logical equivalent in Platonia) is somehow important for
consciousness. You argue that the slide from Fully-Functional
Alice to Lucky Alice (or Fully-Functional Firefox to Lucky Firefox)
indicates that there's something wrong with this idea. However, you
have an intuition that order is somehow important for consciousness.
(Without trying to beg the question, I might use the term "mere
order" to indicate that, for you, it doesn't matter whether the
blinking bits in some hypothetical 2D array were generated by, say,
a random process; it just matters that they display the requisite
order.) But the slide from Lucky Alice to Empty-Headed Alice
is just as problematic for that view as the slide from Fully-
Functional Alice to Lucky Alice is for mine.

My point isn't that your intuition must be incorrect. My point is that
the above argument fails to show me why your "mere order" intuition is
more correct than my "real order" intuition, since the argument is
equally destructive to both intuitions. Instead of giving up your
intuition, you make a move to Platonia. But in that new context, I
think it still makes sense to ask if "mere order" (for instance, in
the binary digits of PI) is enough for consciousness, and the Alice /
Firefox thought experiments don't help me answer that question.

> If by "Unification" you mean the idea that two identical brains with
> identical input will result in only one consciousness, I don't see how
> this solves the conceptual problem of partial zombies. What would
> happen if an identical part of both brains were replaced with a
> non-conscious but otherwise identically functioning equivalent?

I was referring to the idea that my Conway's Life version of Bruno's
MGA 2 may only present a problem for Duplicationists. If one believes
that physically re-performing all of the Conway's Life computations
would create a second experience of pain (assuming that there's a
creature in there with that description), and if you *don't* believe
that the act of playing the movie back creates a second experience of
pain, then you have a partial zombie problem. But if you accept
Unification, the problem might go away (although I'm unsure of this).
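
To make that distinction concrete, here is a toy sketch in Python
(the framing and the names are mine, purely for illustration):
"re-performing the computation" means each new frame is produced by
applying the Life rule to the previous one, whereas "playing the
movie back" means reading off frames that were stored earlier, with
no rule doing any work. The frames come out identical either way;
only the causal story differs.

    from collections import Counter

    def life_step(grid):
        """Apply Conway's Life rule to a set of live cells {(x, y), ...}."""
        neighbour_counts = Counter(
            (x + dx, y + dy)
            for (x, y) in grid
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        return {
            cell
            for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in grid)
        }

    def run_computation(initial, steps):
        """'Re-performing the computation': every frame is made by the rule."""
        frames, grid = [initial], initial
        for _ in range(steps):
            grid = life_step(grid)
            frames.append(grid)
        return frames

    def replay_movie(recorded_frames):
        """'Playing the movie back': the same frames, but nothing computes them."""
        for frame in recorded_frames:
            yield frame

    # A glider computed for four steps, then replayed from the recording:
    # the two sequences of frames are indistinguishable.
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    recording = run_computation(glider, 4)
    assert list(replay_movie(recording)) == recording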

I still feel like I don't have a handle on how you think the move to
Platonia solves these problems. If we imagine the mathematical
description of filling a 3D grid with the binary digits of PI,
somewhere within it we will find some patterns of bits that look as
though they're following the rules of Conway's Life. If we see
creatures in there, would they be conscious? What about the areas in
that grid where we find the equivalent of Empty-Headed Alice, where
most of the cells seem to be "following the rules" of Conway's Life,
but the section where a creature's "visual cortex" ought to be is just
filled with zeros? In other words, why doesn't the "partial zombie"
problem still exist for us in Platonia?
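
In case it helps pin down what I mean by a pattern "following the
rules", here is a toy check in Python (the slice-by-slice layout and
the names are my own invention, just for illustration): treat
consecutive 2D slices of the bit-filled grid as successive
generations and ask, cell by cell, whether each slice is what the
Life rule would have produced from the one before it. A region with
no violations "looks as though it's following the rules"; an
Empty-Headed patch would show violations exactly where the bits sit
at zero no matter what the rule demands of them.

    def live_neighbours(slice_, x, y):
        """Count live neighbours of (x, y); a slice maps (x, y) -> 0 or 1."""
        return sum(
            slice_.get((x + dx, y + dy), 0)
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )

    def expected_cell(slice_, x, y):
        """The value the Life rule dictates for (x, y) in the next slice."""
        n = live_neighbours(slice_, x, y)
        return 1 if n == 3 or (n == 2 and slice_.get((x, y), 0)) else 0

    def rule_violations(prev_slice, next_slice, region):
        """Cells in `region` whose value in next_slice breaks the Life rule."""
        return {
            (x, y)
            for (x, y) in region
            if next_slice.get((x, y), 0) != expected_cell(prev_slice, x, y)
        }

    # Example: a blinker and its correct next generation pass the check.
    blinker_t0 = {(0, 1): 1, (1, 1): 1, (2, 1): 1}
    blinker_t1 = {(1, 0): 1, (1, 1): 1, (1, 2): 1}
    region = {(x, y) for x in range(-1, 4) for y in range(-1, 4)}
    assert rule_violations(blinker_t0, blinker_t1, region) == set()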

-- Kory



Received on Thu Nov 27 2008 - 16:36:25 PST
