Re: MGA 2

From: Brent Meeker <meekerdb.domain.name.hidden>
Date: Sun, 23 Nov 2008 12:21:29 -0800

Bruno Marchal wrote:
>
> On 23 Nov 2008, at 15:48, Kory Heath wrote:
>
>>
>> On Nov 22, 2008, at 6:52 PM, Stathis Papaioannou wrote:
>>> Which leads again to the problem of partial zombies. What is your
>>> objection to saying that the looked up computation is also conscious?
>>> How would that be inconsistent with observation, or lead to logical
>>> contradiction?
>> I can only answer this in the context of Bostrom's "Duplication" or
>> "Unification" question. Let's say that within our Conway's Life
>> universe, one particular creature feels a lot of pain. After the run
>> is over, if we load the Initial State back into the array and iterate
>> the rules again, is another experience of pain occurring? If you think
>> "yes", you accept Duplication by Bostrom's definition. If you say
>> "no", you accept Unification.
>>
>> Duplication is more intuitive to me, and you might say that my thought
>> experiment is aimed at Duplicationists. In that context, I don't
>> understand why playing back the lookup table as a movie should create
>> another experience of pain. None of the actual Conway's Life
>> computations are being performed. We could just print them out on
>> (very large) pieces of paper and flip them like a book. Is this
>> supposed to generate an experience of pain? What if we just lay out
>> all the pages in a row and move our eyes across them? What if we lay
>> them out randomly and move our eyes across them? And so on. I argue
>> that if running the original computation a second time would create a
>> second experience of pain, we can generate a "partial zombie".
>>
>> Stathis, Brent, and Bruno have all suggested that there is no "partial
>> zombie" problem in my argument. Is that because you all accept
>> Unification? Or am I missing something else?
>
>
> Unification, I would say. But we have to be careful: unification
> becomes duplication or n-plication if the computations diverge. This
> does not change the content of the experience of the person, which
> remains unique, but it can change the relative personal probabilities
> of such content. I wrote once: Y = || ("multiplication" of the future
> secures the past). Third person bifurcation of histories/computations
> = first person differentiation of consciousness. But to go into the
> details here would confront us with the not-so-simple task of defining
> more precisely what a computation is, or what we will count as two
> identical computations in the deployment. Eventually I bypass this
> hard question by asking directly what sound Lobian machines can
> "think" about it ... leading to AUDA (the arithmetical UDA). But
> unification, in Bostrom's sense, is at play, from the first person
> experience. Alice dreamed of the Mushroom only once. But if we wake her
> up by projecting the end of the movie on an operational optical boolean
> graph, simultaneously (or not) in Washington and in Moscow, then,
> although the experience of the dream remains unique, the experience
> of remembering the dream will be multiplied by two: once in
> Moscow, once in Washington.

Why do they count as two instances? Because they supervene on physical
processes that are spatially distinct? That would assume that spacetime is
fundamental. Or is it because you assume that remembering the dream isn't a
distinct process, but must be mixed with other experiences related to the location?
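
To make the distinction under discussion concrete (actually performing the
Conway's Life computation versus merely playing the recorded states back
like a movie), here is a minimal Python sketch. The function names, the
glider pattern, and the four-step run are illustrative assumptions on my
part, not anything specified in the thread.

from collections import Counter

def life_step(grid):
    """Apply Conway's Life rules once to a set of live (x, y) cells."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in grid
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in grid)
    }

def run(initial, steps):
    """Perform the computation: derive each new state from the previous one."""
    states = [initial]
    for _ in range(steps):
        states.append(life_step(states[-1]))
    return states

def replay(recorded):
    """'Play the movie back': read out recorded states, computing nothing."""
    for state in recorded:
        yield state

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
history = run(glider, 4)        # the Life rules are actually applied here
for frame in replay(history):   # here they are not; frames are only read back
    print(sorted(frame))

Both loops yield the same sequence of frames; the question in the thread is
whether the second, which performs no Life computation at all, could
nonetheless instantiate the same experience.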

Brent

