Re: QTI & euthanasia (brouillon)

From: Jason Resch <jasonresch.domain.name.hidden>
Date: Tue, 4 Nov 2008 16:21:58 -0600

Bruno,
Thanks, I understand now. I must have misread previous posts of yours,
because I had thought you had said "if I = the world, then the world is not
Turing-emulable", whereas what you are actually saying is "if I = the world,
and the world is not Turing-emulable, then comp is false". Regarding step 6,
I believe one's consciousness continues if one were to "upload" one's brain
into a computer, even if that necessitated the destruction of the biological
brain. To me this is logically no different from teleportation, although I
agree with Brent: if the simulated world in the computer is entirely cut off
from causal effects of the physical world where the computer is running,
then you have also created an entirely new world/reality.

Jason

On Mon, Nov 3, 2008 at 1:37 PM, Bruno Marchal <marchal.domain.name.hidden> wrote:

>
> On 03 Nov 2008, at 18:10, Jason Resch wrote:
>
> On Mon, Nov 3, 2008 at 5:22 AM, Bruno Marchal <marchal.domain.name.hidden> wrote:
>
>>
>>
>> To accept this I have to assume "I = the world", and that world is not
>> turing-emulable. But then comp is false.
>>
>>
>>
> Bruno,
>
> I have seen you say this many times, but I still don't understand why it is
> so; perhaps I don't know how you are defining "I" or "world". I was hoping
> you could point me to a paper of yours or a past post which explains this.
> In particular, I do not follow why only one of "I" or "the world" can be
> computable; why not both? Does the UDA not enumerate all possible worlds
> and all possible Is?
>
>
>
> Jason, People,
>
> Well, I apologize: I sent the draft (brouillon) of my answer to Brent to
> the list by mistake. I intended to send it to my home computer so that I
> could make corrections first. But it's okay.
>
>
> Here I was recalling the definition of the "generalized brain": the portion
> of the universe that you have to emulate digitally in order to survive a
> comp teleportation.
> Some people indeed want to make consciousness supervene on the brain + some
> context (the world), and see that as an objection to the UDA. But that is
> why I put such context in the "generalized brain", and the argument still
> goes through, unless that generalized brain is supposed to be not
> Turing-emulable.
> The thought experiment per se is harder to perform (how to put the moon in
> the teleportation box, for those who take the moon as part of their
> context-brain!), but when the DU is introduced we see that the "bigness" of
> whatever is taken as context is not relevant, as long as it is computable.
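>
> For concreteness, here is a toy sketch, in Python, of the kind of
> dovetailing the DU performs (the generator machinery and the names are only
> illustrative, not the actual UD): it interleaves the execution of every
> program in an enumeration, so that each program, however "big" its
> computation, eventually receives arbitrarily many steps of execution.
>
>     # Toy dovetailer: interleave the execution of all programs in an
>     # enumeration, giving each started program one more step per phase.
>     def dovetail(programs):
>         # programs: a function mapping an index i to a fresh generator,
>         # which here plays the role of "program number i"
>         running = []
>         phase = 0
>         while True:
>             running.append((phase, programs(phase)))  # start program `phase`
>             for i, prog in running:                   # one more step for each
>                 try:
>                     yield (i, next(prog))
>                 except StopIteration:
>                     pass                              # program i has halted
>             phase += 1
>
> By phase n, every program with index up to n has been started and keeps
> receiving further steps, which is why the size of a computable context does
> not matter: the dovetailer reaches it anyway.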
> COMP assumes (by definition) that such a digital, relevant descriptive
> portion of the universe exists, so if you put the moon or the entire cosmos
> into the definition of your brain, we are still under the comp assumption.
> Unless, of course, the moon or the context or the world is assumed to be
> non-Turing-emulable. In that case comp is false, because you are saying that
>
> - my real generalized brain (by definition, the thing on which my
> consciousness supervenes here and now) is equal to my organic brain in my
> skull + my body + the moon + the cat, and then you add
> - and my cat is not Turing-emulable,
>
> then of course comp is false: your generalized brain is not Turing-emulable
> (it works only with the non-Turing-emulable cat).
> This is simple logic (any difficulty here can only be explained by my poor
> English or something like that; please tell me if you grasp what I am
> trying to say here. It is not particularly deep).
>
> The point of all this is that we can reason *despite* the fact that we
> cannot define "I" or "the world". Comp is just the bet that the I, the I
> that I feel, can be recovered by some third-person description of "I",
> whatever it is, under the condition that it belongs to the locally
> computable things. Brent seems to claim that he is able to distinguish real
> from virtual reality.
>
> (Note that in a post to Brett Hall, I explain that we *can* do that in a
> relative way, but it takes a long time, we have to survive through it
> first, and it works only statistically. Indeed, quantum mechanics gives,
> from a comp point of view, just such evidence; I mean evidence that we are
> in a "number matrix".)
>
> What do you think of step six? Do you think you die in step six?
> I use the generalized brain explicitly to prevent the move of objecting to
> the derivation with "consciousness supervenes on brain + context".
>
> Brent, what if I send you regularly to Mars by teletransportation,
> assuming you are a faithful customer of my Mars-teleportation company. Yet
> during the year 2007 (but not 2008), due to budget restrictions, I fool you
> and send you to a virtual Mars, and then to the real Mars again in 2008
> (a better year!).
> Do you think this scenario is impossible in practice? If the comp
> substitution level exists, I can fool you for any finite period of time,
> even without intervening directly on your brain memories (I would need a
> big budget for this too, of course).
>
> Sorry if I am unclear; everyone, feel free to ask for any clarifications,
>
> Bruno
>
>
> http://iridia.ulb.ac.be/~marchal/

Received on Tue Nov 04 2008 - 17:22:06 PST
