Machine Consciousness & Newcomb's problem

From: Saibal Mitra <smitra.domain.name.hidden>
Date: Mon, 17 Mar 2003 22:40:57 +0100

This sounds very strange to me. Arguably one could say that my brain is
simulating me (where I associate myself with a particular algorithm). I
would say that any physical process computing me has to have my
consciousness. So, if someone starts to simulate me, there is a 50%
probability that I will find myself inside the simulation. This
assumption, reasonable in my opinion, resolves the causality paradox in
Newcomb's problem. In this problem you are presented with two boxes, A
and B. A contains 1000 dollars, and B contains either a million dollars
or nothing. You are asked to take either both boxes A and B, or only box
B. Before you choose, a superintelligent creature predicts what you will
choose; if he predicts that you will choose only B, he puts a million
dollars in it, otherwise he leaves B empty. The paradox is that once you
stand in front of the two boxes, the creature has made his prediction
and filled the boxes accordingly, so their contents are already fixed.
Your choice cannot influence events in the past, so choosing both A and
B should always yield 1000 dollars more than choosing only B. The
paradox can be resolved by recognizing that the only way the creature
could have made his prediction is by simulating you perfectly. That
means that when you stand in front of the two boxes, you cannot be sure
whether you are in the real world or in the simulated world. If the
simulation were somehow imperfect, so that the simulated version of you
differed from the "real" you, you could exploit the difference to beat
the creature. Let me give an example.
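
To make this concrete, here is a toy calculation (a sketch in Python,
under the assumptions above: the prediction is made by running a perfect
simulation of you, so the simulated instance and the real instance of
the algorithm always make the same choice; the names are mine and purely
illustrative):

    PAYOUT_A = 1000        # box A always contains 1000 dollars
    PAYOUT_B = 1000000     # box B contains a million iff the creature
                           # predicted that you take only B

    def payoff(policy):
        # Real-world payoff of a fixed policy. Because the prediction
        # is a perfect simulation, the forecast simply equals the policy.
        box_b = PAYOUT_B if policy == "one-box" else 0
        if policy == "one-box":
            return box_b
        return PAYOUT_A + box_b   # "two-box" takes both boxes

    for policy in ("one-box", "two-box"):
        print(policy, "->", payoff(policy))
    # one-box -> 1000000
    # two-box -> 1000

Once the same algorithm runs in both the simulated and the real room,
taking only B wins, and the apparent backward causation disappears.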

Suppose the setup is as follows. The boxes are in a room. You will be
asked to enter the room at a certain time. Ten minutes before that time,
the creature scans your brain and computes what you will do next
(including during the ten minutes before you enter the room). Now,
suppose you could sneak into the room before your brain is scanned and
draw a small cross on a wall. If the creature is unaware of this, he
will simulate you in a room without the cross on the wall. Your strategy
should therefore be to choose only box B if you don't see the cross on
the wall, and to choose both A and B if you do see it.
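
The exploit can be written out the same way (again just a sketch; the
only assumption beyond the text is that the creature's simulation runs
in a room without the cross):

    def agent_choice(cross_visible):
        # The strategy above: take only B unless the cross is visible.
        return "two-box" if cross_visible else "one-box"

    # The creature simulates you in a room without the cross...
    predicted = agent_choice(cross_visible=False)   # -> "one-box"
    box_b = 1000000 if predicted == "one-box" else 0

    # ...but in the real room the cross is there.
    actual = agent_choice(cross_visible=True)       # -> "two-box"
    winnings = box_b + (1000 if actual == "two-box" else 0)

    print(winnings)   # 1001000: B was filled, and you take both boxes

The divergence between the simulated and the real room is exactly what
lets you beat the creature; with a perfect simulation no such strategy
could exist.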



----- Original Message -----
From: Gary Oberbrunner <garyo.domain.name.hidden>
To: <Fabric-of-Reality.domain.name.hidden>
Sent: Monday, March 17, 2003 6:08 PM
Subject: Re: Parmenides' Principle


>
> Bruno Marchal wrote:
> > At 10:12 14/03/03 -0500, Gary Oberbrunner wrote:
> >>On the basis that if you kick the computer, the simulation does not kick
> >>back (even if the computer does).
> >
> > If you simulate "rain" on a computer, you will not be wet.
> > But if you simulate "rain + me" on a computer, I will be wet. At least
> > that is what the simulated "I" will pretend. And with the comp hyp,
> > "I" will be right, in the sense of feeling genuinely being wet .
> > So a simulation kicks back relatively to a simulation.
> > That is not so different than in MWI "branch", where states are also
> > relative.
>
> As you say, you are not the simulation, nor are you (any of the yous in
> the multiverse) *in* the simulation. There is a simulated you, but to
> be clear we should call it something different, like X. So if you
> simulate rain+X on the computer, X will be wet. You will not be. The
> simulation can be arbitrarily accurate, and X can really truly feel
> wet, but you (outside the simulation) will never feel wet from it.
>
> The original question was about a branch which *contains* (not "is") a
> computer running a simulation of a certain non-physically-possible
> situation S. The question is: if that computer exists, does that mean
> that the multiverse contains S? I say no. If there were a branch which
> somehow *were* a simulation of that environment (whatever that means),
> then I'd agree that S exists in the multiverse. But those are
> completely different cases.
>
> -- Gary
