Re: more torture

From: Stathis Papaioannou <stathispapaioannou.domain.name.hidden>
Date: Wed, 15 Jun 2005 00:01:51 +1000

Hal Finney writes:

>Let us consider these flavors of altruism in the case of Stathis' puzzle:
>
> > You are one of 10 copies who are being tortured. The copies are all
> > being run in lockstep with each other, as would occur if 10 identical
> > computers were running 10 identical sentient programs. Assume that the
> > torture is so bad that death is preferable, and so bad that escaping
> > it with your life is only marginally preferable to escaping it by
> > dying (e.g., given the option of a 50% chance of dying or a 49% chance
> > of escaping the torture and living, you would take the 50%). The
> > torture will continue for a year, but you are allowed one of 3 choices
> > as to how things will proceed:
> >
> > (a) 9 of the 10 copies will be chosen at random and painlessly
> > killed, while the remaining copy will continue to be tortured.
> >
> > (b) For one minute, the torture will cease and the number of copies
> > will increase to 10^100. Once the minute is up, the number of copies
> > will be reduced to 10 again and the torture will resume as before.
> >
> > (c) the torture will be stopped for 8 randomly chosen copies, and
> > continue for the other 2.
> >
> > Which would you choose?
>
>For the averagist, doing (a) will not change average happiness. Doing
>(b) will improve it, but not that much. The echoes of the torture and
>anticipation of future torture will make that one minute of respite
>not particularly pleasant. Doing (c) would seem to be the best choice,
>as 8 out of the 10 avoid a year of torture. (I'm not sure why Stathis
>seemed to say that the people would not want to escape their torture,
>given that it was so bad. That doesn't seem right to me; the worse it
>is, the more they would want to escape it.)
>
>For the totalist, since death is preferable to the torture, each
>person's life has a negative impact on total happiness. Hence (a)
>would be an improvement as it removes these negatives from the universe.
>Doing (b) is unclear: during that one minute, would the 10^100 copies
>kill themselves if possible? If so, their existence is negative and
>so doing (b) would make the universe much worse due to the addition
>of so many negatively happy OMs. Doing (c) would seem to be better,
>assuming that the 8 out of 10 would eventually find that their lives
>were positive during that year without torture.
>
>So it appears that each one would choose (c), although they would differ
>about whether (a) is an improvement over the status quo.
>
>(b) is deprecated because that one minute will not be pleasant due to
>the echoes of the torture. If the person could have his memory wiped
>for that one minute and neither remember nor anticipate future torture,
>that would make (b) the best choice for both kinds of altruists. Adding
>10^100 pleasant observer-moments would increase both total and average
>happiness and would more than compensate for a year of suffering for 10
>people. 10^100 is a really enormous number.

This analysis would be fine were it not for the fact that we are discussing
*exact copies* running in lockstep with each other. You have to take into
account the special way observers construct their identity as a unique
individual persisting through time, which you admitted in a recent post "is
a purely contingent, artificial, manufactured set of beliefs and attitudes
which have been programmed into us in order to help our genes survive." With
choice (a), although it seems like a good idea to end the suffering of 9 of
the 10 copies, it doesn't make the slightest bit of difference. In order to
end a person's suffering at a particular observer moment, you have to either
ensure that there will be no successor OMs ever again (i.e., death), or
provide a successor OM which does not involve suffering. As long as at least
one copy remains alive, that copy will always provide a successor OM for any
of the other copies which are killed. Subjectively, it will be impossible
for any of the copies to notice that anything has changed when they are
killed. This reasoning applies whether you consider the selfish interests of
one of the copies or the altruistic interests of all of them.
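
To make the point concrete, here is a toy sketch in Python (a sketch
only: representing an OM as a (time, state) pair is my own
simplification, not anything you proposed):

    # The copies run in lockstep, so an OM is a program state, not a
    # particular machine. Killing 9 of the 10 machines therefore
    # removes nothing from the set of OMs that get experienced.

    def experienced_oms(n_copies, timeline):
        """Distinct OMs produced by n_copies identical programs running
        the same timeline in lockstep. n_copies deliberately plays no
        role in the result: that is the whole point."""
        return {(t, state) for t, state in enumerate(timeline)}

    timeline = ["torture"] * 12  # twelve months of the year, schematically

    before = experienced_oms(10, timeline)  # all ten copies alive
    after = experienced_oms(1, timeline)    # choice (a): nine killed
    assert before == after

This does not prove the argument, of course; it just shows where a
disagreement would have to lie, namely in whether an OM belongs to a
machine or to a state.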

You might argue, as you have with your example of increased measure on
alternate days of the week, that it is still better to try to reduce the
total number of unpleasant experiences in the world, even if we cannot see
any change that may result. Perhaps that would be OK, all else being equal.
However, I provided choice (c) to show how this sort of reasoning can lead
to unfortunate outcomes. In (c), unlike (a), alternative successor OMs to
the torture exist. The result is that at the moment the choice is made, each
copy is looking at a 20% chance that the torture will continue and an 80%
chance that it will stop. At first glance, this doesn't look quite as good
as choice (a), if you follow the "try to reduce the number of unpleasant
OMs in the world" rule. But as shown above, it would be a terrible mistake
to choose (a), as you would be ensuring that the torture will continue.
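
The arithmetic of that last point can be written out explicitly (again
a sketch; counting each candidate successor OM as equally likely is my
assumption about how the first-person probabilities work):

    # Subjective chance that the next OM is a tortured one, treating
    # every candidate successor OM as equally likely.

    def p_torture_continues(tortured_successors, free_successors):
        return tortured_successors / (tortured_successors + free_successors)

    # Choice (a): the survivor's tortured OMs are the only successors
    # on offer, so subjectively the torture continues with certainty.
    print(p_torture_continues(1, 0))  # 1.0

    # Choice (c): 2 copies still tortured, 8 set free.
    print(p_torture_continues(2, 8))  # 0.2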

Now consider choice (b). Let's assume, as you suggest, that the minute of
respite choice (b) provides is not spoilt by the knowledge that it is
bookended by torture. Certainly, if you were to randomly pull one OM out of
the set of all possible OMs, the probability that it would involve torture
is now, for practical purposes, zero. But what does this mean for the poor
wretches populating my thought experiment? I suggest that it means nothing
at all. At the first person level there is no possible test, observation or
line of reasoning which could give the copies any clue as to whether they
are one of 10 copies, one of 10^100, or unique. If all the copies kept a
diary, you would
find that each of them tells exactly the same story, with a single narrator
and a single narrative, and each considers himself to be one individual from
start to finish. If you asked them what the best choice out of (a), (b) and
(c) is, they would all pick (c), because from their point of view (a)
results in continuous torture, (b) in continuous torture with only a
minute's respite, and (c) gives them an 80% chance of ending the torture for
good. Moreover, as each of them wants (c) as the selfish choice, it must
also be
the altruistic choice.
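
The diary point is easy to simulate (a toy stand-in, obviously: 10^100
instances won't fit anywhere, so take 10 and 1000 as proxies for "very
many"):

    # Run the same deterministic diary-writing program n times and
    # collect the distinct diaries produced.

    def write_diary(days=3):
        return tuple(f"Day {day}: I am one person, same as yesterday."
                     for day in range(days))

    for n in (1, 10, 1000):
        diaries = {write_diary() for _ in range(n)}
        # Every instance writes the identical diary, so no copy can
        # infer n from its first-person record.
        assert len(diaries) == 1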

Finally, is the averagist/totalist distinction of relatively recent vintage?
Its application to left/right wing politics is something I had never thought
of before, and it's a rare thing when you come across a completely new way
of looking at something you have been familiar with for years.

--Stathis Papaioannou

Received on Tue Jun 14 2005 - 10:07:14 PDT
