Re: Observation selection effects

From: Stathis Papaioannou <stathispapaioannou.domain.name.hidden>
Date: Thu, 07 Oct 2004 23:35:55 +1000

Jesse Mazer wrote:

>>>I don't think that's a good counterargument, because the whole concept of
>>>probability is based on ignorance...
>>
>>
>>No, I don't agree! Probability is based in a sense on ignorance, but you
>>must make full use of such information as you do have.
>
>Of course--I didn't mean it was based *only* on ignorance, you must
>incorporate whatever information you have into your estimate of the
>probability, but no more. Your argument violates the "but no more" rule,
>since it incorporates the knowledge of an observer who has seen how much
>money both envelopes contain, while I only know how much money one envelope
>contains.

Sorry Jesse, I can see in retrospect that I was insulting your intelligence
as a rhetorical ploy, and we shouldn't stoop to that level of debate on this
list.

You say that you "must incorporate whatever information you have, but no
more" in the envelopes/money example. The point I was trying to make with my
envelope/drug example is that you need to take into account the fact that
the amount in each envelope is fixed; but again, you are right that it was
not exactly analogous. You have, however, passed over the final point in my
last post, which I restate here:

(1) The original game: envelopes A and B; you know one has double the amount
of the other, but you don't know which. You open A and find $100. Should you
switch to B, which may contain either $50 or $200?

(2) A variation: everything is the same up to the point where you are
pondering whether to switch to envelope B. Then the millionaire walks in
and, hidden from view, flips a coin to decide whether to replace whatever
was originally in envelope B with double or half the sum in envelope A,
i.e. with either $200 or $50.

Now, which game would you prefer to play, (1) or (2)? They are not the same.
In game (1), if the $100 in A is actually the higher amount and you switch,
you will get $50 for sure; but in game (2), if the $100 is actually the
higher amount, you still have a 50% chance of getting $200 if you switch. It
works in reverse if the $100 is actually the lower amount - in game (2) you
could lose $50 rather than gain $100 - but the possible gain outweighs the
possible loss.
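
To put numbers on it: in game (2), once you have seen the $100, the coin
flip means B ends up holding $200 or $50 with equal probability, so
switching is worth 0.5 x $200 + 0.5 x $50 = $125 on average - a $25
advantage over keeping the $100. No such calculation is available in game
(1), where the two amounts were fixed before you opened anything.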

Look at it another way: game (2) is actually asymmetrical. The amount you
win over many plays will be different if you always switch, because you
really do have more to gain than to lose by switching (and the millionaire
will have to pay out more on average). In game (1), on the other hand, you
can see intuitively that your expected winnings should be the same whether
you switch or not. The paradox comes from reasoning as if you are playing
game (2) when you are really playing game (1).
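
If anyone wants to check this numerically, here is a rough Python sketch of
my own (the $100/$200 pair, fixed in advance as in game (1), and the trial
count are arbitrary choices for illustration). It plays both games many
times and compares the average payout for always keeping versus always
switching:

import random

def game1(trials=100000):
    # Fixed pair: one envelope holds $100, the other $200.
    keep = switch = 0
    for _ in range(trials):
        a, b = random.sample([100, 200], 2)  # open one of the pair at random
        keep += a
        switch += b
    return keep / trials, switch / trials

def game2(trials=100000):
    # Same start, but after you open A the millionaire re-fills B
    # with double or half of A on a hidden coin flip.
    keep = switch = 0
    for _ in range(trials):
        a, _ = random.sample([100, 200], 2)
        b = a * 2 if random.random() < 0.5 else a / 2
        keep += a
        switch += b
    return keep / trials, switch / trials

print("game 1 (keep, switch):", game1())  # both close to 150: switching is neutral
print("game 2 (keep, switch):", game2())  # keep ~150, switch ~187.5: switching wins

The always-switch payout in game (2) comes out around $187.50, i.e. 1.25
times the $150 you average by keeping, which is just the asymmetry described
above; in game (1) the two strategies come out the same.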

Stathis Papaioannou

Received on Thu Oct 07 2004 - 09:37:44 PDT
