
Whether to switch

Question

Posted · Report post

Is there no end to circumstances that ask whether we should switch?

This one might take some thought.

Your rich uncle places a sum of money into an envelope, twice that amount into a second envelope, places them both into a black bag and invites you to draw one out of the bag. You open the envelope that you draw and find $1000. You now know the envelope in the bag contains either $500 or $2000. Your uncle gives you the option to take the other envelope in exchange for the one you drew.

Calculate the expected gain that comes from switching.


22 answers to this question


Posted (edited) · Report post

Assuming that your uncle has $3000, so he could stuff one envelope with 1000 and the other with 2000:

By switching you have an EV of 1250: you win 500 half the time and 2000 half the time, and half of 2500 is 1250. So switching is worth an extra 250 on average.

The real issue with all of these puzzles is the definitions. I don't think it's worth anything to switch unless you have a clearly defined range of possible amounts in the envelopes. The possible range of values must have an upper limit, unless we are to accept a possible infinite envelope (I'm not). Let's say the range of possible values is 1-100 inclusive, and all possible pairings are equally likely. Then the best strategy is: if you open an envelope with 50 or less in it, switch; otherwise, stand pat.

If you're not given a clear range, then a perfect strategy is not possible to construct. For instance, if the range were 1-1,000,000 but you didn't know it, all the value you might gain by switching would be lost when you draw an envelope higher than 500,000 and switch into an envelope that must be smaller.
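The bounded-range claim can be sanity-checked with a quick simulation. This is only a sketch under assumptions of my own: the smaller amount is drawn uniformly from 1-50 (so both envelopes stay within the stated 1-100 range) and the drawn envelope is decided by a coin flip.

```java
import java.util.Random;

public class ThresholdSwitch {

    // Returns {neverAvg, alwaysAvg, thresholdAvg} over the given trials.
    static double[] simulate(int trials, long seed) {
        Random gen = new Random(seed);
        long never = 0, always = 0, threshold = 0;
        for (int i = 0; i < trials; i++) {
            int n = 1 + gen.nextInt(50);       // smaller envelope: 1..50
            int[] env = {n, 2 * n};            // the pair stays within 1..100
            int pick = gen.nextInt(2);         // coin-flip draw
            int kept = env[pick], other = env[1 - pick];
            never += kept;                              // stand pat
            always += other;                            // always switch
            threshold += (kept <= 50) ? other : kept;   // switch only on 50 or less
        }
        return new double[] {(double) never / trials,
                             (double) always / trials,
                             (double) threshold / trials};
    }

    public static void main(String[] args) {
        double[] avg = simulate(1_000_000, 42);
        System.out.printf("never=%.2f always=%.2f threshold=%.2f%n",
                          avg[0], avg[1], avg[2]);
    }
}
```

With these assumptions, standing pat and always switching both average about 38.25, while the threshold strategy averages about 47.75, so a known range really is worth money.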

Edited by bubbled

Posted · Report post

Let us assume we knew before drawing that one envelope has $1000 and the other could be either 500 or 2000.


In that case, what is the expected value of the draw? Each envelope is equally likely to be drawn, so it is (1000 + (500 + 2000)/2)/2 = 1125.

In our problem we have now drawn the $1000 envelope. This is disappointing since the expected value is higher. Given the option to switch, it now makes sense to do so!

Posted · Report post

Whatever the amount of money you found in the envelope you pulled, let's call it X. Unless X is 1 penny, there's a 50% chance that the other envelope has the higher or the lower amount. The potential gain from switching is X; the potential loss is X/2. So by switching you gain X * 1/2 - X/2 * 1/2 = X/4 on average.


Posted · Report post

Whatever the amount of money you found in the envelope you pulled, let's call it X. Unless X is 1 penny, there's a 50% chance that the other envelope has the higher or the lower amount. The potential gain from switching is X; the potential loss is X/2. So by switching you gain X * 1/2 - X/2 * 1/2 = X/4 on average.

Can you gain as much by switching back? Using the same reasoning?

There is a paradox: A gain for switching can be anticipated.

Yet, there is a preferred envelope, and if we initially chose it we should not switch.


Posted · Report post

I would say no. The switching in and of itself is not what causes the gain in expected value. Assuming that the chances of 500 or 2000 (in your example) are equally likely in the other envelope, the gain in EV comes from calculating that value, determining it is higher than 1000, and then switching. Even though you haven't looked into the other envelope, its value can be calculated, and switching back would lose the EV gain you made by switching the first time.

If you were to know the exact distribution (possible values and frequencies of those values) of all envelopes, then before you look into any envelope, you can assign it the expected value of an unknown random envelope. At that time, switching would not gain any EV, because absent any additional information, the other envelope has the exact same EV. Once you look into the first envelope, by knowing the overall distribution of all envelopes and adding in the new and valuable information you have gleaned by revealing the contents of the first envelope, you can then make a proper choice as to whether or not to switch.


Posted (edited) · Report post

Let's see who can tell us what's wrong with this line of reasoning.

First, consider the case where the amount of money in the envelope could be any real number, not restricted to integers or whole cents.

There must be some probability distribution of how likely a value of $X is to appear in the envelope with the smaller amount of money and how likely it is to appear in the envelope with the larger amount of money. Since we're dealing with real numbers instead of integers, we should really deal with ranges of values instead of single values, and talk in terms of the probability that the money in an envelope is between any two values $X and $X+Y.

If we have absolutely no prior knowledge of how much money is in each envelope, I could propose that there is an equal chance of finding any value in the smaller envelope. (This should also work if the upper limit of how much money is in the envelopes is large compared to the value that you find.) That is, for any X and Z, the probability that the smaller envelope contains a value from $X to $X+Y is equal to the probability that it contains a value from $Z to $Z+Y. Let the probability that the smaller envelope contains a value from $X to $X+Y or from $Z to $Z+Y be denoted P(S,Y).

Now, what is the probability that the envelope with the larger amount of money contains a value over the range $X to $X+Y – denoted P(L,Y)? It must be exactly the same as the probability that the envelope with the smaller amount of money contains a value over the range $X/2 to $(X+Y)/2, which is equivalent to a range $Z to $Z+(Y/2), or P(S,(Y/2)). That would be equal to one half of P(S,Y).

That proves that P(L,Y) = P(S,Y) / 2. So if you are shown any arbitrary amount of money in the envelope, you will know that you are twice as likely to have the smaller envelope as you are to have the larger envelope. You should most definitely switch.

Next, consider the case where the amount of money in the envelope must be some integer value of dollars and/or cents. Let us also assume for the sake of the problem that there is an equal probability that the smaller envelope contains any possible amount of money, odd or even.

In this case, even before opening the envelope, we know that there is a 50% chance we picked the smaller envelope and a 50% chance we picked the larger envelope. If we see an odd number, we know for sure that we're looking at the envelope with the smaller amount of money – the probability of that happening is 25%. If we see an even number, then we know we initially had a 50% chance of picking either envelope before seeing what's inside, but now that we found an even number we've ruled out the 25% of cases where we picked the smaller envelope and found an odd number. That means that if we see an even number, we're twice as likely to have picked the larger envelope as the smaller envelope. So if you switch from an envelope with $X where X is even, you have a 1/3 chance of increasing your amount by $X, and a 2/3 chance of decreasing your amount by $X/2. The expected change is (1/3)($X) - (2/3)($X/2) = 0, so you could do either.

We also know that if we are restricted to integer values of dollars and/or cents, that if X is even, the probability that the larger envelope contains $X (denoted P(L,X)) must be equal to the probability that the smaller envelope contains $X/2 (denoted P(S,X/2)). If there is an equal probability that the smaller envelope contains any given amount of money, then the probability that the smaller envelope contains $X/2 must be equal to the probability that the smaller envelope contains $X. That is, P(S,X/2) = P(S,X). Combining all the equations in this paragraph: P(L,X) = P(S,X/2) = P(S,X), so P(L,X) = P(S,X). Unlike what we concluded from the previous paragraph, we now know that we are just as likely to see any even amount of money in the smaller envelope as we are to see it in the larger envelope, so you should switch envelopes.

I believe this covers all the possibilities: one where you're more likely to have the smaller amount of money, one where you're more likely to have the larger amount of money, and one where they are equally likely.

Edited by plasmid

Posted · Report post

Here the probabilities on both occasions seem to be equal at 50-50.

But you have already used your 50% chance to select the right envelope, the one holding the bigger amount. If you take a further chance, the probability of winning is reduced to half, i.e. 25%. So switching is not advisable.


Posted · Report post

Whatever the amount of money you found in the envelope you pulled, let's call it X. Unless X is 1 penny, there's a 50% chance that the other envelope has the higher or the lower amount. The potential gain from switching is X; the potential loss is X/2. So by switching you gain X * 1/2 - X/2 * 1/2 = X/4 on average.

Can you gain as much by switching back? Using the same reasoning?

There is a paradox: A gain for switching can be anticipated.

Yet, there is a preferred envelope, and if we initially chose it we should not switch.

Not sure what you mean by "switching back". The 50/50 comes from randomly picking one of 2 envelopes from which we know one has double the money of the other. I don't see a paradox.


Posted · Report post

I think it's not worth it to switch; there's no gain.


Posted · Report post

Whatever the amount of money you found in the envelope you pulled, let's call it X. Unless X is 1 penny, there's a 50% chance that the other envelope has the higher or the lower amount. The potential gain from switching is X; the potential loss is X/2. So by switching you gain X * 1/2 - X/2 * 1/2 = X/4 on average.

Can you gain as much by switching back? Using the same reasoning?

There is a paradox: A gain for switching can be anticipated.

Yet, there is a preferred envelope, and if we initially chose it we should not switch.

Not sure what you mean by "switching back". The 50/50 comes from randomly picking one of 2 envelopes from which we know one has double the money of the other. I don't see a paradox.

The paradox comes from the fact that $1000 is arbitrary.

Suppose I were to point to an envelope before any were opened, and I said "That envelope contains some amount of money; call it $X. The other envelope therefore must contain $2X or $X/2." You could now argue that the expected earnings from picking the other envelope are $5X/4, so you should choose the other envelope. But in reality, have I actually given you any more information than you already had when you only knew that one envelope contains twice as much money as the other?


Posted · Report post

Thanks, plasmid. It's clear now.

It doesn't really matter if you switch or not.

Let's consider 2 players, Alex and Bob. Alex always switches after picking the first envelope and Bob always keeps the envelope he picked originally. The amounts in the envelopes are always the same, and they play this game many times in succession. If the switching strategy were beneficial, then Alex should end up with more money than Bob, but in reality they will have the same amount of money. Since both pick randomly, both strategies have a 50% chance of picking the envelope with the larger amount, so both players will get the large amount 50% of the time and the small amount 50% of the time.
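The Alex-and-Bob experiment is easy to put into code. A sketch with assumed specifics of my own: the pair is fixed at $1000/$2000 and each player draws independently at random every round.

```java
import java.util.Random;

public class AlexAndBob {

    // Returns {alexAvg, bobAvg}: Alex always switches, Bob never does.
    static double[] play(int trials, long seed) {
        Random gen = new Random(seed);
        int[] env = {1000, 2000};         // same amounts every round
        long alex = 0, bob = 0;
        for (int i = 0; i < trials; i++) {
            int pick = gen.nextInt(2);
            alex += env[1 - pick];        // Alex trades his draw away
            bob += env[gen.nextInt(2)];   // Bob keeps his own draw
        }
        return new double[] {(double) alex / trials, (double) bob / trials};
    }

    public static void main(String[] args) {
        double[] avg = play(1_000_000, 7);
        System.out.printf("Alex=%.2f Bob=%.2f%n", avg[0], avg[1]);
    }
}
```

Both averages land near 1500, the mean of the two envelopes, regardless of strategy.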


Posted · Report post

Whatever the amount of money you found in the envelope you pulled, let's call it X. Unless X is 1 penny, there's a 50% chance that the other envelope has the higher or the lower amount. The potential gain from switching is X; the potential loss is X/2. So by switching you gain X * 1/2 - X/2 * 1/2 = X/4 on average.

Can you gain as much by switching back? Using the same reasoning?

There is a paradox: A gain for switching can be anticipated.

Yet, there is a preferred envelope, and if we initially chose it we should not switch.

Not sure what you mean by "switching back". The 50/50 comes from randomly picking one of 2 envelopes from which we know one has double the money of the other. I don't see a paradox.

The paradox comes from the fact that $1000 is arbitrary.

Suppose I were to point to an envelope before any were opened, and I said "That envelope contains some amount of money; call it $X. The other envelope therefore must contain $2X or $X/2." You could now argue that the expected earnings from picking the other envelope are $5X/4, so you should choose the other envelope. But in reality, have I actually given you any more information than you already had when you only knew that one envelope contains twice as much money as the other?

I think k-man is right. There is no infinite switching paradox in this formulation. That is because being able to see the envelope amount ($1000) nails down the value of one envelope. In other words, the amount $1000 is not arbitrary. The moment we see it, Reverend Bayes has already entered the room.

So, let's say that we do some simple statistics. Given that the first envelope has value 1000, then the second envelope either has 500 or 2000.

So let's say we switch because the expected value of the unopened envelope is larger than 1000. Now we have an unopened envelope, and the other envelope has 1000. The value of the unopened envelope is still either 500 or 2000. The expected value of the unopened envelope in our possession is still larger than 1000, so switching again is unwise.

Now, if both envelopes are unopened, then that is a different situation. See discussion below.

Let me try to disentangle the paradox using Bayesian statistics.

The discussion below will look to put a proper Bayesian prior on the envelope values A and B. We will show that when we have a proper prior, there is no infinite switching scenario, regardless of whether the first envelope is opened or not.

Let the envelope values be A and B. We need some prior distribution on the smaller value and the larger value. For concreteness, let's say that the smaller value min(A,B) has a uniform distribution on [0, 10,000]. Consequently, the larger value max(A,B) has a uniform distribution on [0, 20,000]. Their probability density functions are

pmin( x ) = 1/10000 for x in [0, 10000], 0 otherwise

pmax( x ) = 1/20000 for x in [0, 20000], 0 otherwise

So, let's look at the case where we open the first envelope and it is $1000. Given that A = 1000, we need to compute

P( A is smaller number | A = 1000) = pmin( 1000 )/[ pmin( 1000 ) + pmax( 1000 ) ] = 2/3

P( A is bigger number | A = 1000) = pmax( 1000 )/[ pmin( 1000 ) + pmax( 1000 ) ] = 1/3

This coincidentally is the same conclusion that plasmid came to in post #7. The expected value of envelope B is now (2000)*2/3 + (1/3)*(500) = 1500. So we should switch.

However, we see that there is no point in switching a second time. If we compute the expected value of envelope B given that A = 1000, we still get E( B | A = 1000) = 1500, which is more than the $1000 in envelope A, so we keep B.

What is happening here is that we are gaining some information from having seen the value of A. If A happened to be equal to 15,000, for instance, then from our prior we can calculate the following probabilities

P( A is smaller number | A = 15000) = pmin( 15000 )/[ pmin( 15000 ) + pmax( 15000 ) ] = 0

P( A is bigger number | A = 15000) = pmax( 15000 )/[ pmin( 15000 ) + pmax( 15000 ) ] = 1

In this case, if A = 15000, then A is 100% guaranteed to be the larger envelope. We should not switch.

So, let's consider the case where we don't see the value A or B. In that case, we will need to compute the integral of expected value for switching by integrating the equations above for A between [0, 20000]. If A falls within [0, 10000) then switching would on average gain money. However, if A falls within [10000, 20000], then switching would lose money. If we integrate over the entire range, we get expected value of 0 for switching. So that removes the infinite switching scenario.

Now, let me discuss the infinite switching paradox, which hinges on an improper prior distribution for A and B. That is, it assumes that A and B are uniformly distributed over the entire real line, and that leads to all sorts of unintuitive trouble. I propose such a scenario is not possible, since the rich uncle cannot possibly have unbounded amounts of money in the two envelopes.
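The posterior arithmetic above is simple enough to verify numerically. A sketch under the same assumed prior (the smaller value uniform on [0, 10000]); the Riemann sum at the end approximates the expected gain from switching blind, integrating over the marginal density of the opened value.

```java
public class EnvelopePosterior {

    static final double LIMIT = 10_000.0;  // assumed prior: min(A,B) ~ Uniform[0, LIMIT]

    static double pmin(double x) { return (x >= 0 && x <= LIMIT) ? 1 / LIMIT : 0; }
    static double pmax(double x) { return (x >= 0 && x <= 2 * LIMIT) ? 1 / (2 * LIMIT) : 0; }

    // P(the opened envelope holds the smaller amount | we saw x)
    static double pSmaller(double x) { return pmin(x) / (pmin(x) + pmax(x)); }

    // Expected value of the unopened envelope given that we saw x
    static double expectedOther(double x) {
        double ps = pSmaller(x);
        return ps * 2 * x + (1 - ps) * x / 2;
    }

    // Average gain from switching without looking, integrated over the
    // marginal density of the opened value, p(x) = (pmin(x) + pmax(x)) / 2.
    static double blindSwitchGain(int steps) {
        double dx = 2 * LIMIT / steps, total = 0;
        for (int i = 0; i < steps; i++) {
            double x = (i + 0.5) * dx;  // midpoint rule
            total += (expectedOther(x) - x) * (pmin(x) + pmax(x)) / 2 * dx;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(pSmaller(1000));        // 2/3: likely holding the smaller
        System.out.println(expectedOther(1000));   // 1500: switching gains on average
        System.out.println(pSmaller(15000));       // 0: surely holding the larger
        System.out.println(expectedOther(15000));  // 7500: do not switch
        System.out.println(blindSwitchGain(200_000));
    }
}
```

pSmaller(1000) comes out to 2/3 and expectedOther(1000) to 1500, matching the numbers above, while the blind-switching gain integrates to essentially zero, so there is no infinite switching.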


Posted · Report post

I wrote a quick Java program to compute the total amount gained from switching or not over 100,000 trials with an envelope value of 1,000. Although the random number generator is only pseudorandom, I thought it might help nonetheless.

import java.util.Random;

public class ProblemTest {

    public static void main(String[] args) {
        Random gen = new Random();
        int trials = 100000;
        // Never switching keeps $1000 every trial.
        long keepTotal = (long) trials * 1000;
        // Switching yields $500 or $2000 with equal probability.
        long switchTotal = 0;
        for (int i = 0; i < trials; i++) {
            int otherenv = gen.nextInt(2);
            if (otherenv == 0) {
                switchTotal += 500;
            } else {
                switchTotal += 2000;
            }
        }
        System.out.println(keepTotal);
        System.out.println(switchTotal);
        System.out.println((double) keepTotal / switchTotal);
    }
}

Total from not switching: 100,000,000.

Total from switching: 124,854,500.

The ratio (not switching / switching): 0.8009322851799494.

Posted · Report post

Morningstar: what would happen if you changed your program so the amount of money in the first envelope was random? And would that prove that it's always better to switch from whichever envelope you're looking at?

Bushindo: It's certainly better not to switch if you know that there is a ceiling on how much money could be in an envelope and you see an envelope containing more than half of that. But the problem doesn't make any mention of a ceiling. And it may very well be that you don't know how much money is in the bank account; or even if you did, you might only know that the amount of money in the envelopes is small compared to the size of the bank account, but not precisely how small.

That might lead one down the road of looking for a probability distribution with an interesting property. But a complete answer to the question should take into account that the probability distribution could be anything.


Posted · Report post

I wrote a quick Java program to generate the total amount gained from switching or not over 100,000 trials with an envelope value of 1,000. [code quoted above]

Total from not switching: 100,000,000.

Total from switching: 124,854,500.

The ratio: 0.8009322851799494.

I think when you code this problem as plasmid suggested, you have to be careful to specify what kind of random distribution you are drawing from, since that essentially defines your prior assumptions about A and B. Most random number generators have an explicit upper range, so it might be a problem to sample uniformly from an infinite real line.

Morningstar: what would happen if you changed your program so the amount of money in the first envelope was random? And would that prove that it's always better to switch from whichever envelope you're looking at?

Bushindo: It's certainly better to not switch if you know that there is a ceiling for how much money could be in an envelope and you see an envelope containing more than half of that. But the problem doesn't make any mention of a ceiling. And it may very well be that you don't know how much money is in the bank account, or even if you did then you know that the amount of money in the envelopes is small compared to the size of the bank account but not precisely how small.

That might lead one down the road of looking for a probability distribution with an interesting property. But a complete answer to the question should take into account that the probability distribution could be anything.

I think the discussion should be about what prior distribution is most representative of the puzzle conditions.

The discussion above shows that the switching paradox is completely resolved for any well-defined probability distributions pmin( ) and pmax( ). That is, the integrals of pmin( ) and pmax( ) over the entire real line must each equal one, and they can be any well-defined distribution (Gaussian, exponential, Poisson, etc.). If we have those, then for any x we can compute the posterior probabilities straightforwardly:

P( A is smaller number | A = x) = pmin( x )/[ pmin( x ) + pmax( x ) ]

P( A is bigger number | A = x) = pmax( x )/[ pmin( x ) + pmax( x ) ]

I think the issue here is precisely that we are not given any information about the prior distribution of A and B. pmin and pmax are our prior distribution, and we must construct an uninformative prior that reflects our beliefs about the distribution of A and B given the totality of circumstances.

I don't think it is reasonable to believe that pmin and pmax are distributed uniformly on the real line. That is because the universe is finite, and last I checked the number of atoms in the universe is less than 10^100. So the uncle cannot possibly have more than $10^100 in the envelopes. That defines a hard upper limit on the envelope values.

I think, given this compact support, it is more reasonable to define a non-informative prior for the sum of A and B as the uniform distribution on [0, S], where S < 10^100. The solution of the puzzle is then subjective. Given what you know of the uncle, what do you think the value of the sum S is? If S is 2000 or greater, then switch your $1000 envelope. If not, don't switch.

Suppose we remove the physical limitation on the money and allow the values A and B to be uniformly distributed on the real line. In this case, I counter that the resolution of the paradox comes from the fact that the expected value in each envelope is infinite. If both envelopes are unopened, there is no point in switching, since they both have infinite expected value.


Posted · Report post

I believe there is a good reason why a probability distribution was not stated in the problem.

I can think of an "experiment" that I believe would be considered satisfactory by most people, and that doesn't depend on the probability distribution of how much money is in the smaller or larger envelope. The answer would be appropriate if our only information were that $1000 is small compared to the available bank account (ruling out the trivial possibility that the bank account is less than $3000, which would make it impossible for the other envelope to hold $2000 and make it obvious that you should not switch).


Posted · Report post

I can think of an "experiment" that I believe would be considered satisfactory by most people, that doesn't depend on the probability distribution of how much money is in the smaller or larger envelope.

I'd love to hear about this experiment that does not depend on the probability distribution of how much money is in A and B. My feeling is that Reverend Bayes is hiding somewhere, possibly heavily disguised, in the set-up. But I may be wrong; I often am.


Posted · Report post

The experiment is fairly simple. Randomly generate however many "smaller" envelopes you want {s1, s2, s3 ... sn} and for each of them generate a matching "greater" envelope {g1, g2, g3 ... gn} where the value in gx = 2sx. Let the value of all sx and gx be small relative to the bank account. The participant is given a random envelope and given the choice of whether or not to switch -- if he was originally given an envelope sx then he will switch to envelope gx, and if he was originally given an envelope gy then he will switch to envelope sy. Now compare what happens if the participant takes a strategy of switching vs a strategy of staying with the initial envelope.

The probability distribution is itself randomly generated, so the results should generalize to any probability distribution that you face.

One could argue that it doesn't fit the OP because the player isn't presented a value of $1000. I would counter that the precise value that the player finds when he opens the envelope is arbitrary (you could multiply all of the s and g terms by any value you like) and would not affect the conclusions of the experiment.
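A sketch of that experiment in code, with assumed specifics of my own: 100,000 pairs, smaller values drawn uniformly from (0, 1000), and the participant's initial envelope decided by a coin flip.

```java
import java.util.Random;

public class PairedEnvelopes {

    // Returns {stayAvg, switchAvg} over n randomly generated s/g pairs.
    static double[] run(int n, long seed) {
        Random gen = new Random(seed);
        double stay = 0, swap = 0;
        for (int i = 0; i < n; i++) {
            double s = gen.nextDouble() * 1000;  // smaller envelope s_i
            double g = 2 * s;                    // matching greater envelope g_i
            boolean tookSmaller = gen.nextBoolean();
            stay += tookSmaller ? s : g;         // keep the initial envelope
            swap += tookSmaller ? g : s;         // trade it for its partner
        }
        return new double[] {stay / n, swap / n};
    }

    public static void main(String[] args) {
        double[] avg = run(100_000, 3);
        System.out.printf("stay=%.2f switch=%.2f%n", avg[0], avg[1]);
    }
}
```

Both strategies average out near 750 (1.5 times the mean smaller value), so with no envelope opened, switching buys nothing.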


Posted · Report post

The experiment is fairly simple. Randomly generate however many "smaller" envelopes you want {s1, s2, s3 ... sn} and for each of them generate a matching "greater" envelope {g1, g2, g3 ... gn} where the value in gx = 2sx. Let the value of all sx and gx be small relative to the bank account. The participant is given a random envelope and given the choice of whether or not to switch -- if he was originally given an envelope sx then he will switch to envelope gx, and if he was originally given an envelope gy then he will switch to envelope sy. Now compare what happens if the participant takes a strategy of switching vs a strategy of staying with the initial envelope.

The probability distribution is random, and results should generalize to any probability distribution that you face.

One could argue that it doesn't fit the OP because the player isn't presented a value of $1000. I would counter that the precise value that the player finds when he opens the envelope is arbitrary (you could multiply all of the s and g terms by any value you like) and would not affect the conclusions of the experiment.

Can you clarify the part highlighted in red, the instruction to randomly generate the envelopes? Do you mean specifically to generate N random numbers from the uniform distribution between, say, 0 and L?

If I'm writing code for this experiment, I can't generate a random number without telling the computer precisely which probability distribution to use (and the corresponding distribution parameters). Most computer programs, for instance, will allow one to generate a random number uniformly between [0, L], but then you will need to supply the value for the upper limit L. (Reverend Bayes, is that you?)

Randomness comes in many forms (e.g., normal, uniform, exponential, etc. ) and I don't think it is possible to generate random numbers without specifying which probability distribution we are working with.


Posted · Report post

It doesn't matter what random number generation method you use, the results will be the same. And there's no need to code the program if you can tell by looking at it what the results would be.


Posted · Report post

It doesn't matter what random number generation method you use, the results will be the same. And there's no need to code the program if you can tell by looking at it what the results would be.

I think I see where we agree and where we diverge now. This two-envelope paradox has two variants,

A) There are two envelopes, both of which are unopened.

I think we both agree on this one. Your experiment and my Bayesian posterior expectation both say that when we have two unopened envelopes, switching will not gain anything regardless of the prior probability distribution. So we should not switch in this situation.

B) One of the envelopes is opened and has $1000. This is where we disagree.

My stance on this is that seeing the $1000 imparts some information, which may affect your decision, depending on your prior information (see my earlier post for details).

I think your stance on this is that the $1000 does not give any information (from your statement: "I would counter that the precise value that the player finds when he opens the envelope is arbitrary (you could multiply all of the s and g terms by any value you like) and would not affect the conclusions of the experiment."). That implies that the player should not switch regardless of what value he finds in the opened envelope.

But consider the situation where the uncle can only put monetary amounts that are a whole number of cents in the envelope. That is, the envelopes may hold $10.10, $9.01, or $10.99, but never any amount of finer precision like $10.9991 or $100.28631. Let's say that we open the first envelope and find $9.99.

The no-information-gained argument would say don't switch. But we gained some information from seeing the value of $9.99. Obviously, this envelope cannot hold the larger amount, since $9.99 is an odd number of cents, so we should switch. If we change our strategy depending on what we see, then the value of the opened envelope is no longer arbitrary and irrelevant.
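That parity effect shows up clearly in simulation. A sketch with assumptions of my own: smaller amounts drawn uniformly from 1 to 9,999 cents, the draw decided by a coin flip, and a strategy of switching only when the opened amount is an odd number of cents.

```java
import java.util.Random;

public class ParityStrategy {

    // Returns {neverAvg, parityAvg} in cents over the given trials.
    static double[] simulate(int trials, long seed) {
        Random gen = new Random(seed);
        long never = 0, parity = 0;
        for (int i = 0; i < trials; i++) {
            int small = 1 + gen.nextInt(9999);   // smaller envelope, in cents
            int[] env = {small, 2 * small};
            int pick = gen.nextInt(2);
            int kept = env[pick], other = env[1 - pick];
            never += kept;
            // An odd number of cents can only be the smaller amount, so switch.
            parity += (kept % 2 == 1) ? other : kept;
        }
        return new double[] {(double) never / trials, (double) parity / trials};
    }

    public static void main(String[] args) {
        double[] avg = simulate(1_000_000, 11);
        System.out.printf("never=%.2f parity=%.2f%n", avg[0], avg[1]);
    }
}
```

Never switching averages about 7,500 cents while the parity strategy averages about 8,750, so the opened value is far from irrelevant.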


Posted · Report post

I had been considering the scenario where the amount of money in the envelopes could be any real number. In that case, if you have no information about the probability distributions, both an integration of your post 13 over the entire range of possible values in the envelope and the experiment of making random probability distributions show that there is no gain from switching.

However, I'm still not sure I can make an adequate math-to-English translation of those results; in particular, showing how this is fundamentally different from a game where you are given $1000 and asked whether you want to flip a coin to either double or halve your winnings (which is a no-brainer) in a way that makes intuitive sense.

If the amount of money in the envelope is restricted to integers, that's a whole new can of worms, because being even or odd gives information. I'll have to mull over post 7 again and decide which of those two conclusions I like best.

