BrainDen.com - Brain Teasers

superprismatic

Question

Assume you start off with a bankroll of $1,000,000. Consider the following 3 games:

Game 1: Flip a biased coin which lands Heads 45% of the time and Tails the rest of the time. If you play this game, you get $1 to add to your bankroll if you flip Heads, and you lose $1 from your bankroll if you flip Tails.

Game 2: There are 2 cases:

Case 1: If your bankroll modulo 3 is 0, you flip a coin which lands Heads 1% of the time and Tails 99% of the time. You get $1 to add to your bankroll if you flip Heads; otherwise you lose $1 from your bankroll.

Case 2: If your bankroll modulo 3 is 1 or 2, you flip a coin which lands Heads 90% of the time and Tails 10% of the time. You get $1 to add to your bankroll if you flip Heads; otherwise you lose $1 from your bankroll.

Game 3: Flip an unbiased coin (Heads 50% of the time, Tails 50% of the time). If the coin lands Heads, play Game 1; if it lands Tails, play Game 2.

After 1,000,000 plays of each game (always starting with a bankroll of $1,000,000), which of these games should increase your bankroll and which should decrease your bankroll?
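
For anyone who wants to experiment with these rules, here is a minimal Python sketch of a single play of each game as stated above; the function names are just for illustration.

import random

def play_game1(bankroll):
    # Game 1: biased coin, Heads 45% of the time; Heads wins $1, Tails loses $1.
    return bankroll + 1 if random.random() < 0.45 else bankroll - 1

def play_game2(bankroll):
    # Game 2: the Heads probability depends on the bankroll modulo 3.
    p_heads = 0.01 if bankroll % 3 == 0 else 0.90
    return bankroll + 1 if random.random() < p_heads else bankroll - 1

def play_game3(bankroll):
    # Game 3: a fair coin decides whether this play follows Game 1 or Game 2.
    return play_game1(bankroll) if random.random() < 0.5 else play_game2(bankroll)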


14 answers to this question


Quick answer without doing the math:

Game 1: decrease, since the odds are against you.

Game 2: increase, since 2/3 of the time there is a significant probability of winning.

Game 3: increase; a quick estimate of the odds of winning in Game 2 is over 60%.


I am never going to this casino :mad:

Obviously Game 1 isn't going to work out in your favor. You will on average lose $100k, and the chance of breaking even or gaining money is about 10%.

Game 2 is interesting. You will most likely start off losing $1, and since the increment is $1 every time, at best you will always pull back to that $1 million you started out with; since that has a modulo of zero, you have to lose money again. Since it is 10x more likely that you will lose money on modulo 1 or 2 than it is that you will gain money on modulo 0, you will probably end up losing money here too. Again, I'm not in a probability-running mood, but I would assume you would lose about $10,000 on average through 1 million plays of Game 2, and your chances of breaking even on this game are probably less than in Game 1.

Well, this one is obvious. If both games involve you losing money, then letting a coin decide which game you lose on isn't going to net you any cash. However, I am pretty sure you asked this to see whether, by going between the two games -- one where you lose more money overall but have a better shot of breaking even or gaining money, and the other where you have less chance of breaking even but would lose less money in general -- you could gain something. I don't see the point in figuring this out, as my logic tells me that I would lose money. So as far as answering the question goes... don't gamble. :P
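
A quick back-of-the-envelope Python check of the $100k figure for Game 1:

plays, p_win = 1_000_000, 0.45
print(plays * (p_win - (1 - p_win)))   # about -100000: an expected loss of roughly $100k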

They all (or technically both, in reference to Games 1 and 2, since Game 3 merely redirects you to one of them) have the capability to increase and/or decrease your bankroll. You can't really ascertain probabilistic circumstances: you could land Heads one million times, you could land Tails one million times, or you could land a mixture of both with either one outweighing the other. How do we know which one specifically? It's impossible to tell.

Edited by Omniscience

Got a question. You ask:

"After 1,000,000 plays of each game (always starting with a bankroll of $1,000,000), which of these games should increase your bankroll and which should decrease your bankroll?"

Does this mean:

(a) Start with $1M, make 1M plays of game 1; start with $1M, make 1M plays of game 2; start with $1M, make 1M plays of game 3, or

(b) (1M times: (start with $1M, play game 1)), (1M times: (start with $1M, play game 2)), (1M times: (start with $1M, play game 3))?

(b)


I'd pick game 2.

On your first play, you're at mod 3 = 1, so 90% to win and increase $1. Then mod 3 = 2; again 90% to win and increase $1. Now mod 3 = 0, so 99% to lose and decrease $1.

Working it out, the chance of losing 2 games in a row when mod 3 = 1 or 2 is 1%, which is the same as the chance of winning 1 game when mod 3 = 0. So it seems like you're likely to bounce around between $1,000,002 and $1,000,000. That doesn't seem too bad to me.


Game 1 is pretty obviously a losing game.

For game 2, this isn't a totally legit formal calculation, but could help provide insight.

Suppose you start off with N dollars and want to find out the odds that after three plays you would have N+3 dollars or N-3 dollars, thus putting yourself at the same spot modulo 3 but with a net gain or loss of $3.

PW3 = 0.9 * 0.9 * 0.01 = 0.0081

PL3 = 0.1 * 0.1 * 0.99 = 0.0099

So game 2 is also a losing game.

For game 3, if your bankroll is 0 mod 3 then your probability of winning the next game is

PW = 0.5 * 0.45 + 0.5 * 0.01 = 0.23

If your bankroll is 1 or 2 mod 3 then your probability of winning the next game is

PW = 0.5 * 0.45 + 0.5 * 0.9 = 0.675

Since the probability of winning is dependent on your cash in mod 3, using a similar approach as for game 2 of calculating the probability of winning or losing three consecutive games gives

PW3 = 0.675 * 0.675 * 0.23 = 0.1048

PL3 = 0.325 * 0.325 * 0.77 = 0.0813

And this is a winning game.

I realize that my math isn't totally legit since I'm only considering the probability of getting from $N to $N+3 or $N-3 by winning (or losing) three consecutive games, and it doesn't consider more circuitous routes. If you really want to, you can make a spreadsheet to get exact probabilities after each round.
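
Following up on the spreadsheet idea, here is a rough Python sketch that tracks the exact probability distribution of the bankroll after each round of Game 2; the helper name and the 1,000-round example are just illustrative.

def game2_distribution(n_rounds, start=1_000_000):
    # dist maps each possible bankroll to its probability; all mass starts at `start`.
    dist = {start: 1.0}
    for _ in range(n_rounds):
        new_dist = {}
        for bankroll, prob in dist.items():
            p_win = 0.01 if bankroll % 3 == 0 else 0.90
            new_dist[bankroll + 1] = new_dist.get(bankroll + 1, 0.0) + prob * p_win
            new_dist[bankroll - 1] = new_dist.get(bankroll - 1, 0.0) + prob * (1 - p_win)
        dist = new_dist
    return dist

dist = game2_distribution(1000)
expected = sum(b * p for b, p in dist.items())
print(expected - 1_000_000)   # expected net change after 1,000 plays of Game 2

For the full 1,000,000 plays this gets slow, so a simulation (or tracking only the bankroll mod 3) is more practical.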



To answer the question above: I meant (a). I said 1M trials, but what I really want is how these games compare over a lot of trials -- the long run, if you will.



I agree that the analysis above has got to be it. I figured there was something about the combination of the two games that would make everything work. I think a good follow-up to this problem would be: if you had the choice to switch freely between the games, what is the best strategy for making money on these two losing games?


Game 1 is simple enough: you have a 45% chance of winning and a 55% chance of losing.

For game 2, let P0, P1, and P2 be defined as the probabilities of having a bankroll congruent to 0, 1, and 2 respectively (modulo 3) after already having played a large number of games. Doing the math, which is a bit tedious, we get P0 = 910/2018, P1 = 109/2018, and P2 = 999/2018. So the probability of winning your next game is P0*1/100 + P1*9/10 + P2*9/10 = 10063/20180, which is roughly 0.499. In the long run, the actual probability of winning a game will converge to this number, so game 2 is a slightly losing game. It is deceptively close to even money, but the fact remains: it will decrease your bankroll in the long run.

For game 3, define W0 and W12 as the probability of winning a game if your bankroll is congruent to 0 and not 0 respectively (modulo 3). We get W0 = 23% and W12 = 67.5% = 27/40. Now define P0, P1, and P2 as we did for game 2. They will be different for this game. We get P0 = 6245/17489, P1 = 3842/17489, and P2 = 7402/17489. The probability of winning the next game is P0*23/100 + P1*27/40 + P2*27/40, which is roughly 0.516. In the long run, the actual probability of winning a game will converge to this number, so game 3 is a winning game.

Edited by shakingdavid

Game 1: clearly a losing game by the expected value principle.

Game 2: after each 3 rounds, you will gain $3 with a 0.81% chance and lose $3 with a 0.99% chance. If neither of these happens, then you just gain or lose $1, remaining in the same set of 3. Therefore, it is also a losing game.

Game 3: you are given an equal chance of playing one of two losing games. Therefore, it is also a losing game.



You are correct for game 1, of course, but your logic is too hasty on games 2 and 3. Case in point: what if we copy the logic you use for game 2 and apply it to game 3?

Clearly if your bankroll is congruent to 0, you have a winning chance of (45%+1%)/2 = 23%

If your bankroll is congruent to 1 or 2, you have a winning chance of (45%+90%)/2 = 67.5%

So after each 3 rounds you will gain $3 with ~10% chance (.23*.675*.675) and lose $3 with ~8% chance (.77*.325*.325). This would make it a winning game according to your logic for game 2. (It actually is a winning game, but it's not so easy to conclude.)

Edited by shakingdavid

I realize that the credibility of my earlier solution can be doubted since I omitted the math, so here we go with the lengthy math:

Game 2: P, Q, and R (previously P0 through P2) are defined as the probabilities of having a bankroll congruent to 0, 1, and 2 respectively (modulo 3) after already having played a large number of games. Let #n mean "congruent to n mod 3". We know that in order to reach a bankroll #0, you must either lose from a bankroll #1 or win from a bankroll #2. So we get

(1) P = Q*.1 + R*.9, and similarly we get

(2) Q = P*.01 + R*.1, and

(3) R = P*.99 + Q*.9

Using (2) to substitute Q with P*.01 + R*.1 in (1), we get

(4) P = (P*.01 + R*.1)*.1 + R*.9, which can be simplified to

(5) P*.999 = R*.91

Using (3) to substitute R with P*.99 + Q*.9 in (2), we get

(6) Q = P*.01 + (P*.99 + Q*.9)*.1, which can be simplified to

(7) Q*.91 = P*.109

(5) and (7) combine to give us the ratio P to Q to R = .91 to .109 to .999. Since we also know that P+Q+R=1, we merely add .91+.109+.999=2.018 and conclude that P = 910/2018, Q = 109/2018, and R = 999/2018.

Game 3 is done the same way, but with different probabilities (23% to win from #0 and 77% to lose; 67.5% to win from #1 or #2 and 32.5% to lose).
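
To make the arithmetic easy to check, here is a small Python sketch using exact fractions; it simply applies the same elimination as above, written for a general pair of win probabilities (w0 from a bankroll #0, w12 from #1 or #2).

from fractions import Fraction as F

def stationary(w0, w12):
    # Long-run probabilities (P, Q, R) of a bankroll #0, #1, #2, for a game that
    # wins with probability w0 from #0 and w12 from #1 or #2.
    l0, l12 = 1 - w0, 1 - w12
    # The same substitutions as above give, in general,
    #   P*(1 - w0*l12) = R*(w12 + l12^2)  and  Q*(1 - w12*l12) = P*(w0 + l0*l12),
    # so take un-normalized weights and then scale them to sum to 1:
    p = w12 + l12 * l12
    r = 1 - w0 * l12
    q = p * (w0 + l0 * l12) / (1 - w12 * l12)
    total = p + q + r
    return p / total, q / total, r / total

# Game 2: reproduces P = 910/2018, Q = 109/2018, R = 999/2018 (printed in lowest terms).
P, Q, R = stationary(F(1, 100), F(9, 10))
print(P, Q, R, float(P * F(1, 100) + (Q + R) * F(9, 10)))    # win chance about 0.4987

# Game 3: 23% to win from #0, 67.5% to win from #1 or #2.
P, Q, R = stationary(F(23, 100), F(27, 40))
print(P, Q, R, float(P * F(23, 100) + (Q + R) * F(27, 40)))  # win chance about 0.516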


Plasmid and shakingdavid got it correct. Magician would have been correct had Games 1 and 2 been independent. But the fact that Game 1 influences the bankroll used in Game 2 makes them dependent (albeit rather weakly). This type of situation is called a Parrondo Paradox, after the Spanish physicist Juan Parrondo, who discovered the paradox in 1996. Of course, it is not a paradox at all, but can be explained by the dependencies between the games.

By the way, based on a simulation of 100,000,000 plays of each game, the approximate expected $ win per play in each game is:

Game 1: -$0.10

Game 2: -$0.00269

Game 3: +$0.00323
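
For anyone who wants to try a simulation along these lines themselves, here is a rough Monte Carlo sketch in Python (my own approximation of the setup, using 1,000,000 plays per game rather than 100,000,000; the helper names are just illustrative).

import random

def win_chance(bankroll, game):
    # Per-play chance of winning $1 under each game's rules.
    if game == 1:
        return 0.45
    if game == 2:
        return 0.01 if bankroll % 3 == 0 else 0.90
    # Game 3: a fair coin picks Game 1 or Game 2 for this play.
    return win_chance(bankroll, 1) if random.random() < 0.5 else win_chance(bankroll, 2)

def simulate(game, plays=1_000_000, start=1_000_000):
    bankroll = start
    for _ in range(plays):
        bankroll += 1 if random.random() < win_chance(bankroll, game) else -1
    return (bankroll - start) / plays   # average $ won per play

for game in (1, 2, 3):
    print(f"Game {game}: average win per play {simulate(game):+.5f}")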
