BrainDen.com - Brain Teasers

Prime


Posts posted by Prime


  1. I thought lions could pounce. But even if that is not allowed, the lion still must eat.

    Maybe cutting the prey off, as you would with a zebra, will work for this hunt as well.

    After running directly towards the tamer for a bit (half a unit or so), the lion could just move directly to the cage wall. From there, he will project the rendezvous point on the perimeter (assuming the trainer continues to walk along the wall) and sprint directly to that point along the straight line of a chord.
    E.g., if initially the central angle between the lion on the perimeter and the man on the perimeter is a, and the angle between the lion's starting position and the projected meeting point is x, the lion could solve an equation to project the meeting place (all the while continuing to run):
    (x - a) = (2 - 2*cos x)^(1/2). (Using the formula for the chord length and taking into account the unit radius.)
    If the quarry changes his path, the lion can recalculate the meeting point for the new trajectory of the prey, head to it in a straight line, and have his lunch even sooner.
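    A quick numeric check of that intercept equation is easy to sketch. This is my own illustration, not part of the original argument: it solves (x - a) = (2 - 2*cos x)^(1/2) by bisection, for a sample starting separation of a quarter circle. The starting angle and the bracket are assumptions for the example.

```python
import math

def chord(x):
    """Chord length between two points on a unit circle separated by angle x."""
    return math.sqrt(2 - 2 * math.cos(x))

def intercept_angle(a, hi=20.0):
    """Solve (x - a) = chord(x) by bisection: tamer's arc equals lion's chord."""
    g = lambda x: (x - a) - chord(x)   # negative at x = a, positive for large x
    lo = a
    for _ in range(200):
        mid = (lo + hi) / 2
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# tamer starts a quarter-circle ahead of the lion along the wall
x = intercept_angle(math.pi / 2)
```

    The chord the lion runs is never longer than the diameter, while the arc the tamer must cover grows with x, which is why the bisection bracket always contains a root.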


  2. Why should a hungry lion devise the best strategy of escape for a tasty tamer?

    Suppose the man runs on some spiral curve approaching the middle of the cage. Then the lion has the inside track and must, at worst, catch up with the tamer in the center of the cage. And I smell it would take a finite amount of time to reach the center.

    In a desperate attempt to escape retribution, the lion’s oppressor might try running along the perimeter of the cage. Then, if the lion deviates from the imaginary radius, the target could change direction and gain some distance. So cutting the prey off, as you do with a zebra, may not work here.
    If you trace the lion’s path, it will look like a spiral with rings packed more and more densely as it approaches the perimeter. Until the lion comes within pouncing distance...


  3. I be the Lion.

    After walking stealthily to the center of the cage, I draw an imaginary line from the center to the trainer. As the man runs sheepishly along some chord, the imaginary line rotates and I stick to it, while advancing towards my meal.

    There. Someone solve it for me.

    BTW, it was an Ogre last time, not a dog.


  4. There is that little technicality in the above post. And then there are a couple of other points to make. Suppose each of the 6 individual variants deviates within a chosen interval with a probability approaching 1. Don't we need to show that the ratio of all combinations of the deviating variants to all possible combinations (of 6^N, where N tends to infinity) also approaches 1?

    And then, of course, there is that “Law of Big Numbers”. Who passed that law? When was it ratified? How is it enforced? The entire proof is riding on it.

    But, I suppose, that is beyond the scope of this forum, unless there is some clever and funny way of demonstrating that Normal Distribution thingy without using any integrals. So never mind that.

    If I accept the proof for ending up with nothing after riding my winnings an infinite number of times, there are only a few differences remaining in the interpretation of the OP and gambling philosophies.

    1) Does BMAD's casino compel its patrons to play forever once they start? I hope not. At least the OP does not mention it. (Could be in the fine print, though. :)

    2) While I am being held to playing a pure form of riding the winnings (Post #38), somehow Rainman is allowed a modification of betting just half of his bankroll on each turn (Post #40). That is a very good, prudent winning method, but I like my modification (Post #33), with its minimum bet requirement of $1 and a bankroll of $20, better -- it seems faster and more exciting.

    3) Suppose we have limited time, limited resources, and BMAD allows his patrons to leave the game whenever they want to cash in their winnings. Then what is better: showing up at the casino with a bunch of singles and betting exactly $1 on each turn until closing time; or walking in with just $1 and riding your entire bankroll on each turn until satisfied with the amount won (more than a million) or rolling the die for 1000 times, whichever comes first?

    The riding seems more appealing to me. And 1000 consecutive rides look very good. After all, it is a long long way from 1000 to infinity.

    With 1000-roll riding, you cannot lose more than $1;

    your overall chance of ending up ahead looks like 48.78%;

    and your chance of winning more than a Million $$ looks like 4%, or better*.

    Whereas betting $1 at a time, you are basically working for $33 all day long.

    So choose.

    *My simulation does not stop at the Million $$, but keeps rolling until 1000 rolls are done. In so doing, it sometimes loses the millions procured in the interim, or wins crazy amounts to the tune of 10^12, which BMAD's casino is not able to honor.

    Again, I invite anyone who feels up to the challenge to post a theoretical justification (or disproof) of the statistics of a 1000-roll ride presented here.
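    For anyone who wants to reproduce such statistics, here is a minimal Monte Carlo sketch. It is my own code, not the simulation used above; the trial count and seed are arbitrary, and the estimates it produces will differ slightly from the 48.78% and 4% figures.

```python
import random

FACTORS = [0.7, 0.8, 0.9, 1.1, 1.2, 1.5]   # the six equally likely roll multipliers

def ride(n_rolls, rng):
    """Let a $1 bankroll ride for n_rolls consecutive rolls."""
    bankroll = 1.0
    for _ in range(n_rolls):
        bankroll *= rng.choice(FACTORS)
    return bankroll

rng = random.Random(2024)
trials = 20000
results = [ride(1000, rng) for _ in range(trials)]
p_ahead = sum(r > 1 for r in results) / trials        # chance of ending up ahead
p_million = sum(r > 1e6 for r in results) / trials    # chance of winning $1M or more
```

    With 20,000 trials the estimates carry a sampling error of a fraction of a percent, enough to check the claims to two digits.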


  5. ...........

    Theorem: the probability of winning (or even getting at least a fixed non-zero return) approaches 0 as the number of rolls approaches infinity.

    Proof: There are six equally likely outcomes of one roll, let's denote them a1 through a6 (a1 being 0.7 and ascending to a6 being 1.5). Let fi denote the relative frequency of the outcome ai during a sequence of rolls. For example, if we roll 12 times and get 1*a1, 3*a2, 0*a3, 1*a4, 2*a5, and 5*a6, then f1 = 1/12, f2 = 3/12, f3 = 0/12, f4 = 1/12, f5 = 2/12, and f6 = 5/12. The exact number of outcomes ai is the relative frequency fi times N. The final outcome of our game will be G = 0.7^(f1*N) * 0.8^(f2*N) * 0.9^(f3*N) * 1.1^(f4*N) * 1.2^(f5*N) * 1.5^(f6*N) = (0.7^f1 * 0.8^f2 * 0.9^f3 * 1.1^f4 * 1.2^f5 * 1.5^f6)^N.

    The expected value of each fi is 1/6. By the law of large numbers, P(0.1666<fi<0.1667, for all i) approaches 1 as N approaches infinity. But if 0.1666<fi<0.1667, then G = (0.7^f1 * 0.8^f2 * 0.9^f3 * 1.1^f4 * 1.2^f5 * 1.5^f6)^N < (0.7^0.1666 * 0.8^0.1666 * 0.9^0.1666 * 1.1^0.1667 * 1.2^0.1667 * 1.5^0.1667)^N < 0.9998^N. So the probability approaches 1 that your outcome is smaller than a value which approaches 0 as N approaches infinity. Conversely, for any given E>0, the probability approaches 0 that your outcome is at least E.

    I like the reasoning, but the proof by example uses a slanted interval. The true median of 1/6 is not in the middle of the interval from 0.1666 to 0.1667. What happens if you use a truly median interval, e.g., from 9/60 to 11/60? It seems to lead to the opposite conclusion.
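    One way to check where this leads (my own remark, not from either post): what the interval placement cannot change is the expected logarithm of a single roll, which is negative; the interval around 1/6 can be taken as narrow as needed, so the bound survives. A small sketch of that computation, under the same six factors:

```python
import math

FACTORS = [0.7, 0.8, 0.9, 1.1, 1.2, 1.5]

# Expected log-growth per roll; for large N, G behaves like exp(drift * N).
drift = sum(math.log(f) for f in FACTORS) / 6

# Equivalently, the per-roll geometric mean is (0.7*0.8*0.9*1.1*1.2*1.5)^(1/6),
# which is below 1, so the product is dragged toward 0 regardless of the interval.
geo_mean = math.prod(FACTORS) ** (1 / 6)
```

    A wide interval like 9/60 to 11/60 does let the worst-case bound exceed 1, but the law of large numbers holds for arbitrarily narrow intervals around 1/6, and narrow enough intervals keep the bound below 1.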


  6. It always amazes me when people can take what I thought to be a simple fun question and derive so much complicated analysis out of it. Surprisingly, this question came from a discussion with my 7 year old child.

    What makes these topics so interesting to me is my lack of education. Math education in particular.

    Oh sorry, I didn't see that you already had the spoiler in post 22 showing that there's a <50% chance of ending up with $1 or more after repeatedly letting everything ride. So that shows that the median and most likely outcome is in fact a loss.

    Perusing your argument from post 24, imagine a casino offered you a choice between two games, 1000 dice rolls each:

    1) Stake $1 with a 5% chance of winning $1,000,000 or more; OR

    2) Deposit $300 with a 50% chance of winning something between $16 and $50, while in the extremely unlikely cases losing your entire $300 or winning $500.

    Which game would you play? (The percentages here are illustration -- not an actual calculation.)

    Prime, if you have a bankroll of $20 and can run method 2 twenty times, then of course you're going to win. No one here is arguing against that. But method 2 in OP does not say anything about you being able to reset your stake at $1 whenever you feel like it. You get one shot. One. Shot. For all purposes your bankroll is $1. If you lose you lose. The most likely result is you will lose. Your simulations all show that. If you run 10 simulations of 1000 rolls each and 6 of them are losing, it means 6/10 people will lose. The other four people will win. As N grows larger the probability of winning approaches 0, as I have proven mathematically. Which means, if all 7,000,000,000 people in the world run method 2 for long enough, the probability approaches 1 that they will all lose.

    I think what we are arguing about here is called the Normal Distribution, or Gaussian Distribution, or Bell Curve, or some other such name.

    The math formulas describing this concept involve integrals and special numbers such as Pi and e. I was too lazy to even start studying things like that when I was young, let alone now, when it's been years since my daughter finished high school.

    However, I would be interested to see someone here post a mathematical formula for calculating the probability of ending up with the amount X or more after N consecutive turns of riding the winnings. I would make a sincere effort to understand such an equation and, maybe, even try to disprove it.

    If we had just two possible values, it could be a straightforward borrowing of a formula with integrals from an appropriate math book or website. Having 6 possible values seems to complicate matters a bit. It is not all that easy to see exactly how the Payoff is affected by deviations from the Median.

    The Median Payoff tends to zero, as noted by bonanova at the very beginning of this topic. If you ride your winnings to infinity, you are more likely to end up with zero.

    However, I am not at all convinced that the probability of ending up with a positive outcome is zero for infinite trials. To prove that, one must study the correlation between the deviations from the Median Variation and the resulting Payoff. After all, at infinity there will be an infinite number of variations with positive payoffs.

    In other words, even if you show that the probability of the ratio of variants within any given interval around the Median tends to 1, you'd still have to show that the resulting Payoff within that interval is zero.

    For riding your winnings for 12 consecutive rolls, the Average Payoff (EV) is (1.0333...)^12, approximately 1.4821.

    The chance of ending up with $1.48 or more after 12 rolls is about 0.32774 (almost one third.)

    The chance of ending up with $7 or more after 12 rolls is just over 0.0145 (a respectable 1.45%).

    The chance of winning $6 in a game where you bet $1 at a time is 1/6^12, or practically nil. (And that's the most you can possibly win.)

    10 averages of 100 trials of 1000-dice rolls:

    Lowest average payoff: $26,807

    Highest average payoff: $199,503,018.

    Overall percentage of ending up ahead after 1000 rolls: 48.78%.

    In view of the above, I think the winning likelihood of a 1000-roll ride is being underestimated.


  7. Now's my turn to disagree with Prime.

    If you ride your winnings, the most likely outcome is not that you will win millions. The most likely outcome is that you will lose money. It's just that your payoff if you do win is much larger than your odds of losing, like a favorable lottery.

    The mean outcome of always letting everything ride is greater than your initial $1 entry wager.

    The median outcome of always letting everything ride is less than your initial $1.

    If there's doubt about this, it could be tested by simulating a bunch of runs and calculating both the mean and median.

    Which of those two, the mean or median, is "most important" is a philosophical argument.

    But I have run a bunch of simulations. And some numeric analysis as well. Enough to convince myself. See my posts 12, 20, 22, and 33 inside the spoilers.

    Riding your winnings for 1000 rolls seems like a good choice. It won millions in my simulations in very short order.

    At this point I feel that for further discussion, we need to solve analytically for the probability of winning $X or more while riding for N consecutive rolls. E.g., $1,000,000 or more after 1000 rolls. For that, one must find the exact number of 1000-roll variations yielding more than 1,000,000 and divide that number by 6^1000.

    I don't feel up to the task at the moment. But I can provide any such data for up to 12 rolls.


  8. When playing indefinitely, you cannot cash in your winnings and spend the money on this side of Infinity. And on that other side, who will care how much you've won, or what fraction of $1 you have remaining?

    Strange things happen at infinity:

    1. You are expected to end up with nothing if you ride your winnings an infinite number of times.

    2. That is because the most likely outcome is the one where each of the six possible roll values has appeared the same number of times. The probability of that most likely outcome is zero. (That requires a proof though. See the spoiler.)

    3. Your average payoff after an infinite number of rolls should be infinite, if you bet just $1 on every roll.

    4. Your average payoff if you ride your winnings is going to be even more infinite. In fact, it is going to be infinitely greater than that from the previous point (3). That is because when riding your winnings, the infinite variable is an exponent, whereas when betting a fixed amount on each roll the infinite variable is a multiplier.

    5. To see your average payoff, you must play an infinite number of sets of an infinite number of rolls each. You need not live forever to go on that gambling binge. Just roll an infinite number of dice every time, and roll them infinitely fast.

    Let's take N in increments of 6: N = 6, 12, 18, …


    The number of all possible variations for N rolls of a die where each of the 6 values was encountered exactly N/6 times is: A = N! / ((N/6)!)^6.

    The total number of variations for N rolls is T = 6^N. Thus the probability of that median string is M = A / T.
    What is the limit of M as N tends to infinity? I can't tell just by looking at it. Note that the ratio M grows smaller as N gets larger, but less and less so.
    I.e., M(N+6) = M(N)*(N+1)*(N+2)*(N+3)*(N+4)*(N+5)*(N+6) / (N+6)^6.
    The multiplier is less than 1, but it tends to 1 as N gets larger. I say, let the math majors figure out why this expression tends to zero.
    At any rate, you have about 50% chance or better that your very own infinite string of rolls has finished up ahead of that median payoff calculated by means geometrical. Still, very likely, the payoff is going to be zero.
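    For the math majors' part: M can be computed exactly, and Stirling's formula shows it falls off like a constant times N^(-5/2), which does go to zero. A sketch of my own that tabulates M for the first few multiples of 6, in exact rational arithmetic:

```python
from fractions import Fraction
from math import factorial

def median_string_prob(n):
    """Exact probability that each of the 6 faces appears n/6 times in n rolls."""
    assert n % 6 == 0
    a = factorial(n) // factorial(n // 6) ** 6   # number of such "median" strings
    return Fraction(a, 6 ** n)                   # A / 6^N from the formula above

probs = [float(median_string_prob(n)) for n in (6, 12, 18, 24, 30)]
# the sequence decreases toward 0, consistent with the N^(-5/2) fall-off
```

    For N = 6 this gives 720/46656, about 1.5%, and each further multiple of 6 shrinks it, just as the recurrence with its less-than-1 multiplier predicts.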

    Coming back from the Infinity, as I said here before, for all practical purposes, it is impossible to lose in this game. And you must ride your winnings, if you want to win big.

    The Expected Value (average payoff) for 1000 consecutive rolls is (1.0333...)^1000, approximately 1.74*10^14. You cannot run enough experiments to confirm it empirically. Nonetheless, the winnings are good.
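    Both that EV and the roughly $33.33 figure for flat betting mentioned later are one-line checks. This is my own arithmetic, matching the formulas in this thread:

```python
# EV of riding $1 for 1000 rolls: each roll multiplies the stake by 31/30 on average.
ride_ev = (31 / 30) ** 1000          # about 1.74e14

# EV of betting a flat $1 per roll: each bet returns 31/30, i.e. +1/30 profit,
# so 1000 independent bets yield an expected profit of 1000/30.
flat_ev_profit = 1000 * (1 / 30)     # about $33.33
```

    The contrast is the whole point of the thread: the riding EV grows exponentially in N, while the flat-bet EV grows only linearly.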

    In order to help its patrons to win large sums of money, the casino added a small modification to the game. Namely, the minimum bet requirement of $1. So while rolling the dice and riding your winnings, if your total falls below $1, you must add enough cents to make your next bet at least $1.

    Having obtained change for $20 in coins, I set out to play a few games of 1,000 consecutive rolls each, starting with $1. If I rolled a single die by hand, 1,000 rolls would take me a couple of hours. (Could be more, between ordering the drinks and having a conversation with the dealer.) I have repeated the experiment 10 times (a week's worth of visits to the casino).
    The total amounts I had to add to make the minimum bet in the course of the 1000-roll sets ranged from $0.40 to $15.45. Four of the times I suffered a frustrating spell of bad luck, netting losses between $3.52 and $8.26. The other six times I won, netting between $0.11 and $624,192,886.70. There were a couple of other wins in the tens of thousands of $$.
    Running 10 averages for 100 of 1000-roll sets, I found average amounts of additions to the pot between $4.73 and $6.63. The average wins ranged between $2,321,368 and $249,479,144,535. (Of course, you must multiply that by 100 to calculate your total take home.) Most averaged wins were in hundreds of millions or more, and there were no average losses.
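    The minimum-bet variant is easy to simulate. This is my own sketch of the rule as I read it (top the stake back up to $1 from your coin pile whenever it dips below); the numbers it produces for any one seed will of course differ from the hand-played figures above.

```python
import random

FACTORS = [0.7, 0.8, 0.9, 1.1, 1.2, 1.5]

def ride_with_minimum(n_rolls, rng):
    """Ride the winnings, topping the stake back up to $1 whenever it falls below."""
    bankroll, added = 1.0, 0.0
    for _ in range(n_rolls):
        if bankroll < 1.0:
            added += 1.0 - bankroll   # coins taken from the $20 change pile
            bankroll = 1.0
        bankroll *= rng.choice(FACTORS)
    return bankroll, added

rng = random.Random(7)
final, added = ride_with_minimum(1000, rng)
net = final - 1.0 - added             # profit over the $1 stake plus all top-ups
```

    The top-up rule is what caps each set's downside at a few dollars while leaving the exponential upside intact.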

    For comparison a typical 1000-roll set of betting just $1 at a time would net somewhere around $33.33.

    To sum up:

    In this game, having a bankroll of $20 and a reasonable amount of gambling time (a few days):

    1) If you ride your winnings, you will most likely end up with winnings of millions of $$.

    2) When betting $1 at a time, in the same amount of time, you'll almost certainly walk away with a few hundred $$. (Frankly, not fun and not worth your time.)


  9. So you're saying you wouldn't play a lottery with a 1/1,000,000 chance of winning $5,000,000 with a $1 ticket?

    +1

    Precisely. For $1 you'd be buying $5 worth of chances (provided the pot can't be split between several winners.) That's how I play lottery. In gambling EV is what counts.

    .....

    Theorem: the probability of winning (or even getting at least a fixed non-zero return) approaches 0 as the number of rolls approaches infinity.

    Proof: There are six equally likely outcomes of one roll, let's denote them a1 through a6 (a1 being 0.7 and ascending to a6 being 1.5). Let fi denote the relative frequency of the outcome ai during a sequence of rolls. For example, if we roll 12 times and get 1*a1, 3*a2, 0*a3, 1*a4, 2*a5, and 5*a6, then f1 = 1/12, f2 = 3/12, f3 = 0/12, f4 = 1/12, f5 = 2/12, and f6 = 5/12. The exact number of outcomes ai is the relative frequency fi times N. The final outcome of our game will be G = 0.7^(f1*N) * 0.8^(f2*N) * 0.9^(f3*N) * 1.1^(f4*N) * 1.2^(f5*N) * 1.5^(f6*N) = (0.7^f1 * 0.8^f2 * 0.9^f3 * 1.1^f4 * 1.2^f5 * 1.5^f6)^N.

    The expected value of each fi is 1/6. By the law of large numbers, P(0.1666<fi<0.1667, for all i) approaches 1 as N approaches infinity. But if 0.1666<fi<0.1667, then G = (0.7^f1 * 0.8^f2 * 0.9^f3 * 1.1^f4 * 1.2^f5 * 1.5^f6)^N < (0.7^0.1666 * 0.8^0.1666 * 0.9^0.1666 * 1.1^0.1667 * 1.2^0.1667 * 1.5^0.1667)^N < 0.9998^N. So the probability approaches 1 that your outcome is smaller than a value which approaches 0 as N approaches infinity. Conversely, for any given E>0, the probability approaches 0 that your outcome is at least E.

    I think you've misunderstood what I was asking to prove.

    With N rolls there are 6^N total variations. There are T_W variations ending in a payoff P > 1 and T_L variations ending with a final payoff P < 1. T_W + T_L = 6^N.

    Find the limit of T_W/6^N as N tends to infinity.

    I agree that the "Expected Outcome" tends to zero as the number of rolls N tends to infinity. I just don't see it as all that relevant to winning in this game. If this game were offered in casinos, I would have most certainly won millions in a matter of days, starting with a bankroll of $100 or so. And I wouldn't let the entire $100 ride in just one sitting. Money management still counts. However, exponential accumulation of winnings should be the predominant scheme to win big.

    Another sure thing is -- the Casino offering that game would go bust very quickly.


  10. Upon further reflection, I am leaning back to my first post(#3) and the solution therein. The Expected Outcome and Geometric Mean have no bearing on the winning scheme you must adopt in this game. In particular, the Geometric Mean formula does not offer any tangible means of predicting an outcome of a gambling binge, whereas the Expected Value formula works just fine in practice.

    To sum up:
    The betting pattern of staking your entire remaining bankroll on each turn yields an average (Expected Value) payoff of P = $A*((0.7 + 0.8 + 0.9 + 1.1 + 1.2 + 1.5)/6)^N = $A*(1.0333...)^N,
    where $A is the starting amount and N is the number of consecutive rolls of the die. For proof, see post 12 inside the spoiler.
    That is all that counts and you need not read the rest of what I wrote here.

    We must keep in mind that N consecutive rolls of a die offer a total of 6^N variations. E.g., 72 consecutive rolls yield 6^72 possible variations. That is a very large number, far beyond our computers' abilities. Therefore, averaging an infinitesimally small fraction, such as 100,000 samples of 72 dice rolls, does not offer a significant statistical sample.
    Even so, the theoretical average winning factor for 72 dice rolls is approximately 10.6. After running several sets of 100,000 72-roll games, I got average payoffs ranging from 8.2 to 13.7, but most experiments produced values rather close to 10.6.

    The argument in favor of geometric mean formula is that over a very large number of rolls, our six possible values would tend to be represented evenly. And the product (0.7*0.8*0.9*1.1*1.2*1.5) = 0.99792 being less than 1 would drag the resulting payoff down to zero. Therefore, it is reasoned that if you roll the dice many times, you surely will lose your bankroll.
    Actually, the median string of N dice rolls where the 6 values are represented in equal numbers (N/6) is a minute fraction of all possibilities. As N becomes larger that fraction becomes smaller.
    Say we used this idea of the median string to study the outcome of a gambling spree of 12 dice rolls. The geometric mean formula predicts 0.99792^2 = 0.995844. What practical use has that prediction? You should still expect an average win of 1.48 times your bet, just as the EV formula predicts. And overall you will be winning (ending up with more than $1) approximately 49.4% of the time.

    Here is an interesting question for further research.
    In this game what are your chances of finishing up with more than you started after N rolls?
    For N = 1, we know the probability of ending up ahead is 1/2, or 50%; for N = 2, it is 17/36. Observe the set of values my computer has provided:
    Number of Rolls...Chance ending up ahead
    1............................0.5
    2............................0.4722
    3............................0.4861
    4............................0.5069
    5............................0.4949
    6............................0.4892
    7............................0.4929
    8............................0.4947
    9............................0.4961
    10..........................0.4966
    11..........................0.4945
    12..........................0.4941

    For 12 consecutive rolls, the computer took a few minutes to consider before spitting out the answer, and I decided to stop asking. We can see from the table above that the percentage of winning variations does not go straight down, but fluctuates back and forth. Empirical data from my game simulations suggest that the probability of ending up ahead after 1,000 consecutive rolls is somewhere between 0.46 and 0.49.
    It would be interesting to find a formula. If not a formula, then answer to the question:
    Considering that the median string's payoff tends to zero at infinity, does that mean that the ratio of wins to overall variations goes to zero too? Or does it have some other limit? (A proof, if you please.)
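    The table above can be computed exactly without touching all 6^N sequences: group the variations by how many times each face appears, weight each group by its multinomial coefficient, and compare products in integer arithmetic (factors scaled by 10) so there is no round-off at the $1 boundary. This is my own sketch, not the program used for the table; N = 12 runs in well under a second.

```python
from fractions import Fraction
from math import factorial

INT_FACTORS = (7, 8, 9, 11, 12, 15)   # the six payoff factors, scaled by 10

def p_ahead(n):
    """Exact probability of holding more than $1 after n rolls of riding."""
    wins = 0

    def rec(idx, left, counts):
        nonlocal wins
        if idx == 5:
            counts = counts + (left,)          # last face gets the remainder
            prod = 1
            for f, k in zip(INT_FACTORS, counts):
                prod *= f ** k
            if prod > 10 ** n:                 # exact integer test: payoff > $1
                w = factorial(n)               # multinomial coefficient
                for k in counts:
                    w //= factorial(k)
                wins += w
            return
        for k in range(left + 1):
            rec(idx + 1, left - k, counts + (k,))

    rec(0, n, ())
    return Fraction(wins, 6 ** n)
```

    With this, p_ahead(12) reproduces the last table entry exactly, and the same scheme answers the T_W/6^N question for any modest N, since the number of count-vectors grows only polynomially while 6^N explodes.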

    For all practical purposes, you cannot lose in this game (betting your entire remaining bankroll on each turn.)
    The Casino denied my application to allow rolling 100 dice at a time.
    Still, if you manage to roll a single die 10 times per minute, it would take you less than two hours to complete a string of 1000 rolls. It would take you a couple of months to repeat that experiment 100 times, during which you most certainly would win millions (starting each set of 1000 rolls with a bankroll of just $1). And for all practical purposes you don't even risk any money at all over that long run, since you can't lose more than $1 in each set, and about 47.5% of the time you come out ahead.
    In this very practical everyday situation, what kind of useful information does the “Expected Outcome” of $0.70 carry?

    As we increase the number of consecutive rolls, we get less and less of a chance to see the expected average payoff. Yet, all the losses are packed very neatly between 0 and 1, while the wins are spread wide up to the amounts dwarfing the GNP of the entire Galaxy. (As already noted by Rainman.)

    Once again, those who don't believe it's a winning game, can play the Casino side.


  11. This example illustrates the difference between expected value and expected outcome. The expected value is +X/30, where X is the amount you put into the machine. The expected outcome of method 2 is something quite different though. Over the course of six games, you can expect one of each result, netting you 0.7*0.8*0.9*1.1*1.2*1.5*X - X = -0.00208X. Hence, the expected outcome is negative, but the expected value is positive. This might seem counterintuitive but it is not surprising, considering that you can never lose even one full dollar, while your potential winnings are unlimited.

    Suppose you play using method 2 for a very long time, starting with one dollar. What can you expect? Say you play N times, and let E>0 (small number). The probability that your bankroll is less than E approaches 1 as N approaches infinity. In other words, in the long run you will almost certainly lose almost the whole dollar.

    Suppose you try to play using method 2 until you have won 1 million dollars, and then you stop. Once again you're out of luck (unless you're in a whole lot of luck). The probability that your bankroll will ever exceed M approaches 0 as M approaches infinity. For M = $1,000,000, it is very unlikely that you will ever win that much.

    On the other hand, suppose we play using method 1 for a very long time, one dollar every game. What can you expect? Say you play N times, and let W>0 (big number). The probability that your winnings exceed W approaches 1 as N approaches infinity. In other words, in the long run you will almost certainly win as much as you want.

    So why will you get rich by method 1 but not by method 2? Because in method 1 your bankroll is infinite, while in method 2 your bankroll is a measly $1. In method 1, no matter how much you might lose in the short run, you will always be able to bet another dollar for the next game. This means you were already rich from the start, you have an infinite bankroll, and winning a finite amount of money is easy. In method 2, no matter how much you might lose in the short run, you will never bring in any outside money for the next game. This means you have a $1 bankroll, and winning a large amount of money is hard.

    The more you win using method 2, the more you stand to lose (as your bet goes up). But the more you lose, the less you stand to win (as your bet goes down).

    What of Prime's argument about your money growing exponentially with the factor of 1.0333... each game? If we were to actually increase our bet by that factor with each consecutive game, regardless of how the game went, then his argument would hold true. But then again we would have an infinite bankroll, as we would be able to compensate for any losses using outside money. In method 2, whenever we lose our next bet will be smaller, and it will be harder for us to make up for that loss. But whenever we win our next bet will be larger, and it will be easier for us to lose those winnings.

    So after 422 games, the expected value is over +1,000,000 dollars. But we can not expect to have nearly that much. We can expect to have (0.7*0.8*0.9*1.1*1.2*1.5)^(422/6) dollars, which is about 86 cents, because that is the expected outcome. This is an important concept for any gambler to be aware of. You can never expect to gain your full expected value. The expected outcome is always smaller than the expected value, and the reason for this is your limited bankroll. If you are only looking to maximize expected value, you will bet your entire bankroll on every favorable game you come across (higher bet equals higher EV), and inevitably lose all your money at some point.

    Finally, the big question, can you win money using this machine? We must disqualify method 1 because it assumes an infinite bankroll. We can't use method 2 because it expects to lose money. Let's say your bankroll is $1 to start with, and you want it to grow to $1,000,000. Can you do it? Can you make a poor man rich? The answer is yes. Rather than throwing in your whole bankroll each time, suppose you throw in a fraction of it. Let that fraction be k. We have 0<k<1. You have 1-k of your bankroll left over on the side. The possible returns are 0.7k, 0.8k, 0.9k, 1.1k, 1.2k, and 1.5k. Your new bankroll will be either 1-0.3k, 1-0.2k, 1-0.1k, 1+0.1k, 1+0.2k, or 1+0.5k. The expected outcome over six games is (1-0.3k)(1-0.2k)(1-0.1k)(1+0.1k)(1+0.2k)(1+0.5k), which has a local maximum of ~1.0492607 at k~0.491298 (courtesy of Wolfram|Alpha). A good approximation for the optimal strategy is to bet half your bankroll each game, for an expected bankroll growth factor of a little over 1.049 per six games. After no more than 1728 games you can truly expect to be a millionaire.
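    That growth factor is easy to verify numerically. Here is a small sketch of my own (not Rainman's code): it evaluates the six-game product and grid-searches for the maximizing fraction k, reproducing the Wolfram|Alpha figures.

```python
def six_game_growth(k):
    """Bankroll growth over one cycle of the six equally likely outcomes,
    betting the fraction k of the bankroll each game."""
    return ((1 - 0.3 * k) * (1 - 0.2 * k) * (1 - 0.1 * k)
            * (1 + 0.1 * k) * (1 + 0.2 * k) * (1 + 0.5 * k))

# crude grid search over 0 < k < 1 for the best fixed betting fraction
best_k = max((i / 100000 for i in range(1, 100000)), key=six_game_growth)
```

    The product is log-concave in k on (0, 1), so the interior maximum found by the grid search is the global one; at k = 0.5 the growth is already about 1.04925, which is why "bet half" is such a good rule of thumb.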

    Yes, there it is -- Expected Value vs. Expected Outcome.

    I fancy myself as an avid gambler. And in practice I would choose the mix of the two methods, as Rainman has suggested here. (Playing infinite number of times is rather impractical.)

    I have run an experiment playing 422 consecutive rolls 10 times with the following results:

    4 times I had less than $0.01 left of my initial $1 bankroll.

    3 times I had between $0.02 and $0.35 left.

    3 times I won with the largest win of $83.62.

    That left me very much ahead (more than $80, while putting at risk just $10).

    The new question here is, what is the optimal string of consecutive rolls?

    3 times I

    It seems your post was cut off unfinished.

    Expected outcome is the true value for any gambler who seeks to eliminate the "gambling" part. The probability is 1 that your own outcome approaches the expected outcome as the number of games approaches infinity. So in the long run, you are practically guaranteed to win if your expected outcome is positive. The same is not true for expected value.

    Expected value is too influenced by the extremely high payoffs in the extremely unlikely variations. In this case, if you play for example 100 times, the extremely lucky variation where you hit 1.5 every time would yield a net payoff of roughly +$406,561,177,535,215,236. On the other hand, the extremely unlucky variation where you hit 0.7 every time would yield a net payoff of roughly -$1. The average net payoff, or expected value, would be (31/30)^100 - 1, or roughly $27. So the variance (actual payoff - average payoff) is 406,561,177,535,215,209 for the luckiest variation and only -28 for the unluckiest variation. As follows, the expected value is extremely tilted by the high variance from impossibly lucky scenarios. You would need to be insanely lucky just to get anywhere close to the EV.

    Your own experiments illustrate this perfectly. The average net payoff for running 422 consecutive games 10 times is 10*((31/30)^422 - 1), or roughly $10,220,338. Your actual net payoff was just more than $80, falling way short of the EV. Had you kept your entire bankroll running for all 4220 games, you would have lost almost everything. This was not a case of bad luck, but rather quite expected.

    Had you instead bet half your bankroll every time, with a starting bankroll of the same $10, for 4220 games, your expected outcome would have been ~10*1.04924^(4220/6), or +$4,800,000,000,000,000. Feel free to simulate it, you will end up somewhere around that number. Your EV would of course be even higher, 10*((61/60)^4220 - 1), or roughly +$2,000,000,000,000,000,000,000,000,000,000.

    I am not disagreeing with the Expected Outcome concept. (And I have described the same in post #12 under “PERSPECTIVE 1”, before I read your post #8. Actually, I should have read your post first. Now I invite everyone to read the spoiler in post #12, where I derive the formula for the EV with a proof.)

    However, in view of the question in the OP “Can you win in this game?” the Average/Expected Value seems very relevant for practical gambling.

    I also like your system where you bet half of your entire bankroll on each turn. Although, I can't say I follow the 1.049^(N/6) formula for this method. I'd have to think about it.

    Finding an “optimal” system in terms of available time, initial bankroll, Expected Outcome, and Expected Value looks like a more serious mathematical research problem.

    Another interesting and, perhaps, simpler thing to solve would be: What are the chances of ending up ahead after N rolls when staking your entire bankroll?

    (I could tell it's 17/36, or better than 47%, for 2 consecutive rolls and 105/216, or better than 48%, for 3 consecutive rolls.)
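A brute-force enumeration is a quick way to check those counts. Here is a sketch in Python (not the program I used; exact fractions keep the comparisons free of any rounding issues):

```python
from fractions import Fraction
from itertools import product
from math import prod

# The six equally likely payoff factors, as exact fractions.
FACTORS = [Fraction(n, 10) for n in (7, 8, 9, 11, 12, 15)]

def p_ahead(n):
    """Exact probability that n consecutive full-stake rolls end above $1."""
    outcomes = [prod(rolls) for rolls in product(FACTORS, repeat=n)]
    return Fraction(sum(1 for x in outcomes if x > 1), len(outcomes))

print(p_ahead(2))  # 17/36
print(p_ahead(3))  # 35/72, i.e. 105/216
```

For larger n the 6^n tuples grow quickly; a convolution over the log-factors would scale better, but for small n this direct count is the simplest sanity check.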

    I played a few more games against my computer and it owes me a lot of money.

    After winning over $80 in 10 games of 422 consecutive rolls, where the Average Payoff, or EV, is over $1,000,000, I played 10 sets of 1000 consecutive rolls each.

    Four of those sets I lost shamefully -- $1 each.

    The six wins, (discarding the cents, which I left as tips for the dealer,) were $7, $34, $112, $259, $544, and $546.

    After that I put the accumulation-of-rounding-error issue out of my mind and set about playing 10,000-roll sets. Again, I played just 10. In six sets my entire bankroll of $1 dropped way below a penny. The four wins were: $20; $114; $4,490,314; and a whopping $916,248,000,000.

    Am I just lucky, or am I that good, or is there a problem with my computer simulation? The 6 trailing zeroes in the last number are indicative of the rounding error.

    Notably, it takes the computer less than a second to go through 10,000 rolls of a die and spit out my winnings. If I did the same by hand in a Casino, each set of 10,000 rolls would take 3 hours or so. And to play 10 sets, could take a week, or more. But it would be worth it in this case.
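As for the rounding-error worry: it can be ruled out entirely by simulating with exact rational arithmetic, e.g. Python's Fraction. A sketch (the seed and the 10,000-roll session are illustrative, not my original program):

```python
import math
import random
from fractions import Fraction

FACES = [Fraction(n, 10) for n in (7, 8, 9, 11, 12, 15)]

def full_stake_session(n_rolls, rng):
    """Stake the entire bankroll (starting at $1) on every roll; exact result."""
    bankroll = Fraction(1)
    for _ in range(n_rolls):
        bankroll *= rng.choice(FACES)
    return bankroll

rng = random.Random(2014)
final = full_stake_session(10_000, rng)
# The exact value may be far outside float range, so report its magnitude;
# math.log10 accepts arbitrarily large Python ints.
magnitude = math.log10(final.numerator) - math.log10(final.denominator)
print(f"final bankroll ~ 10^{magnitude:.1f} dollars")
```

Each multiplication only reduces against a one-digit fraction, so even a 10,000-roll session runs quickly, and trailing zeroes can no longer be an artifact of truncation.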

    At any rate, who would like to play the part of the Casino in this game? Volunteers, please!


  12. ....

    On the second pull the entire stake is at risk. Correct.

    You subject your $1 to two pulls rather than to a single pull.

    .....

    I was not implying any actual "pulls" in my illustration. Nor did I say anything about withdrawing or adding any amounts to the stake.

    I did not mean it as an actual trial in the game. I meant it as the enumeration of all possible variations with their respective probabilities.


  13. This example illustrates the difference between expected value and expected outcome. The expected value is +X/30, where X is the amount you put into the machine. The expected outcome of method 2 is something quite different though. Over the course of six games, you can expect one of each result, netting you 0.7*0.8*0.9*1.1*1.2*1.5*X - X = -0.00208X. Hence, the expected outcome is negative, but the expected value is positive. This might seem counterintuitive but it is not surprising, considering that you can never lose even one full dollar, while your potential winnings are unlimited.

    Suppose you play using method 2 for a very long time, starting with one dollar. What can you expect? Say you play N times, and let E>0 (small number). The probability that your bankroll is less than E approaches 1 as N approaches infinity. In other words, in the long run you will almost certainly lose almost the whole dollar.

    Suppose you try to play using method 2 until you have won 1 million dollars, and then you stop. Once again you're out of luck (unless you're in a whole lot of luck). The probability that your bankroll will ever exceed M approaches 0 as M approaches infinity. For M = $1,000,000, it is very unlikely that you will ever win that much.

    On the other hand, suppose we play using method 1 for a very long time, one dollar every game. What can you expect? Say you play N times, and let W>0 (big number). The probability that your winnings exceed W approaches 1 as N approaches infinity. In other words, in the long run you will almost certainly win as much as you want.

    So why will you get rich by method 1 but not by method 2? Because in method 1 your bankroll is infinite, while in method 2 your bankroll is a measly $1. In method 1, no matter how much you might lose in the short run, you will always be able to bet another dollar for the next game. This means you were already rich from the start, you have an infinite bankroll, and winning a finite amount of money is easy. In method 2, no matter how much you might lose in the short run, you will never bring in any outside money for the next game. This means you have a $1 bankroll, and winning a large amount of money is hard.

    The more you win using method 2, the more you stand to lose (as your bet goes up). But the more you lose, the less you stand to win (as your bet goes down).

    What of Prime's argument about your money growing exponentially with the factor of 1.0333... each game? If we were to actually increase our bet by that factor with each consecutive game, regardless of how the game went, then his argument would hold true. But then again we would have an infinite bankroll, as we would be able to compensate for any losses using outside money. In method 2, whenever we lose our next bet will be smaller, and it will be harder for us to make up for that loss. But whenever we win our next bet will be larger, and it will be easier for us to lose those winnings.

    So after 422 games, the expected value is over +1,000,000 dollars. But we cannot expect to have nearly that much. We can expect to have (0.7*0.8*0.9*1.1*1.2*1.5)^(422/6) dollars, which is about 86 cents, because that is the expected outcome. This is an important concept for any gambler to be aware of. You can never expect to gain your full expected value. The expected outcome is always smaller than the expected value, and the reason for this is your limited bankroll. If you are only looking to maximize expected value, you will bet your entire bankroll on every favorable game you come across (higher bet equals higher EV), and inevitably lose all your money at some point.

    Finally, the big question, can you win money using this machine? We must disqualify method 1 because it assumes an infinite bankroll. We can't use method 2 because it expects to lose money. Let's say your bankroll is $1 to start with, and you want it to grow to $1,000,000. Can you do it? Can you make a poor man rich? The answer is yes. Rather than throwing in your whole bankroll each time, suppose you throw in a fraction of it. Let that fraction be k. We have 0<k<1. You have 1-k of your bankroll left over on the side. The possible returns are 0.7k, 0.8k, 0.9k, 1.1k, 1.2k, and 1.5k. Your new bankroll will be either 1-0.3k, 1-0.2k, 1-0.1k, 1+0.1k, 1+0.2k, or 1+0.5k. The expected outcome over six games is (1-0.3k)(1-0.2k)(1-0.1k)(1+0.1k)(1+0.2k)(1+0.5k), which has a local maximum of ~1.0492607 at k~0.491298 (courtesy of Wolfram|Alpha). A good approximation for the optimal strategy is to bet half your bankroll each game, for an expected bankroll growth factor of a little over 1.049 per six games. After no more than 1728 games you can truly expect to be a millionaire.
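That optimum can be reproduced without Wolfram|Alpha. For k in [0, 1] every factor stays positive, so the six-game growth function is log-concave and a simple ternary search finds the peak. A Python sketch:

```python
from math import prod

PAYOFFS = (0.7, 0.8, 0.9, 1.1, 1.2, 1.5)

def six_game_growth(k):
    """Bankroll multiplier over six games, one of each payoff,
    staking fraction k of the bankroll each game."""
    return prod(1 + k * (p - 1) for p in PAYOFFS)

lo, hi = 0.0, 1.0
for _ in range(200):  # ternary search; the function is unimodal on [0, 1]
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if six_game_growth(m1) < six_game_growth(m2):
        lo = m1
    else:
        hi = m2
k_opt = (lo + hi) / 2
print(k_opt, six_game_growth(k_opt))  # ~0.491298  ~1.0492607
```

This is the same idea as Kelly betting: maximize the growth rate of the bankroll rather than the expected value of a single bet.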

    Yes, there it is -- Expected Value vs. Expected Outcome.

    I fancy myself as an avid gambler. And in practice I would choose the mix of the two methods, as Rainman has suggested here. (Playing an infinite number of times is rather impractical.)

    I have run an experiment playing 422 consecutive rolls 10 times with the following results:

    4 times I had less than $0.01 left of my initial $1 bankroll.

    3 times I had between $0.02 and $0.35 left.

    3 times I won with the largest win of $83.62.

    That left me very much ahead (more than $80, while putting at risk just $10).

    The new question here is, what is the optimal string of consecutive rolls?



  14. The six equally likely payoffs are .7 .8 .9 1.1 1.2 1.5.

    Their arithmetic mean AM is 1.033333 ...

    Their geometric mean GM is 0.9996530325.

    In Method 1 you pay $1.00 for each pull and remove your winnings.

    In Method 2 you pay $1.00 once and keep your stake at risk. Thus,

    AM (>1) gives you the expected return per dollar per pull for Method 1. (Winnings are added.)

    GM (<1) gives you the expected return per dollar per pull for Method 2. (Winnings are multiplied.)

    Post 3 paid $1.00, pulled the handle twice and then removed the winnings.

    This was done 36 times with the results averaged. This is not Method 2.

    That's not what I said in Post 3.

    It's actually Method 1 for a new set of payoffs: the 36 pairwise products of the original six payoffs.

    No, it is not Method 1. On the second turn the entire bankroll is staked -- not the original betting amount.

    This is a winning game, with an average return on a dollar bet of AM^2.

    AM^2 is in fact the arithmetic mean (AM) of the new set of payoffs.

    So this is one more example of a winning Method 1 result.

    To test Method 2 for 72 pulls of the handle, you bet $1.00 once and keep the result at risk for all 72 pulls.

    That final result is just the product of the 36 pairwise products, which is 0.9753235719 = GM^72.

    And that is the expected result for Method 2: each pull multiplies your stake by GM.

    Again, why does Method 1 win and Method 2 lose?

    Because AM applies to added winnings and AM>1; GM applies to multiplied winnings and GM<1.

    Simulation: Pull the handle 100,000 times with random results:

    • Method 1:

      Average return on each $1.00 bet is 1.032955 <--> AM = 1.0333 ...

    • Method 2:

      Stake goes from $1.00 to $1.414863x10^-31 = GM^100000.

      The 100,000th root of 1.414863x10^-31 is 0.9992899. <--> GM = 0.9996530325.

    There is an interesting, thought-provoking point there, presenting a kind of pseudo-paradox in this problem.

    I stand firmly by my solution in post 3, though I am beginning to have some doubts about its straightforwardness. The illustration I gave in support of the solution was not a proof. And since bonanova has misinterpreted my illustration, it must have been unintelligible. (It happens to me from time to time. In my head a statement I make seems precise and clear, but other people can't make out the ends of it.)

    So in the interest of befuddlement, perplexity, and creative thought...

    A fair white die may be adapted for playing this game. All one needs to do is ignore the numbers engraved on it and write the numbers 0.7, 0.8, 0.9, 1.1, 1.2, and 1.5 on its six sides with a permanent marker.

    Since we have all agreed that method 1, i.e. betting a fixed amount, yields an average factor of F = (0.7 + 0.8 + 0.9 + 1.1 + 1.2 + 1.5)/6 = 1.0333... on each bet, we can concentrate on method 2, where we stake our entire proceeds from previous bets on each next turn.

    A product of all die rolls certainly suggests itself.

    Starting with an initial amount of A and having rolled r1, r2, …, rN, we end up with the final payoff P = A* r1* r2* …*rN. For example, starting with $1 and rolling 1.1, 1.1, and 0.7, we end with $1*1.1*1.1*0.7 = $0.847. (Let's always use $1 as an initial bet to dispense with the variable A.)

    PERSPECTIVE 1

    Suppose, I had plenty of free time and went on a long gambling streak rolling the die N times, where N is a very large number. Since all 6 die sides are equally probable, there would be approximately N/6 of each of the 6 possible values of a die roll. Then employing the commutative property of multiplication we could regroup the factors when calculating the final payoff like so:

    P = (0.7*0.8*0.9*1.1*1.2*1.5)*(0.7*0.8*0.9*1.1*1.2*1.5)*... = (0.7*0.8*0.9*1.1*1.2*1.5)^(N/6).

    With a large N the deviation from this formula would seem insignificant. Noting that ( 0.7*0.8*0.9*1.1*1.2*1.5) = 0.99792, that would make a payoff P for a large N tend to zero.
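A quick numeric check of this perspective (Python; the 422-roll figure is just an example):

```python
from math import prod

faces = (0.7, 0.8, 0.9, 1.1, 1.2, 1.5)
cycle = prod(faces)    # one of each payoff multiplies the stake by 0.99792
gm = cycle ** (1 / 6)  # geometric mean per roll, just below 1

print(round(cycle, 5))               # 0.99792
print(round(gm, 10))                 # ~0.9996530325
print(round(cycle ** (422 / 6), 4))  # typical payoff after 422 rolls, ~0.8638
```

The geometric mean below 1 is exactly why the typical full-stake bankroll decays even though the arithmetic mean per roll is above 1.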

    PERSPECTIVE 2

    N consecutive rolls of a die generate T = 6^N total possible variations. The Average Payoff P_N for N consecutive rolls of the die is defined as:

    P_N = S_1*W_1 + S_2*W_2 + ... + S_T*W_T, where each S_i is the payoff for a single variation of N die rolls, and W_i is the probability of that variation. In this case each W_i is equal to 1/T = 1/6^N.

    Note: S_i is not a payoff for some random experimentally obtained string of N rolls of dice, but rather the payoff for an individual theoretical possibility of such a string. The possible payoffs range from 0.7^N to 1.5^N. Whereas each of those boundary values has only one possible string of N die rolls, most of the final payoff values in between may be obtained by many different strings of N die rolls.

    Let's define the average factor for our game as F = ( 0.7+ 0.8 + 0.9 + 1.1 + 1.2 + 1.5)/6 = 1.0333... .

    Theorem: For N consecutive rolls the Average Payoff P_N = F^N.

    Induction Proof

    1). For N = 1 the rule holds, namely P_1 = F^1 = 1.0333...

    2). Suppose the rule holds for some N >= 1, namely P_N = F^N. Let's prove that then it is also true for N+1.

    The total number of variations for N+1 rolls is 6^(N+1). The collection of all variations may be obtained by multiplying each of the variations of the N-roll set by the 6 possible values of a die roll. Thus the Average Payoff for N+1 rolls becomes:

    P_(N+1) = S_1*0.7/6^(N+1) + S_1*0.8/6^(N+1) + S_1*0.9/6^(N+1) + S_1*1.1/6^(N+1) + S_1*1.2/6^(N+1) + S_1*1.5/6^(N+1) + S_2*0.7/6^(N+1) + ... = S_1*(0.7+0.8+0.9+1.1+1.2+1.5)/6^(N+1) + S_2*(0.7+0.8+0.9+1.1+1.2+1.5)/6^(N+1) + ... = (S_1/6^N)*F + (S_2/6^N)*F + ... + (S_T/6^N)*F = F*(S_1/6^N + S_2/6^N + ... + S_T/6^N) = F*P_N = F*F^N = F^(N+1)

    QED.

    SIMULATION

    When running a simulation, keep in mind that number representations in a computer have a limited precision, 18 digits or so, depending on the data type. Whereas a 72-roll string yields a payoff value with 72 digits past the decimal point, and possibly a lot of digits before it (e.g. 1.5^72). Truncating the interim values would generate a huge error, rendering such a simulation grossly inaccurate. I suggest using a smaller string of die rolls, 12 or so.

    For 12 consecutive rolls, the theoretical average payoff is (1.0333...)^12 = 1.4821264897.

    I wrote a simple computer simulation and ran 1,000,000 sets of 12 consecutive die rolls. The average of the 1,000,000 payoffs came to 1.484941558, which is slightly higher than theoretical, but nowhere near the theoretical of 0.995844326 for the PERSPECTIVE 1.
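A minimal version of that simulation, here in Python (the trial count and seed are arbitrary; my run used 1,000,000 sets):

```python
import random
from math import prod

FACES = (0.7, 0.8, 0.9, 1.1, 1.2, 1.5)

def average_payoff(n_rolls, n_trials, rng):
    """Monte Carlo estimate of the Average Payoff P_N for full-stake rolls."""
    total = 0.0
    for _ in range(n_trials):
        total += prod(rng.choice(FACES) for _ in range(n_rolls))
    return total / n_trials

rng = random.Random(12)
est = average_payoff(12, 200_000, rng)
print(est)  # lands near the theoretical (31/30)**12 ~ 1.4821, not near 0.9958
```

With only 12 rolls per set, double precision is more than adequate, so the rounding concern above does not bite here.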

    PERSPECTIVE 1 seems so plausible, and yet it is a false method for calculation of the Average Payoff. Why? I think, simply, the expected string of N rolls of a die is not representative of the Average Payoff. In the example of 12 consecutive rolls, the expected string of 12 rolls would be the one with 2 of each of our six possible values. The number of ways to construct such strings is 12!/(2!)^6 = 7,484,400. That is a comparatively small number, constituting approximately 0.34% of the total number of 6^12 variations.

    For anyone who is still not convinced, I am ready to play the game with a fair die, staking my entire bankroll on each roll of the die. (To speed up the process, we could roll 12 dice at a time.)


  15. I suspect John Steinbeck and Robert Burns are lurking somewhere nearby. ;)

    Since OP refers to A and B as points rather than cells, I see George and Lenny traveling on the lines,

    as if the lines are paths, and meeting, if they do, on any of the nine points on the major diagonal.

    One can enumerate the number of paths to each point on the grid from the nearer of A and B.

    The edge points all receive a 1, since they each terminate precisely one route.

    Numbers on other points are then the sum of the two numbers immediately upstream.

    This constructs Pascal's Triangle and places the binomial coefficients of increasing order on the diagonals.

    In particular, the number of paths from A or B to the points on the major diagonal are:

    1 8 28 56 70 56 28 8 and 1.

    These numbers sum to 2^8 = 256, as they should, since each path is a unique set of 8 binary decisions.

    The probability of reaching a particular point on the major diagonal is the number on that point divided by 256.

    The probability of meeting, as many have pointed out, is the sum of the squares of the individual probabilities. Thus,

    p(meet) = (1+64+784+3136+4900+3136+784+64+1) / 256^2 = 12870 / 65536 ~ 0.1963...

    If George and Lenny travel from cell to cell, then there are 7 decisions for each path, and in that case

    p(meet) = (1+49+441+1225+1225+441+49+1) / 128^2 = 3432 / 16384 ~ 0.2095...
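Both figures are easy to verify: by the Vandermonde identity, the sum of the squared row-n binomial coefficients is C(2n, n), giving C(16, 8) = 12870 and C(14, 7) = 3432. A Python check:

```python
from math import comb

# Moving on lines: 8 binary decisions each.
p_lines = sum(comb(8, k) ** 2 for k in range(9)) / 2 ** 16
# Moving cell to cell: 7 decisions each.
p_cells = sum(comb(7, k) ** 2 for k in range(8)) / 2 ** 14

print(p_lines)  # 12870/65536 ~ 0.1964
print(p_cells)  # 3432/16384 ~ 0.2095
```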

    The problem variation has a speed ratio of 3:1.

    Analysis now is impossible unless George and Lenny combined take 16 steps, on lines, -- 12 and 4 in this case.

    Now the possible meeting locations are the points on the 4th diagonal from the slower runner's corner.

    The probabilities of the slower person reaching these points are {1 4 6 4 1} / 16.

    The probabilities of the faster person reaching these points is found by continuing Pascal's triangle to the 12th level.

    Retaining only the five central values, these probabilities are seen to be {495 666 672 666 495} / 2994.

    The pairwise products are 495 2664 4032 2664 495, and the meeting-probability denominator is 16 x 2994. Thus,

    p(meet) = (495+2664+4032+2664+495) / (16 x 2994) = 10350 / 47904 ~ 0.21606...

    Interestingly, this is only slightly higher than meeting at equal speed on the major diagonal.

    Does George shoot Lenny only if they meet?

    The corners of the board are dots. George and Lennie have square bottoms exactly equal in size to a board square. They move from one square to another by sliding in a straight line until the entire square is covered with their bottom. This way whenever they are moving into the same square, the collision is unavoidable.

    I concentrate on paths, rather than individual squares or points on the board. Seems easier that way.

    If George moves a total of 4 squares to the right and 3 squares up he is going to end up in the same square regardless of the order, in which he took those steps. The number of possible paths (different orders of steps) in this case is 7C3.

    Notably, Pascal’s triangle consists of combination formulas, and each row adds up to the corresponding power of 2. E.g., the Pascal triangle line (1, 8, 28, 56, 70, 56, 28, 8, 1) actually is (8C0, 8C1, 8C2, ..., 8C8), and it adds up to 2^8.


  16. The average payoff being (0.7+0.8+0.9+1.1+1.2+1.5)/6 = 1.03333...


    So when betting $1 at a time you make on average $0.0333... per each game.
    When you bet all your winnings each time, your money grows exponentially with the factor of 1.0333..., which should bankrupt the casino very quickly. (Starting with $1, after just 422 games you should expect to have over a million dollars.)
    To convince yourself the exponential formula is applicable, calculate in a spreadsheet all 36 possibilities for just two turns, add them up and divide by 36: (0.7*0.7+0.7*0.8+...+1.5*1.2+1.5*1.5)/36 = 1.067777..., which coincides with (1.033...)^2 = 1.06777...
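The spreadsheet check is a couple of lines in, say, Python:

```python
from itertools import product

faces = (0.7, 0.8, 0.9, 1.1, 1.2, 1.5)
# Average over all 36 equally likely two-turn products.
avg_two = sum(a * b for a, b in product(faces, repeat=2)) / 36
print(avg_two)         # 1.067777...
print((31 / 30) ** 2)  # the same number: (1.0333...)^2
```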

  17. Since my last post has not been recognized as the answer, there must be something else to this problem. I suppose, it could be that different coins are encountered with different frequency. E.g., a penny is a lot more common than a half-dollar coin.

    I don’t have any data on how many of each coin type there are in circulation. However, I doubt an adjustment for each individual coin probability would change the final answer to the OP. I believe a 15-coin collection would still have a higher probability of spotting a dime.


  18. To solve an unequal speed case, I feel a need for some disambiguation of the OP.

    1). Moving 3 times faster does not mean moving in 3-square straight segments at a time. George still can zigzag in 1-square steps.

    2). Collision occurs whenever George and Lennie are found anywhere inside the same square at the same time.

    While there may be a general formula for NxN grid and variable speed ratio, I’d rather demonstrate a solution using an example of 8x8 grid and George moving 3 times faster than Lennie. That should show a kind of algorithmic general solution.



    Together George and Lennie must traverse 14 squares at the combined speed of 4 times that of Lennie’s before they collide. 14/4 = 3.5. Thus, at the time of a collision, Lennie will be stepping into his 4th square, which will be the 10th square for George.
    In this scenario there is a total of L = 2^4 = 16 possible paths for Lennie. However, it is not that simple for George. While making a total of 10 1-square steps, he is limited to just 7 in one direction. Thus George’s number of all possible 10-square paths is: G = 10C3 + 10C4 + 10C5 + 10C6 + 10C7 = 912.
    The number of colliding paths is: C = 4C0*10C3 + 4C1*10C4 + 4C2*10C5 + 4C3*10C6 + 4C4*10C7 = 3432, where 4C0*10C3 is the number of path combinations when Lennie moves straight down and George moves 7 squares to the right and 3 squares up, etc.
    And the final probability is P = C/(L*G) ~ 0.2352.
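A quick check of those counts with binomial coefficients (Python):

```python
from math import comb

# Lennie: 4 one-square steps; George: 10 steps, at most 7 in one direction.
L = 2 ** 4                                               # 16 paths
G = sum(comb(10, k) for k in range(3, 8))                # 912 paths
C = sum(comb(4, j) * comb(10, 3 + j) for j in range(5))  # 3432 colliding pairs
p = C / (L * G)
print(L, G, C, round(p, 4))  # 16 912 3432 0.2352
```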

  19. ....

    ...

    It would be interesting to see your code's four payoff values for 10 9 7 and 10 9 8 when the first two cards are above and below 14. The code would have to be modified from "search" mode to where 10 9 are given, and results sorted into two categories with respect to 14. I think I see how to calculate it directly, at least the EV for 10 9. The EV contribution for the third draw, in the two cases is a little complicated, and I didn't finish it last night.

    It suddenly dawned on me what you meant. The (10, 9, 8) and (10, 9, 7) are not the actual cards drawn, but rather representations of two different strategies. So, (10, 9, 8) means: stay on 10 and up on the first card, on 9 and up on the second, and on 8 and up on the third.

    In this game, there are only 6 actual valid combinations where it is preferable to stay on 7 on the third turn, rather than draw a fourth card.


    E.g., (9,8,7) occurring at the probability of (4/52)*(4/51)*(4/50) with the 4th draw value: 350/49 ~ 6.979591837.
    The table of all six cases is as follows:
    1st card  2nd card  3rd card  4th-draw avg  Probability
       9         6         7      6.979591837   0.000482655
       9         7         7      6.959183673   0.000361991
       9         8         7      6.938775510   0.000482655
       8         7         7      6.979591837   0.000361991
       8         8         7      6.959183673   0.000361991
       7         8         7      6.979591837   0.000361991

    Multiplying the probability of each case by the difference between 7 and the 4th draw average, and adding the six results together, yields approximately 0.0000837258.
    If you subtract that from the analytical result of the average payoff of 10.2070305 that I got, it agrees with your computer simulation result of 10.2069.
    So the strategy S(10,9,8) has a disadvantage of approximately 0.008 of a cent compared to the strategy where you stop on your third turn on 7 or up if the first two cards add up to 15 or more, and stop on 8 and up otherwise.
    And of course, the strategy S(10,9,7), where you stay on 7 and up on your third turn regardless of the cards drawn in previous two turns, is even worse.

  20. Hard to say. Of course, each woman can count several times, if she has several brothers.

    Judging from the sample of population just around me, the answer is NO. However, that may not be accurate, since I am surrounded by people with zero, or one sibling.

    Most men I've met have no sisters. Of those who do, most have just one sister. And the general assumption here is that there are about the same number of men and women overall.
