BrainDen.com - Brain Teasers

# bushindo

VIP


## Everything posted by bushindo

1. I misread the OP on the Prisoner on a death row problem ( http://brainden.com/forum/index.php?showtopic=8355 ), and thus solved a completely different problem. This is that problem. The initial conditions are the same as kidsrange's: there are 100 prisoners who are sentenced to die tomorrow. However, the warden has decided to give them a chance to live. He will put each of their names on a slip of paper inside an opaque, numbered (from 1-100) jar. Each prisoner will be able to open 50 jars in order to try to find his name. The prisoners will each do this individually, and in sequential order. 1) Each prisoner does not have to open the 50 jars sequentially. Each prisoner can remove the 50 slips from the jars he opens and place them back in any order he desires. However, at the end, each jar must contain exactly 1 slip, and no modification can be made to the jars' appearance, placement, orientation, etc. 2) After opening and closing the 50 jars, each prisoner selects the jar that he thinks contains his name. That jar is not removed from the game. After this, the prisoner is moved to a new room and has no communication with those who have yet to make their selection. At the end of the selection process, prisoners who correctly identify their jar will live, and those who don't will die. Devise a strategy to save, on average, the maximum number of prisoners.
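As a baseline for comparison (this is not the answer to the modified problem above), the classic cycle-following strategy from the standard version of the puzzle can be simulated; a minimal Python sketch, with the function name and trial count being my own illustrative choices:

```python
import random

def cycle_strategy_average(n=100, openings=50, trials=1000):
    """Monte Carlo estimate of the average number of prisoners saved by
    the classic cycle-following strategy: prisoner i first opens jar i,
    then whichever jar number is written on each slip he finds."""
    saved_total = 0
    for _ in range(trials):
        slips = list(range(n))
        random.shuffle(slips)          # slips[j] = name inside jar j
        for prisoner in range(n):
            jar = prisoner
            for _ in range(openings):  # each prisoner may open 50 jars
                if slips[jar] == prisoner:
                    saved_total += 1   # found his own name
                    break
                jar = slips[jar]       # follow the cycle
    return saved_total / trials

print(cycle_strategy_average())
```

Since the cycle containing any given prisoner in a uniform random permutation has length uniform on 1..100, this baseline saves about 50 prisoners on average; the puzzle above asks how the slip-rearranging rule can improve on this.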
2. So PW(19) = -(a/b) PW(20). Substituting into the first equation gives x = PW(20) - y(-(a/b) PW(20)) = PW(20) + (ya/b) PW(20) = ((ya + b)/b) PW(20), so PW(20) = bx / (ya + b). The math is too nasty for me to actually work out by hand, but this shows that if you have something like Matlab to plug the numbers into, you will get a solution for PW(20), as long as ya + b is not zero. Which it isn't, because we know that a solution must exist. (The only thing I was worried about disproving was non-independence of the equations.) Then it's just a question of whether \$1 x probability of winning > \$20 x probability of losing, i.e. whether 1 x PW(20) > 20 x (1 - PW(20)). This is correct. There are several ways to solve the equations plasmid set up.
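For readers who want to actually plug the numbers in, here is a sketch in Python (NumPy standing in for the Matlab mentioned above) that solves the full system of PW(k) equations directly, where PW(k) is the chance of reaching \$21 before ruin starting from bankroll k:

```python
import numpy as np

# Recurrence for one $1 hand with win probability .48:
#   PW(k) = 0.48*PW(k+1) + 0.52*PW(k-1)
# with boundary conditions PW(0) = 0 (ruin) and PW(21) = 1 (target reached).
p, target = 0.48, 21
A = np.zeros((target + 1, target + 1))
b = np.zeros(target + 1)
A[0, 0] = 1.0                  # PW(0) = 0
A[target, target] = 1.0
b[target] = 1.0                # PW(21) = 1
for k in range(1, target):
    A[k, k] = 1.0
    A[k, k + 1] = -p
    A[k, k - 1] = -(1 - p)
PW = np.linalg.solve(A, b)
win = PW[20]                   # chance of ending the day up $1
expected = 1 * win - 20 * (1 - win)
print(round(win, 4), round(expected, 4))
```

With these numbers PW(20) comes out to about 0.905, so the expected result is roughly -\$0.99 per day: the scheme does not beat the house.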
3. That's right. Care to elaborate more on the calculation?
4. Just some clarification. The mouse cannot immediately re-enter the room it just searched. It can, however, enter a room that it searched more than 1 turn ago. For example, if the mouse is in room 5, to start a new search it will randomly select a room between 1 and 20, room 5 not included. Suppose it chooses room 10; it is then possible for the mouse to enter room 5 again after it leaves room 10. By the way, the case of n=3 is already solved. See here http://brainden.com/forum/index.php?showto...mouse&st=10 For those checking their formulas, the average time required for n=3 is 9 minutes.
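The 9-minute figure for n=3 can be checked numerically by solving the linear system of expected remaining times; a minimal sketch (the function name `avg_search_time` is my own):

```python
import numpy as np

def avg_search_time(N):
    """Expected time (minutes) for the mouse to find the cheese among N
    rooms, where room k takes k+2 minutes to search, the first room is
    chosen uniformly from all N, and each subsequent room uniformly from
    the N-1 rooms other than the one just searched."""
    t = np.arange(1, N + 1) + 2.0          # t[k-1] = search time in room k
    # Unknowns L_j (j = 2..N): expected remaining time after leaving room j.
    m = N - 1
    A = np.eye(m)
    b = np.zeros(m)
    for idx, j in enumerate(range(2, N + 1)):
        b[idx] += t[0] / (N - 1)           # next room is 1: search ends there
        for k in range(2, N + 1):
            if k == j:
                continue                   # cannot re-enter the room just left
            b[idx] += t[k - 1] / (N - 1)
            A[idx, k - 2] -= 1.0 / (N - 1)
    L = np.linalg.solve(A, b)
    total = t[0] + sum(t[j - 1] + L[j - 2] for j in range(2, N + 1))
    return total / N                       # first room uniform over all N

print(avg_search_time(3))
```

For N=3 this reproduces the 9 minutes quoted above exactly.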
5. This is a slight modification of the Mouse and cheese problems posed earlier by psychic_mind. I'd like to see a generalization of his problem to the case of any arbitrary number of rooms. Assume that there are N rooms lined up in a hall way, each has a room number ranging from 1 to N. The first room, room 1, has the cheese. If the mouse goes into room 1 it searches for 3 minutes before finding the cheese. If it goes into room number 2, it searches for 4 minutes and then leaves. In general, if the mouse goes into room n, it searches for (n+2) minutes. The mouse first start by randomly choosing a room and search. Assume that any time the mouse leaves a room, it randomly select another room from all 20 except the one it just searched. Or in physic_mind's words, "THE MOUSE NEVER RE-ENTERS THE ROOM IT JUST LEFT". What is the average amount of time it takes to find the cheese, given a number of room N?
6. It is true that he either loses \$20 or wins \$1 every day. Let p be the probability that he wins 21 dollars in a single day. The chance that he'll lose all 20 dollars is (1 - p), since we are working under the assumption that he continues to play until one of those two events occurs. Expected winnings per day = -\$20 * (1 - p) + \$1 * p. The key is finding p, the probability that he'll reach 21 dollars on any given day. If p is close enough to 1, then the expected earnings might be positive, giving this scheme an advantage over the house. Let's assume that the bet amount is always \$1. A win always gives him a net gain of \$1, and a loss a net gain of -\$1.
8. Sorry, my explanations are not very clear. Maybe the reasoning is not very clear in my head either... 0.52^n * 0.48^(n+1) is actually the probability of 1 particular sequence of n losses and (n+1) wins, where the order matters. For instance, the chance of 1 win followed by 1 loss and then 1 win is .48 * .52 * .48 = .48^2 * .52. The chance of 2 wins and 1 loss, where order is not important, is equal to C(2,1) * .48^2 * .52, where C(2,1) means 2 `choose' 1. We need a combinatoric term in the bold part of this argument. It is also possible for n in the bold part to exceed 19. For instance, we can win 100 games and lose 99 games, and still end up with 21 dollars. The answer to question 1 is correct. The answer to part II is not, however; the true probability is slightly higher.
9. This problem is inspired by a recent trip to Las Vegas. Mr. G is an avid gambler. After studying blackjack for a while, he realizes that the casino has a minor edge, even if he plays with an optimal strategy. Consequently, he devises the following scheme. For simplicity, assume that each blackjack hand costs 1 dollar to play, with no doubling down or splitting allowed. A win always pays double the bet. His chance of winning each game is .48, and his chance of losing is .52. Every day, he takes a bankroll of 20 dollars down to the casino. He plays as long as it takes until he either loses his entire bankroll or reaches a total bankroll of 21 dollars. As soon as he reaches either state, he packs up his belongings and retires to his room. So, for instance, suppose one day he loses 3 hands in a row and then wins 4 hands afterward; he would then have 21 dollars total, at which point he would retire for the day. 1) Does this strategy allow him to beat the house advantage, i.e., get a positive expected winning per day? 2) Suppose that he uses this strategy for 100 days straight; what are his expected winnings, or losses, at the end of the 100-day period? Assume that he has a big enough bank account to begin with that he can always start with 20 dollars each day, even in the worst-case scenario.
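The scheme can also be simulated directly; a rough Monte Carlo sketch (the function name and day count are my own illustrative choices):

```python
import random

def one_day(bankroll=20, target=21, p_win=0.48):
    """Play $1 blackjack hands until ruin or until the target is reached."""
    money = bankroll
    while 0 < money < target:
        money += 1 if random.random() < p_win else -1
    return money - bankroll        # +$1 on a good day, -$20 on a ruin day

days = 20000
avg_daily = sum(one_day() for _ in range(days)) / days
print(avg_daily)                   # negative on average with these odds
```

Multiplying the average daily figure by 100 gives a rough answer for the 100-day question; the exact value follows from the PW(k) equations discussed in this thread.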
10. We can improve on what Avalanche5x did a bit further: the total amount can be raised by 1, since we have some extra information about the normal stones.
11. You are quite correct that the idea of infinity is that of a number which grows without any upper bound. However, the number of people on earth, even counting the deceased, is microscopically small in the grander scheme of things. The question here is potential. There is an upper bound on how many humans can ever exist in this universe. The total number of atoms in the universe, for instance, has an upper bound of somewhere around 10^100, or a googol. I think it is fairly safe to say that the cumulative total human population won't ever exceed the total number of atoms in the universe. Even then, a googol is trivially small compared to what 'infinity' could be, since we can easily construct larger numbers such as (10^100 * 10^100), or even (10^100^100), and so on. I prefer to approach this riddle from a mathematical perspective. All you need to do is prove that 1 = 2, proofs of which are not hard to find. Once you have that, proving 1 + 1 = infinity is easy. Granted, it is based on the incorrect proof of 1 = 2, but I feel it is more satisfying.
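For reference, the classic deliberately fallacious proof that 1 = 2 alluded to above (one of several common versions) runs as follows; the flaw is the division by a - b = 0 in the fifth line:

```latex
\begin{align*}
a &= b \\
a^2 &= ab \\
a^2 - b^2 &= ab - b^2 \\
(a+b)(a-b) &= b(a-b) \\
a + b &= b \qquad \text{(invalid: divides both sides by } a - b = 0\text{)} \\
2b &= b \\
2 &= 1
\end{align*}
```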
12. You know, technically, the grandmother is also a 'daughter' of somebody. So, to be correct, the description for this group should have been 2 mothers and 3 daughters. Assuming that every person is either a son or a daughter, there is no way for a group of 3 unique people to consist of only 2 daughters and 2 mothers.