
Posts posted by bushindo


I should say that he does not necessarily need to plough the whole field. He will still get paid for what he has done even if he doesn't finish.
Anyway, good start so far, guys.
If he moves diagonally, his machine would clear an area of sqrt( 2 ) = 1.41 m^2 for every meter he travels. That is because if we move the square machine diagonally, with a corner pointing forward, the diagonal of the square sweeps out a wider strip. That would be a net income of 1.41*0.02 - 0.01 = 0.0182 dollars per meter.
So, suppose that he sweeps the field row by row, but orients his machine so that the diagonal is perpendicular to the direction of travel. He would need to travel roughly 100 * 100/sqrt( 2 ) = 7071 meters, so he needs to pay about 71 dollars for fuel. There is a little bit of distance traveled from row to row too that I'm not including.
He covers the entire field, so he gets paid 200 dollars. That is a net gain of about 129 dollars. Refinements of the solution should change the answer a bit, but I expect it to stay close to 129.
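A quick numerical check of the arithmetic above (the field size, pay rate of $0.02 per m^2, and fuel cost of $0.01 per meter are taken from the thread; the variable names are mine):

```python
import math

side = 100.0           # the field is 100 m x 100 m
pay_rate = 0.02        # dollars earned per square meter cleared
fuel_cost = 0.01       # dollars of fuel per meter driven

swath = math.sqrt(2)   # width cleared per meter with the 1 m square machine held diagonally
distance = side * side / swath              # meters driven to cover the field row by row
net = side * side * pay_rate - distance * fuel_cost

print(round(distance))                      # -> 7071
print(round(net, 2))                        # -> 129.29
```

This ignores the row-to-row turns, as in the post, so the true net gain is slightly lower.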

This is correct. There are several ways to solve the equations plasmid set up. I'll call the probability of winning a hand "x" and the probability of losing a hand "y". Define PW(n) as the probability of winning for the day (reaching $21) if you currently have $n.
Then we know that PW(20) = x + yPW(19)
Likewise PW(19) = xPW(20) + yPW(18)
And in general PW(n) = xPW(n+1) + yPW(n-1) unless n=20 or n=1.
For cases other than n=20 and n=1, we can rearrange this to
0 = -xPW(n+1) + PW(n) - yPW(n-1)
For the case n=20, we can take PW(20) = x + yPW(19) and rearrange it to
x = PW(20) - yPW(19)
And for the case n=1, we can take PW(1) = xPW(2) and write it as
0 = -xPW(2) + PW(1)
Set it up as a system of linear equations: 20 equations in 20 unknowns.
x = PW(20) - yPW(19)
0 = -xPW(20) + PW(19) - yPW(18)
0 = -xPW(19) + PW(18) - yPW(17)
0 = -xPW(18) + PW(17) - yPW(16)
0 = -xPW(17) + PW(16) - yPW(15)
etc.
0 = -xPW(4) + PW(3) - yPW(2)
0 = -xPW(3) + PW(2) - yPW(1)
0 = -xPW(2) + PW(1)
I know that you can use linear algebra to take all of the equations of the form 0 = ... (all of the equations except the first one), multiply them by constants, and add them together to get an equation of the form 0 = aPW(20) + bPW(19).
Spoiler for Proof of that:
Take the next-to-last equation, multiply it by 1/y, and add it to the last one to get rid of PW(1), leaving 0 = cPW(3) + dPW(2), where c = -x/y and d = (1/y) - x. Then take the third-to-last equation, multiply it by d/y, and add it to the equation above to get 0 = ePW(4) + fPW(3) (the math gets nasty). Keep doing that until you've added the second equation to eliminate PW(18) and are left with something of the form 0 = aPW(20) + bPW(19).
So PW(19) = -(a/b)PW(20). Substituting into the first equation gives
x = PW(20) - y( -(a/b)PW(20) ) = PW(20) + yaPW(20)/b = (ya+b)PW(20)/b
PW(20) = bx / (ya+b)
The math is too nasty for me to actually work out by hand, but that should prove that if you have something like Matlab to plug the numbers into, you will get a solution for PW(20), as long as ya+b is not zero. Which it isn't, because we know that a solution must exist. (The only thing I was worried about disproving was non-independence of the equations.) Then it's just a question of whether $1 x probability of winning > $20 x probability of losing, i.e., whether 1 x PW(20) > 20 x (1 - PW(20)).
The following system of equations is what we need to solve.
x = PW(20) - yPW(19)
0 = -xPW(20) + PW(19) - yPW(18)
0 = -xPW(19) + PW(18) - yPW(17)
0 = -xPW(18) + PW(17) - yPW(16)
0 = -xPW(17) + PW(16) - yPW(15)
etc.
0 = -xPW(4) + PW(3) - yPW(2)
0 = -xPW(3) + PW(2) - yPW(1)
0 = -xPW(2) + PW(1)
We can write the above system of equations in the matrix form
A t = b
where t is a 20 x 1 vector of the unknown variables PW(n),
t = [ PW(20), PW(19), ... , PW(1) ]
b is also a 20 x 1 vector, holding the constants from the left-hand side of each equation,
b = [ x, 0, 0, 0, ... , 0 ]
and A is a 20 x 20 matrix with ones on the diagonal, -y immediately to the right of each diagonal entry, and -x immediately to the left. All other entries are 0.
A is a band matrix, and it can easily be proven that A is invertible, allowing for a unique solution. Consequently,
t = A^{-1} b
An alternative is to iteratively solve the following system
PW( 21 ) = 1
PW( 0 ) = 0
PW( n ) = x * PW( n + 1 ) + y * PW( n - 1 )
for n between 1 and 20.
Solving the system from the bottom up by substitution is messy, as plasmid indicated. However, my attempts at it show that that approach ends up with some form of continued fractions.
Solving the above, we see that the chance of leaving the table each day with 21 dollars is about .905. Consequently, the expected value of each day's winnings is 1(.905) - 20(.095), or about -0.99 dollars. After 100 days of applying this strategy, we expect to see a loss of roughly 99 dollars.

Took a little while but I'm quite sure that this is right. I have tested it for n = 1, 2 and 3.
Time = 3 + (N+6)(N-1)(N-1)/(2N)
That's right. Care to elaborate more on the calculation?

Just some clarification. The mouse cannot immediately re-enter the room it just searched. It can, however, enter a room that was searched more than 1 turn ago. For example, if the mouse is in room 5, then to start a new search it will randomly select a room between 1 and 20, room 5 not included. Suppose it chooses room 10; it is then possible for the mouse to enter room 5 again after it leaves room 10.
By the way, the case of n=3 is already solved. See here
http://brainden.com/forum/index.php?showto...mouse&st=10
For those checking their formulas, the average time required for n=3 is 9 minutes.
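One way to check such formulas is to compute the exact expectation directly, by solving the linear equations for the expected remaining time after the mouse leaves each room. This is a sketch (the function name and the system setup are mine; room n takes n+2 minutes and room 1 holds the cheese, as in the OP):

```python
from fractions import Fraction

def expected_search_time(N):
    """Exact expected time to find the cheese among N rooms when the
    mouse never re-enters the room it just left."""
    if N == 1:
        return Fraction(3)
    m = N - 1
    # Unknowns E[j]: expected remaining time given the mouse just left room j (j = 2..N).
    # Equation: (N-1)*E_j - sum_{k>=2, k!=j} E_k = 3 + sum_{k>=2, k!=j} (k+2)
    M = [[Fraction(0)] * m for _ in range(m)]
    rhs = [Fraction(0)] * m
    for idx, j in enumerate(range(2, N + 1)):
        M[idx][idx] = Fraction(N - 1)
        rhs[idx] += 3                      # the final visit to room 1
        for kdx, k in enumerate(range(2, N + 1)):
            if k != j:
                M[idx][kdx] -= 1
                rhs[idx] += k + 2
    # Naive Gaussian elimination (the matrix is diagonally dominant, so no pivoting needed).
    for col in range(m):
        for r in range(col + 1, m):
            f = M[r][col] / M[col][col]
            for c in range(col, m):
                M[r][c] -= f * M[col][c]
            rhs[r] -= f * rhs[col]
    E = [Fraction(0)] * m
    for r in range(m - 1, -1, -1):
        s = rhs[r] - sum(M[r][c] * E[c] for c in range(r + 1, m))
        E[r] = s / M[r][r]
    # The first room is chosen uniformly from all N rooms.
    total = Fraction(3)
    for idx, k in enumerate(range(2, N + 1)):
        total += (k + 2) + E[idx]
    return total / N

for N in (1, 2, 3):
    print(N, expected_search_time(N))   # -> 3, 5, and 9 minutes
```

This agrees with the closed form Time = 3 + (N+6)(N-1)(N-1)/(2N) for every N I tried.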

This is a slight modification of the Mouse and cheese problem posed earlier by psychic_mind. I'd like to see a generalization of his problem to the case of an arbitrary number of rooms.
Assume that there are N rooms lined up in a hallway, each with a room number ranging from 1 to N. The first room, room 1, has the cheese. If the mouse goes into room 1, it searches for 3 minutes before finding the cheese. If it goes into room 2, it searches for 4 minutes and then leaves. In general, if the mouse goes into room n, it searches for (n+2) minutes.
The mouse starts by randomly choosing a room to search. Assume that any time the mouse leaves a room, it randomly selects another room from the remaining N - 1, i.e., all rooms except the one it just searched. Or in psychic_mind's words, "THE MOUSE NEVER REENTERS THE ROOM IT JUST LEFT".
What is the average amount of time it takes to find the cheese, given the number of rooms N?

Maybe I'm wrong but this is making no sense to me.
He either loses $20 or wins $1.
I don't see how the odds match up here. He wins 48 out of 100 hands regardless... Assuming the probability doesn't change. Over the course of an extended time he would lose.
edit: this is also removing the number of cards as a variable. assuming each hand of blackjack is a unique event.
It is true that he either loses $20 or wins $1 every day. Let p be the probability that he wins 21 dollars in a single day. The chance that he'll lose all 20 dollars is (1 - p), since we are working with the assumption that he continues to play until one of those events occurs.
Expected winnings per day = -$20 * (1 - p) + $1 * p
The key is finding the probability that he'll win 21 dollars for any given day, p. If p is close enough to 1, then the expected earnings might be positive, giving this scheme an advantage over the house.
I think the probability will be the same regardless of the bet, but the question posed is if he bets $1 every time. Let's assume that the bet amount is always $1. Winning always gives him a net gain of $1, and losing a net loss of $1.


Do you think you can build a strategy to win money at a casino? I don't think so...
Let's first suppose you don't make any successive loss/win pairs (in this order) that cancel each other in terms of gain. The only possibility to finish with 21 dollars is to lose n times and win n+1 times, with n greater than or equal to 0, and less than or equal to 19. The probability of such an event happening is 0.52^n * 0.48^(n+1). So the probability of leaving the casino with 21 dollars under this assumption is obtained by taking the sum over n: A = 0.48*(1+p+p^2+p^3+...+p^19) where p = 0.48*0.52. I find A = 0.64 approximately.
However, you can also prepend any number of [one loss / one win] pairs and it doesn't change the outcome. So you have to multiply A by 1+p+p^2+p^3+...+p^infinity, which is around 1.33. So the overall probability of winning one dollar is W = 0.85.
The probability of losing the 20 dollars is of course 1 - W. So each day, the expected loss is (1-W)*$20 - W*$1 = $2.10. After 100 days, you should have lost $210 on average. Also a lot of time...
Sorry, my explanations are not very clear. Maybe the reasoning is not very clear in my head either...
0.52^n * 0.48^(n+1) is actually the probability of 1 particular configuration of n losses and (n+1) wins, where the order is important. For instance, the chance of 1 win followed by 1 loss and then 1 win is .48 * .52 * .48 = .48^2 * .52. The chance of 2 wins and 1 loss, where order is not important, is equal to C( 2, 1) * .48^2 * .52, where C(2,1) means 2 `choose' 1. We need a combinatoric term in the bold part of this argument.
It is also possible for n in the bold part to exceed 19. For instance, we can win 100 games and lose 99 games, and still end up with 21 dollars.
The answer to question 1 is correct. The answer to part II is not, however. The true probability is slightly higher.

This problem is inspired by a recent trip to Las Vegas.
Mr. G is an avid gambler. After studying blackjack for a while, he realizes that the casino has a minor edge in winning, even if he plays with an optimal strategy. Consequently, he devises the following scheme.
For simplicity, assume that each blackjack hand costs 1 dollar to play, with no double-down or splitting allowed. Winning always pays double the bet. His chance of winning each game is .48, and his chance of losing is .52.
Every day, he would take a bankroll of 20 dollars down to the casino. He would play as long as it takes until he either loses his entire bankroll or reaches a total bankroll of 21 dollars. As soon as he reaches either state, he would pack up his belongings and retire to his room. So, for instance, suppose one day he loses 3 hands in a row and then wins 4 hands afterward; he would then have 21 dollars total, at which point he would retire for the day.
1) Does this strategy allow him to beat the house advantage, i.e. get a positive expected winning per day?
3) Suppose that he uses this strategy for 100 days straight. What are his expected winnings, or losses, at the end of the 100-day period? Assume that he has a big enough bank account in the beginning so that he can always start with 20 dollars each day, even in the worst-case scenario.
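A quick Monte Carlo sketch of the scheme (the .48 win probability and the $20/$21 bankroll rules come from the problem; the function name and simulation details are mine):

```python
import random

def play_day(bankroll=20, target=21, p_win=0.48, rng=random):
    """Bet $1 per hand until the bankroll hits 0 or reaches the target."""
    while 0 < bankroll < target:
        bankroll += 1 if rng.random() < p_win else -1
    return bankroll == target

random.seed(2024)
days = 200_000
p = sum(play_day() for _ in range(days)) / days   # estimate of the daily win probability
daily = 1 * p - 20 * (1 - p)                      # expected net dollars per day
print(p > 0.9, daily < 0)                         # -> True True
```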

This gives the same result as unreality's, for those who like to see equations and stuff.
Let T(n) represent the probability that the nth person's seat is already taken.
T(1) = 0
( Since Bob, n=1, has access to an empty plane. His assigned seat is empty. )
T(2) = 1/100
( The second person has a 1/100 chance that Bob took her seat )
T(3) = 1/100 + T(2)*1/99
( The chance that the 3rd person's seat is taken is equal to the chance that Bob took it or that the second lady took it )
T(4) = 1/100 + T(2)*1/99 + T(3)*1/98
.
.
.
In general
T(n) = T(n-1) + T(n-1)/(102-n)
T(n) = T(n-1) * (103-n)/(102-n)
The solution amounts to finding T(100). This can be found as a solution of the above recursive function T(n) where T(2) = 1/100. The answer is .5, agreeing with what unreality and Big Red found earlier.
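A sketch of that recursion in code (the function name is mine; the 102 and 103 constants are written out for a general passenger count):

```python
def prob_seat_taken(n, passengers=100):
    """T(n): probability that passenger n's assigned seat is already taken,
    via T(2) = 1/100 and T(n) = T(n-1) * (103-n)/(102-n)."""
    if n <= 1:
        return 0.0
    t = 1.0 / passengers                              # T(2)
    for k in range(3, n + 1):
        t *= (passengers + 3 - k) / (passengers + 2 - k)
    return t

print(round(prob_seat_taken(100), 6))   # -> 0.5
```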

I'll tackle the total number of blocks first
First of all, this is a modified version of the sum of square problems. The sum of squared natural numbers up to n, S_{n}, is
S_{n} = 1^{2} + 2^{2} + ... + n^{2} = n^{3}/3 + n^{2}/2 + n/6
For n = even, the total number of blocks S_{e,n} is
S_{e,n} = 2^{2} + 4^{2} + ... + n^{2}
= 4( 1^{2} + 2^{2} + ... + (n/2) ^{2} )
= 4( S_{n/2} )
= 4( n^{3}/24 + n^{2}/8 + n/12 )
For n = odd, the total number of blocks, S_{o,n}, is quite easy if we use the above two equations
S_{o,n} = S_{n} - S_{e,n-1}
S_{o,n} = n^{3}/3 + n^{2}/2 + n/6 - 4( (n-1)^{3}/24 + (n-1)^{2}/8 + (n-1)/12 )
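These closed forms are easy to sanity-check against brute-force sums (a quick sketch; the function names are mine):

```python
def S(n):
    """Closed form for 1^2 + 2^2 + ... + n^2."""
    return n**3 / 3 + n**2 / 2 + n / 6

def S_even(n):
    """Closed form for 2^2 + 4^2 + ... + n^2, n even."""
    return 4 * (n**3 / 24 + n**2 / 8 + n / 12)

def S_odd(n):
    """Closed form for 1^2 + 3^2 + ... + n^2, n odd."""
    return S(n) - S_even(n - 1)

assert abs(S(10) - sum(k * k for k in range(1, 11))) < 1e-9          # 385
assert abs(S_even(10) - sum(k * k for k in range(2, 11, 2))) < 1e-9  # 220
assert abs(S_odd(9) - sum(k * k for k in range(1, 10, 2))) < 1e-9    # 165
```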
Now let's tackle the pyramids,
Let's assume n = even, just so that n/2 is a nice number too.
f5 = 0
f4 = 4
f3 = (n-2)*4 + 4( n/2 - 1 )
f2 = 4( (n-4)/2 (n/2 - 1) )
f1 = (n-2)^{2}
f0 = S_{e,n} - ( f1 + f2 + f3 + f4 + f5 )
The case for odd n should be easy to derive as a slight modification of the above. There should be a single set of equations that can accommodate any n, odd or even, but I'm not feeling ambitious right now.

1) You won't return to the point of origin, because the earth is a sphere, or close enough to it. When you travel along the second leg of the trip (400 miles due east), you are on a smaller circle of constant latitude, so those 400 miles sweep out a larger change in longitude than they would at the starting latitude. The problem is more obvious if we exaggerate the number of miles traveled: suppose we travel 2000 miles on each leg, and it becomes clear why we won't return to the point of origin.
2) The radius of the earth at the equator (latitude = 0) is r = 3963.2 miles, with circumference c = 24901.6 miles.
After going 400 miles south and then traveling due east, we'll be traveling along a smaller circle at a constant latitude theta = 5.78 S. This is because
length of first leg = r * theta * pi / 180
400 = r * theta * pi / 180
5.78 = theta
The radius h of a circle at constant latitude 5.78 S is
h = r sin( 90 - theta )
h = 3943.036
The distance x needed to return to the origin on the last leg can be found using the proportional relationship
x/r = 400/h
x = 402.046
So you need to travel about 2.046 miles extra on the last leg to get home.
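The numbers above can be verified with a short script (the radius value is from this post; the variable names are mine):

```python
import math

R = 3963.2                                 # equatorial radius of the earth, miles
leg = 400.0                                # miles traveled on each leg

theta = math.degrees(leg / R)              # latitude reached after going 400 miles south
h = R * math.cos(math.radians(theta))      # radius of the circle at that latitude
x = leg * R / h                            # equator distance matching the eastward leg

print(round(theta, 2))                     # -> 5.78
print(round(x, 3))                         # -> 402.046
```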

The man happened to catch a terrorist who was about to detonate a bomb in the museum, which would have destroyed the entire museum collection along with the building. The man bravely wrestled with the terrorist, destroying some paintings in the struggle. Fortunately, he overpowered the terrorist in the end and the museum was saved.
The curator thanked the man because, at the time of the struggle, he was crouching under a desk in his nearby office and praying for his life. If it weren't for the man, he would have gone up in smoke along with his museum.

The year is 1,000,000 AD. The corpse is the fossilized body of a man who died one million years ago and was preserved in the sediments. His swimming trunks were made of non-biodegradable plastics. The area used to be a large body of water, but the intervening years have dried up the water and converted it into a forest.

Since the shortest line connecting the center of the sphere to the cone has to meet the cone perpendicularly, the angle between that shortest connecting line and the vertical line through the center is a constant with value (90 - theta/2). So
r = h cos( 90 - theta/2 )
The volume of any sphere satisfying the requirement of `resting on the wine cup' is
V = (4/3) pi ( h cos( 90 - theta/2 ) )^3
Here is where it gets messy. There are three different situations. Let H be the height of the cone and h be the height of the center of the sphere:
h > H ( less than half the sphere is inside the cone )
h < H - r ( the entire sphere lies inside the cone )
H - r < h < H ( between 1/2 and the entire sphere lies inside the cone )
We can construct equations to determine the maximum volume of displaced water for each scenario. My intuition tells me that it's scenario 3. In that case, the displaced volume is equal to the volume of the sphere minus the volume of the spherical cap above the cone's rim (see the equation for the volume of a spherical cap here: http://mathworld.wolfram.com/SphericalCap.html )
Volume = (4/3) pi r^3 - (1/3) pi ( r + h - H )^2 ( 3r - ( r + h - H ) )
Substitute the expression for h:
Volume = (4/3) pi r^3 - (1/3) pi ( r + r/cos( 90 - theta/2 ) - H )^2 ( 3r - ( r + r/cos( 90 - theta/2 ) - H ) )
Taking the derivative with respect to r and setting it equal to 0, we'll get an expression for the optimizing r under scenario 3. It'll be a quadratic equation with two roots, one of which probably will make no sense. The computation is rather messy, so maybe I'll do it later or someone else can do it. I'll have to get back to work on the stuff I'm actually getting paid for.

I got it in 30. Divide into 3 groups of 9 and 1 group of 3. Weigh two of the 9's against each other like in my last one. If they are unequal, follow my last post. But if they are equal, weigh one of them against the last pile of 9. If they are again equal, take only the pile of 3 and know that it has either a lighter or heavier coin in it. Weigh two of them against each other; if they're equal then the leftover is the lighter/heavier one. If they are not equal, weigh the heavier one against the leftover one and you can figure out which coin is lighter/heavier. I don't think anything above 30 is possible though. Unless I'm wrong again.
We can improve on what Avalanche5x did a bit further. We can improve the total by 1, since we have some extra information about the normal coins.
Following Avalanche5x's reasoning, if after two weighings we determine that the 3 groups of 9 coins (27 total) are all the same, then we have reduced the problem to determining how many extra coins we can identify, given that we have 2 weighings left and that the other 27 are known to be normal. I think 4 is the maximum under this constraint. Divide the 4 into a group of 3 and a single coin. Weigh the 3 against 3 of the normal coins. If the scale balances, then the single coin is the defective one, and an extra weighing will determine whether it is heavier or lighter.
If the scale is not balanced, then the group of 3 contains the defective coin. Remove 1 coin from each side of the scale (call the coin removed from the suspect side R), and swap 1 suspect coin (call this coin S) with a coin from the normal side. One coin from the suspect side will still be in its original position (call this O). We then weigh the two groups of 2 against each other. Three scenarios can happen:
1) The scale balances. The defective coin must be R. We can determine its relative weight from the third weighing.
2) The scale tips to the same side as the third weighing. O is the defective coin. Relative weight determined from third weighing.
3) The scale tips to the opposite direction as third weighing. S is the defective coin. Relative weight determined from third weighing.

How many people are not alive right now? There are 3 billion base pairs of DNA, therefore the mathematical potential for human life works out to be astronomically diverse. The crazy thing is, however, that even if two people are born with the same DNA (i.e. twins), they are still different people.
In the course of human history, more people have died than could ever live on Earth at the same time, a significantly larger number than 67 billion.
Again, the question is about potential. The whole idea of infinity is a number that is always growing, but that is what asymptotes are for in math, there is no such thing as infinity people, just a growth trend that points toward it.
You are quite correct that the idea of infinity is a number that grows without any upper bound. However, the number of people on earth, even counting the deceased, is microscopically small in view of the grander scheme of things. The question here is potential. There is an upper bound to how many humans can ever exist in this universe. The number of total atoms in the universe, for instance, has an upper bound of somewhere around 10^100, or a googol. I think it is fairly safe to say that the cumulative total human population won't ever exceed the total number of atoms in the universe. Even then, a googol is trivially small compared to what 'infinity' could be, since we could easily construct larger numbers such as ( 10^100 * 10^100 ), or even (10^100^100), and so on.
I prefer to solve this riddle from a mathematical perspective. All you need to do is "prove" that 1 = 2; such proofs are not hard to find. Once you have that, proving 1 + 1 = infinity is easy. Granted, it is based on the incorrect proof of 1 = 2, but I feel it is more satisfying.

i feel, there is a grandmother, mother and daughter
grandmother is mother of mother - 1 Mother
mother is mother of daughter - 2 Mothers
daughter is daughter of mother - 2 Mothers, 1 Daughter
mother is daughter of grandmother - 2 Mothers, 2 Daughters..
am i right?
You know, technically, the grandmother is also a 'daughter' of somebody. So to be correct, the description for this group would have been 2 mothers and 3 daughters. Assuming that every person is either a son or a daughter, there is no way for a group of 3 unique people to consist of exactly 2 daughters and 2 mothers.
I misread the OP on the Prisoner on a death row problem ( http://brainden.com/forum/index.php?showtopic=8355 ), and thus solved a completely different problem. This is that problem. The initial conditions are the same as kidsrange's.
There are 100 prisoners who are sentenced to die tomorrow. However, the warden has decided to give them a chance to live. He will put each of their names on a slip of paper inside an opaque jar, numbered from 1 to 100. Each prisoner will be able to open 50 jars in order to try to find his name. The prisoners will each do this individually, and in sequential order.
1) Each prisoner does not have to open the 50 jars sequentially. Each prisoner can remove the 50 slips from the jars and place them back in any order he desires. However, at the end, each jar must contain exactly 1 slip, and no modification can be made to the jars' appearance, placement, orientation, etc.
2) After opening and closing the 50 jars, each prisoner selects the jar that he thinks contains his name. That jar is not removed from the game. After this, the prisoner is moved to a new room and has no communication with the ones who have yet to make their selection.
At the end of the selection process, prisoners who correctly identify their jar will live, and those who don't will die. Devise a strategy to save, on average, the maximum number of prisoners.