BrainDen.com - Brain Teasers

A Necktie Paradox


BMAD

Question

Two men are each given a necktie by their respective wives as a Christmas present. Over drinks they start arguing over who has the more expensive necktie, and agree to have a wager over it. They will consult their wives and find out which necktie is the more expensive. The terms of the bet are that the man with the more expensive necktie has to give it to the other as the prize.

The first man reasons as follows: the probability of me winning or losing is 50:50. If I lose, then I lose the value of my necktie. If I win, then I win more than the value of my necktie. In other words, I can bet x and have a 50% chance of winning more than x. Therefore it is definitely in my interest to make the wager. The second man can consider the wager in exactly the same way; therefore, paradoxically, it seems both men have the advantage in the bet.

Is there a problem here?


9 answers to this question

Recommended Posts


The first and biggest problem is that neither man realizes that his wife will kill him if he gives away his Christmas present.
Also...

The men are only looking at the payout ratio of the winner. If we change the method of winning the other necktie to flipping a coin, it becomes clear that one man is betting more than he stands to win, and the other is betting less. In other words, one man has a higher payout ratio than the other.



Choosing the winner by cheapest tie merely forces the lower payout ratio onto the loser and the higher one onto the winner. In essence, the trick is to confuse the method of selecting a winner with the payout ratio of your bet and your probability of winning.
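To make the unequal payout ratios concrete, here is a small sketch of the coin-flip variant described above (the tie values 20 and 50 are my own assumption, chosen only for illustration):

```python
import random

# Coin-flip variant: the winner of a fair coin toss takes the other man's tie.
a, b = 20.0, 50.0           # hypothetical tie values: man A's tie is cheaper
rng = random.Random(0)
trials = 100_000
net_a = 0.0
for _ in range(trials):
    if rng.random() < 0.5:
        net_a += b          # A wins B's tie (worth 50)
    else:
        net_a -= a          # A loses his own tie (worth 20)
print(net_a / trials)       # close to (b - a) / 2 = +15: A risks 20 to win 50
```

Man B faces the mirror image (risking 50 to win 20), so under a coin flip the owner of the cheaper tie has the better ratio. The actual bet simply guarantees that the man with the better ratio is the one who wins.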



This is like the two envelopes problem, where the contents are sums of money differing by a factor of 2: you choose one and are then given the opportunity to switch. I have also heard the wager posed in terms of the money in two men's wallets: they compare, and the one with more money gives it to the other. In all three, it seems advantageous to switch envelopes or take the bet.

I'm not smart enough to prove the best course of action; I stop at the first evident fact -- the symmetry in each puzzle -- and conclude that neither action can be advantageous, so why bother?



While thinking about this, I came across another paradox that seems relevant. Suppose there are two unknown natural numbers, A and B. Clearly, each should be equally likely to be the larger. But suppose you learn the value of A: say it is 200. The probability that B is smaller than 200 is infinitesimal, so B is now almost certainly larger than A. How does that make sense?


The first man reasons as follows: the probability of me winning or losing is 50:50. If I lose, then I lose the value of my necktie. If I win, then I win more than the value of my necktie. In other words, I can bet x and have a 50% chance of winning more than x. Therefore it is definitely in my interest to make the wager.

If you are offered a wager where

(1) the probability of winning = the probability of losing = 1/2, and

(2) when you win, you win more than you lose when you lose,

then indeed, in terms of EV, it is profitable to accept.

However, in the described wager only condition (1) is met; condition (2) is not. The amount you win or lose is in either case equal to the value of the more expensive tie, so the claim that the potential win is bigger than the potential loss is groundless. Your potential win is indeed bigger than the value of your own tie, but that comparison is beside the point: what you should compare with the potential win is the potential loss, and those two values are equal. Therefore there is no positive expectation in the wager.
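The EV argument above can be checked numerically. A minimal sketch, assuming (purely for illustration) tie values drawn uniformly from 10 to 100; any shared distribution behaves the same:

```python
import random

# Sanity check of the EV argument: under the bet, the amount transferred
# is always the value of the more expensive tie, and the net EV is zero.
rng = random.Random(0)
trials = 200_000
net = 0.0
wins = 0
win_total = loss_total = 0.0
for _ in range(trials):
    mine, his = rng.uniform(10, 100), rng.uniform(10, 100)
    if mine < his:            # I win his tie, the more expensive one
        wins += 1
        win_total += his
        net += his
    else:                     # I lose my tie, the more expensive one
        loss_total += mine
        net -= mine
print(wins / trials)                 # ~0.5: condition (1) holds
print(win_total / wins)              # average win  ~ E[max of the two]
print(loss_total / (trials - wins))  # average loss ~ the same value
print(net / trials)                  # ~0: no positive expectation
```

The average win and average loss come out equal (both are the expected value of the pricier tie), which is exactly why condition (2) fails.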

Edited by witzar


While thinking about this, I came across another paradox that seems relevant. Suppose there are two unknown natural numbers, A and B. Clearly, each should be equally likely to be the larger. But suppose you learn the value of A: say it is 200. The probability that B is smaller than 200 is infinitesimal, so B is now almost certainly larger than A. How does that make sense?

The expected value of an unspecified (positive) integer is infinity. Any finite integer is infinitesimally small in comparison. If you learn A's value to be a googolplex raised to the power of itself a googolplex number of times, the probability that an unspecified integer B (uniformly chosen from all possible positive integer values, if that is even possible) is less than A is nonetheless zero.
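The point holds however large a finite range we allow. A quick check, assuming (for the sake of having a well-defined distribution) that B is uniform on {1, ..., N}:

```python
# P(B < A) for A = 200 and B uniform on {1, ..., N} is exactly (A - 1) / N,
# which shrinks toward zero as the range N grows without bound.
A = 200
for N in (1_000, 1_000_000, 10**12):
    p = (A - 1) / N
    print(N, p)
```

No matter how large a fixed A you pick, letting N grow drives P(B < A) to zero, which is the sense in which B is "almost definitely larger"; the catch is that no uniform distribution over all positive integers actually exists.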


The most intuitive way (well, the only intuitive way) I can think of to explain why the logic "I have a 50/50 chance of winning/losing" plus "if I win, I gain more than if I lose" is faulty is this: particularly in the real world, the wives would not spend an amount on a necktie that could lie anywhere within the range of positive real numbers. There has to be some probability distribution of how much money they could spend on a tie. As soon as you introduce a distribution from which the two ties are drawn, it becomes clear that the more valuable your tie is, the more likely you are to lose the bet.
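A short simulation of that last claim (the uniform price range is my own assumption; the effect appears for any distribution): conditioning on your tie being cheap or expensive shifts your chance of losing.

```python
import random

# Once tie prices come from a distribution, an expensive tie is more
# likely to lose the bet than a cheap one.
rng = random.Random(0)
trials = 200_000
cheap_losses = cheap_n = 0
dear_losses = dear_n = 0
for _ in range(trials):
    mine, his = rng.uniform(10, 100), rng.uniform(10, 100)
    lost = mine > his            # the pricier tie is surrendered
    if mine < 30:                # my tie came out cheap
        cheap_n += 1
        cheap_losses += lost
    elif mine > 80:              # my tie came out expensive
        dear_n += 1
        dear_losses += lost
print(cheap_losses / cheap_n)    # well under 0.5
print(dear_losses / dear_n)      # well over 0.5
```

The unconditional chance of losing is still 1/2, but the "50/50" figure silently averages over cases in which the stakes differ, which is where the paradoxical reasoning breaks.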



The most intuitive way (well, the only intuitive way) I can think of to explain why the logic "I have a 50/50 chance of winning/losing" plus "if I win, I gain more than if I lose" is faulty is this: particularly in the real world, the wives would not spend an amount on a necktie that could lie anywhere within the range of positive real numbers. There has to be some probability distribution of how much money they could spend on a tie. As soon as you introduce a distribution from which the two ties are drawn, it becomes clear that the more valuable your tie is, the more likely you are to lose the bet.

But then, intuitively, wouldn't the same logic apply in reverse? A husband who knows his wife's frugal nature believes with near certainty that his necktie was bought at a heavy discount, and thus that he has a great chance of winning.



Yes, you're more likely to win if your tie is cheap and more likely to lose if your tie is expensive.

That's why, even if you go into the problem thinking you have a 50/50 chance of winning (absent any prior knowledge of the ties' values), you in fact have a greater chance of losing if your tie is expensive than if it is cheap. So the outcomes are not really equal: they depend on the value of your tie.
