BrainDen.com - Brain Teasers

How many tickets?


bonanova

Question

Suppose n tickets numbered 1-n are in a box, and you draw one of them.

You don't know how many tickets are in the box, but you are asked to estimate how many there are.

Your ticket has the number p on it.

What estimate of n has the highest likelihood of being correct?


13 answers to this question

Recommended Posts


I picked p in my draw. Let's estimate n = p. What would have been the likelihood that I got the max number? 1/p.

If instead n = p+1, what would have been the likelihood that I picked p in my draw? 1/(p+1).

1/p > 1/(p+1).

So given that I picked p, the estimate of n that gives the best probability of me picking p is p itself.

So the best estimate of n is p.
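This argument can be checked numerically: the likelihood of drawing the specific ticket p from a box of n tickets is 1/n for n >= p (and zero for n < p, since ticket p wouldn't exist), so it is maximized at n = p. A minimal Python sketch, where p = 7 is an arbitrary example:

```python
# Likelihood of drawing ticket p from a box of n tickets numbered 1..n.
# This is 1/n when n >= p, and 0 when n < p (ticket p wouldn't exist).
def likelihood(n, p):
    return 1.0 / n if n >= p else 0.0

p = 7  # arbitrary example draw
candidates = range(1, 21)
best = max(candidates, key=lambda n: likelihood(n, p))
print(best)  # n = p maximizes the likelihood
```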






Well, we know there are at least p tickets in the box, and since there are at least p-1 other numbers we didn't pick, the chance that we happened to draw p is at most 1 in p.

I really can't picture the math, but I think the probability that we picked the highest number straight away is lower... I'm picturing it like a bell curve, which I know it can't be. But I'm guessing the answer is somewhere above p, if not double it.




It must be p, because then the probability of picking p is 1/p; for n > p, the probability decreases.




It seems like this problem is dependent upon what is a reasonable a priori distribution for N.

Let P(N) be the prior distribution for the number N, and let P(p|N) be the conditional probability of p given N; we know that P(p|N) = 1/N. We wish to compute the conditional probability P(N|p), which can be expanded using Bayes' theorem:

P(N|p) = P(p|N) P(N) / Σ_{M≥p} P(p|M) P(M) = (1/N) P(N) / Σ_{M≥p} (1/M) P(M)

So, if we are given a well-defined probability function P(N), we can easily compute the number N that gives the maximum probability P(N|p).

The problem is that we are not given what P(N) is. One reasonable choice would be to assume that P(N) is the improper uniform distribution from p to infinity (every number has an equal chance of occurring). However, if we plug that improper distribution into the equation above, we would be able to cancel out P(N), but we'd end up with a harmonic series in the denominator, which does not converge.
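The divergence is easy to see numerically: under the improper uniform prior, the normalizing denominator is the tail of the harmonic series, Σ_{N≥p} 1/N, whose partial sums grow without bound. A small Python sketch, with p = 5 as an arbitrary example:

```python
# Partial sums of the tail of the harmonic series, sum_{N=p}^{p+terms-1} 1/N.
# Under an improper uniform prior this would be the normalizing constant,
# but the partial sums grow without bound (roughly like log(terms)),
# so no proper posterior exists.
p = 5  # arbitrary example draw

def tail_sum(p, terms):
    return sum(1.0 / n for n in range(p, p + terms))

print(tail_sum(p, 10**3), tail_sum(p, 10**6))  # keeps growing with more terms
```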



Oh wait, the answer is much simpler than that. Turns out this is the easiest bonanova puzzle I have ever seen =)

Since "n tickets numbered 1-n are in a box", if we get number p = (1-n), then it is easy to see that n = 1 - p.



LOL

That may be the first and only time I use LOL in this forum.

I'm not that clever. Really.

But I love it. It's a better puzzle than the one I intended.

Honorable mention.




You are both correct.

nakulendu posted first, so I marked his answer as the solution.



The solutions above are making some assumptions about the properties of the underlying distribution of N, which may or may not be warranted depending on your interpretation of the OP.

Implicit in the explanations above is the assumption that P(N) = P(N+1). The explanation goes on to say that since P(p|N) decreases as N increases (without bound, presumably), p is the optimal choice.

From an earlier post, the full Bayesian expression for P(N|p) is

P(N|p) = P(p|N) P(N) / Σ_{M≥p} P(p|M) P(M) = (1/N) P(N) / Σ_{M≥p} (1/M) P(M)

Note that if we assume that N is uniformly distributed between p and infinity (which is an improper prior distribution), then the denominator of the term above becomes the harmonic series, which does not converge. If we assume that N is uniformly distributed between p and some large finite M, then the solution is warranted.

So, choosing N=p is the correct solution IF P(N) is uniformly distributed AND P(N) -> 0 as N approaches infinity.
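The finite-M case can be sketched in Python. Assuming a uniform prior on {p, ..., M}, the posterior P(N|p) = (1/N) / Σ_{m=p..M} (1/m) is a proper distribution and its mode is N = p; the values p = 7 and M = 1000 below are arbitrary examples:

```python
# Posterior P(N|p) under a uniform prior on {p, ..., M} for finite M:
#   P(N|p) = (1/N) / sum_{m=p}^{M} (1/m)
# The uniform prior cancels out of numerator and denominator.
p, M = 7, 1000  # arbitrary example values
denom = sum(1.0 / m for m in range(p, M + 1))
posterior = {n: (1.0 / n) / denom for n in range(p, M + 1)}

n_star = max(posterior, key=posterior.get)
print(n_star)                    # the mode of the posterior is N = p
print(sum(posterior.values()))   # sums to 1: a proper distribution
```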



OMG! You are one hell of a mathematician... lol ;)



Bushindo, Thanks.

I learn so much from you.

Is there also a maximum likelihood estimate of the underlying assumptions that would make the maximum likelihood estimate of n well defined?

Thanks for the kind words, bonanova, I also learned much from you over the years. The two wonderful examples (among many) that come quickest to mind are Hole in a Sphere and Maiden Escape, both of which taught me completely new ways of thinking.

The comment I made earlier is that the solution to this puzzle will depend on what kind of prior information (or assumption) we have about the probability distribution P(N), since the Bayesian derivation forces us to explicitly use P(N) in the calculation of the answer.

My interpretation of what you are asking is: is there a maximum likelihood estimate for the distribution P(N) such that P(N|p) is well-defined? (My impression is that this is a way to bypass the requirement to know exact values for P(N).) It is possible to assume that P(N) belongs to some well-defined probability class (exponential, Gaussian, negative exponential, etc.) and then use maximum likelihood to find the best-fit curve within the assumed class. Of course, there is no free lunch. We have just moved the prior assumption back a step, so that we now make assumptions about the class of probability curves P(N) belongs to.

You may instead be asking whether there is a maximum likelihood estimate of which class of probability function P(N) most likely belongs to, and that is a much tougher problem. If we have several candidate probability classes (again, exponential, binomial, Gaussian, etc.), then we can easily compare them and find the most likely class for P(N). The problem is that there are an infinite number of possible probability classes (and most don't have nice tractable parametric forms), and finding the best class over this domain is not possible. So, if we restrict the possible probability classes to a finite number (note that we are making some assumptions here again), we can then solve the problem.
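As an illustration of comparing a finite set of candidate classes, here is a hedged Python sketch. The candidate priors below (truncated uniforms and a geometric, with arbitrary parameters) are illustrative assumptions, not from the thread; each class is scored by the marginal likelihood of the observed ticket, P(p) = Σ_N P(p|N) P(N), with P(p|N) = 1/N:

```python
# Score candidate prior classes for P(N) by the marginal likelihood of the
# observed ticket p: P(p) = sum_{N>=p} P(p|N) P(N), where P(p|N) = 1/N.
# All priors and parameters here are illustrative assumptions.
def marginal_likelihood(p, prior, n_max=10**4):
    # Truncate the sum at n_max; the priors used here are negligible beyond it.
    return sum((1.0 / n) * prior(n) for n in range(p, n_max + 1))

def uniform_prior(M):
    return lambda n: 1.0 / M if n <= M else 0.0

def geometric_prior(q):
    return lambda n: (1.0 - q) * q ** (n - 1)

p = 7  # arbitrary example draw
candidates = {
    "uniform(M=100)": uniform_prior(100),
    "uniform(M=1000)": uniform_prior(1000),
    "geometric(q=0.9)": geometric_prior(0.9),
}
scores = {name: marginal_likelihood(p, prior) for name, prior in candidates.items()}
best = max(scores, key=scores.get)
print(best)  # the class with the highest marginal likelihood for this p
```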

