Sample Question #265 (probability brainteaser)

Consider a coin with probability *p* of landing heads, where *p* >= 0.5. You toss the coin and get tails on the first toss.

1) What is the expected number of tosses needed to get an equal number of heads and tails for the first time?

2) Will your answer be the same if the first toss gave you heads?

3) What is your answer if *p* < 0.5?

(Comment: this is a classic problem and is quite difficult.)


HINT

Consider the cases where p=0.5 and p<0.5 first.

I think it is related to the first passage time to level 0 of a random walk. But I still need to work it out.

1) 1/(2p-1)
2) No; the expectation is infinite.
3) Flip the answers of 1) and 2).

ANSWER

1) Denote getting heads as +1 and getting tails as -1. The expected value of each toss is p(1) + (1-p)(-1) = 2p - 1; note that since p >= 0.5, this value is >= 0. After the first toss gave tails, the heads and tails counts equalize for the first time exactly when the subsequent tosses accumulate a net value of +1, and for p >= 0.5 this happens with probability 1. So the expected number of further tosses, n, satisfies (2p-1)n = 1, i.e. n = 1/(2p-1).
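As a quick sanity check (a Monte Carlo sketch of my own, not part of the original solution): simulate the tosses after an initial tail and count how many are needed until heads and tails first equalize; the average should be close to 1/(2p-1).

```python
import random

def tosses_to_equalize(p, rng):
    """After an initial tail, count further tosses until heads and tails
    (including that first tail) are equal for the first time.
    Equivalently: steps for a +1/-1 walk started at -1 to first hit 0."""
    pos = -1  # the first toss was tails
    n = 0
    while pos != 0:
        pos += 1 if rng.random() < p else -1
        n += 1
    return n

rng = random.Random(42)
p = 0.7
trials = 100_000
avg = sum(tosses_to_equalize(p, rng) for _ in range(trials)) / trials
print(avg, 1 / (2 * p - 1))  # both near 2.5 for p = 0.7
```

With p = 0.7 the formula gives 1/(2*0.7 - 1) = 2.5, and the simulated average lands close to it.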

2) If the first toss gave us heads, then the walk drifts upward (p >= 0.5), so for p > 0.5 the probability of ever reaching an equal number of heads and tails is (1-p)/p < 1, and even at p = 0.5, where equalization is certain, the expected waiting time diverges. Either way the expected number of tosses is infinite. Steve Z’s reply follows.
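To see numerically why the heads-first case blows up (a truncated simulation sketch of my own): for p > 0.5 the standard gambler's-ruin formula says a walk started at +1 ever returns to 0 with probability (1-p)/p < 1, and since the return is not certain, the expected waiting time is infinite.

```python
import random

def ever_equalizes(p, rng, max_steps=500):
    """Walk starts at +1 (the first toss was heads); return True if it
    hits 0 (equal heads and tails) within max_steps.
    For p > 0.5 the true probability of ever returning is (1 - p) / p."""
    pos = 1
    for _ in range(max_steps):
        pos += 1 if rng.random() < p else -1
        if pos == 0:
            return True
    return False

rng = random.Random(0)
p = 0.7
trials = 20_000
frac = sum(ever_equalizes(p, rng) for _ in range(trials)) / trials
print(frac, (1 - p) / p)  # both near 0.4286 for p = 0.7
```

The truncation at 500 steps barely matters here: a walk with drift 2p - 1 = 0.4 that has not returned quickly is far above 0 and essentially never comes back.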

3) See Steve Z’s reply.

Hi Brett, sorry I don’t know how to start a new thread. I have a question and need your help. Consider S_t a geometric Brownian motion, dS_t = mu S_t dt + sigma S_t dB_t, where mu and sigma are constant. Given S_0, K and T, if we increase sigma, does the probability P(S_T >= K) increase? One answer I got from a professor is yes, but my calculation doesn’t confirm it. Since P(S_T >= K) = N(d_2), I computed the partial derivative with respect to sigma; it is not always positive, and its sign depends on S_0, K and T. Would you please check it? Thanks.
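The commenter's doubt can be checked directly (a sketch with illustrative parameters of my own choosing, taking mu = 0 and T = 1): under GBM, ln S_T is normal with mean ln S_0 + (mu - sigma^2/2)T and variance sigma^2 T, so P(S_T >= K) = Phi(d_2) with d_2 = (ln(S_0/K) + (mu - sigma^2/2)T)/(sigma sqrt(T)). The sign of the derivative in sigma depends on the moneyness:

```python
from math import log, sqrt, erf

def prob_above(S0, K, mu, sigma, T):
    """P(S_T >= K) for GBM dS = mu*S dt + sigma*S dB, i.e. Phi(d2)."""
    d2 = (log(S0 / K) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return 0.5 * (1.0 + erf(d2 / sqrt(2.0)))

# In the money (S0 > K): raising sigma LOWERS the probability.
print(prob_above(100, 80, 0.0, 0.1, 1.0))  # ~0.985
print(prob_above(100, 80, 0.0, 0.5, 1.0))  # ~0.578

# Out of the money (S0 < K): raising sigma RAISES it.
print(prob_above(80, 100, 0.0, 0.1, 1.0))  # ~0.011
print(prob_above(80, 100, 0.0, 0.5, 1.0))  # ~0.243
```

So the professor's blanket "yes" fails: when the walk is already likely to finish above K, extra volatility mostly adds ways to finish below it, and vice versa.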

I don’t follow the solution given, and am not even convinced that it is correct (for one thing, I think it gives the wrong answer when p = 0.5). At best, it seems you are computing the expectation conditioned on the first coin toss being tails (though the derivation still doesn’t seem rigorous to me even in this case). Care to give a slightly more formal derivation?

Jon, the solution is best understood if you’re familiar with random walk/Brownian motion formulas…

Good luck!

-brett

This problem serves as a good illustration of Wald’s Equation (or Theorem), which goes like this:

Suppose X[1], X[2], . . . are iid random variables with the same finite expectation E[X], and suppose N is a stopping time for the X[i], i = 1, 2, . . ., with E[N] < infinity. Then E[sum(X[i], i = 1 through N)] = E[N]*E[X].
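Before applying the theorem, here is a standalone numerical illustration of it (my own example, not from the thread): roll a fair die until the first six. The roll count N is a stopping time with E[N] = 6, each roll has E[X] = 3.5, so Wald predicts E[sum] = 6 * 3.5 = 21 even though the last roll is always a 6.

```python
import random

rng = random.Random(1)
trials = 100_000
total_sum = 0
total_n = 0

for _ in range(trials):
    # Roll a fair die until the first 6; the roll count N is a stopping time.
    n = 0
    s = 0
    while True:
        x = rng.randint(1, 6)
        n += 1
        s += x
        if x == 6:
            break
    total_sum += s
    total_n += n

print(total_sum / trials)  # E[sum] ~ 21
print(total_n / trials)    # E[N]   ~ 6
# Wald: E[sum] = E[N] * E[X] = 6 * 3.5 = 21
```

The naive intuition that forcing the final roll to be a 6 should inflate the sum is already priced in: the first N-1 rolls average 3 (a die conditioned on not being 6), and 5*3 + 6 = 21 agrees with Wald.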

Now to apply this theorem to the problem at hand. Let X[i] = +1 if the coin lands heads and X[i] = -1 if it lands tails. Let N denote the number of tosses needed (after the first toss of a tail) to get an equal number of heads and tails altogether for the first time. So when the N-th toss is performed, we get FOR THE FIRST TIME an equal number of heads and tails altogether.

Clearly sum(X[i], i = 1 through N) = 1, since as soon as the number of heads becomes one greater than the number of tails (after the first toss of tails), the tossing stops (hence N is a stopping time).

Now taking expectations on both sides of the last equation, along with an application of the theorem, gives E[N]*E[X] = 1. So E[N] = 1/E[X].

But what is E[X] equal to? As noted elsewhere, it is very easy to compute. Obviously, E[X] = 2p - 1. Thus E[N] = 1/(2p - 1).

For similar treatments/discussions, consult Stochastic Processes (2nd Ed.) by Sheldon M. Ross. Note that the above solution is a somewhat elaborated version of a solution presented earlier.