
A Selection of Problems from A.A. Markov’s Calculus of Probabilities: Problem 4 – Solution 1

Author(s): 
Alan Levine (Franklin and Marshall College)

 

Review statement of Problem 4.

 

Solution 1: First of all, we note that the game can be won by player $L$ in various numbers of rounds, not less than $l$ and not more than $l+m-1$.

Therefore, by the Theorem of Addition of Probabilities,[13] we can represent the desired probability $(L)$ in the form of a sum $$(L)_l + (L)_{l+1} + \cdots + (L)_{l+i} + \cdots + (L)_{l+m-1},$$ where $(L)_{l+i}$ denotes the total probability that the game is finished in $l+i$ rounds, won by player $L$.

And in order for the game to be won by player $L$ in $l+i$ rounds, that player must win the $(l+i)$th round and must win exactly $l-1$ of the previous $l+i-1$ rounds.

Hence, by the Theorem of Multiplication of Probabilities,[14] the value of $(L)_{l+i}$ must be equal to the product of the probability that player $L$ wins the $(l+i)$th round and the probability that player $L$ wins exactly $l-1$ out of the previous $l+i-1$ rounds.

The last probability, of course, coincides with the probability that in $l+i-1$ independent experiments, an event whose probability in each experiment is $p$ will appear exactly $l-1$ times.
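This binomial probability is easy to compute directly. The sketch below (in Python; the function name `prob_exact_wins` is my own, not Markov's) evaluates the probability of exactly $l-1$ successes in $l+i-1$ independent trials with per-trial success probability $p$:

```python
from math import comb

def prob_exact_wins(l, i, p):
    """Probability that player L wins exactly l-1 of the first
    l+i-1 rounds, when each round is won with probability p
    (a binomial probability)."""
    q = 1 - p
    return comb(l + i - 1, l - 1) * p**(l - 1) * q**i

# Example: l = 3, i = 0, p = 0.5 -> must win both of the first
# 2 rounds: C(2,2) * 0.5^2 = 0.25
print(prob_exact_wins(3, 0, 0.5))
```

Multiplying this by the probability $p$ of winning the $(l+i)$th round then gives $(L)_{l+i}$, exactly as the text describes.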

The probability that player $L$ wins the $(l+i)$th round is equal to $p$, as is the probability of winning any round.

Then[15] $$(L)_{l+i} = p \cdot \frac{1\cdot 2\cdots (l+i-1)}{1\cdot 2\cdots i \cdot 1\cdot 2\cdots (l-1)}\, p^{l-1} q^i = \frac{l(l+1)\cdots(l+i-1)}{1\cdot 2\cdots i}\, p^l q^i,$$ and finally, $$(L) = p^l\left\{1 + \frac{l}{1}q + \frac{l(l+1)}{1\cdot 2}q^2 + \cdots + \frac{l(l+1)\cdots(l+m-2)}{1\cdot 2\cdots(m-1)}q^{m-1}\right\}.$$

In a similar way, we find $$(M) = q^m\left\{1 + \frac{m}{1}p + \frac{m(m+1)}{1\cdot 2}p^2 + \cdots + \frac{m(m+1)\cdots(m+l-2)}{1\cdot 2\cdots(l-1)}p^{l-1}\right\}.$$

However, it is sufficient to calculate one of these quantities, since the sum $(L)+(M)$ must reduce to 1.
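Markov's closed forms are easy to check numerically. The sketch below (function names `prob_L_wins` and `prob_M_wins` are my own) uses the equivalent binomial-coefficient form $(L) = p^l \sum_{i=0}^{m-1} \binom{l+i-1}{i} q^i$, where $\binom{l+i-1}{i} = \frac{l(l+1)\cdots(l+i-1)}{1\cdot 2\cdots i}$, and confirms that $(L)+(M)=1$:

```python
from math import comb

def prob_L_wins(l, m, p):
    """Probability that L wins the match, i.e. reaches l round wins
    before M reaches m round wins; each round is won by L with
    probability p.  (L) = p^l * sum_{i=0}^{m-1} C(l+i-1, i) q^i."""
    q = 1 - p
    return p**l * sum(comb(l + i - 1, i) * q**i for i in range(m))

def prob_M_wins(l, m, p):
    """(M), obtained from (L) by swapping the players' roles."""
    return prob_L_wins(m, l, 1 - p)

# The two probabilities must sum to 1, since one player
# eventually wins the match.
l, m, p = 3, 2, 0.6
print(prob_L_wins(l, m, p) + prob_M_wins(l, m, p))
```

This mirrors Markov's observation: computing one of $(L)$, $(M)$ suffices, since the other is its complement.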

 

Continue to Markov's second solution of Problem 4.

Skip to Markov's numerical example for Problem 4.

Skip to statement of Problem 8.

 


[13] This “theorem,” presented in Chapter I, says (in modern notation): If $A$ and $B$ are disjoint, then $P(A \cup B) = P(A) + P(B)$. We would now consider this an axiom of probability theory. Since Markov considered only experiments with finite, equiprobable sample spaces, he could “prove” this by a simple counting argument.

[14] This “theorem,” also presented in Chapter I, says (in modern notation): $P(A \cap B) = P(A \mid B)\,P(B)$. Nowadays, we would consider this as the definition of conditional probability, a term Markov never used. Again, he “proved” it using a simple counting argument. He then defined the concept of independent events.

[15] The conclusion $(L)_{l+i} = \binom{l+i-1}{i}p^l q^i$ is a variation of the negative binomial distribution; namely, if $X$ represents the number of Bernoulli trials needed to attain $r$ successes, then $$P(X=n) = \binom{n-1}{r-1}p^r q^{n-r},\quad n\geq r,$$ where $p$ is the probability of success on each trial and $q=1-p$.

 

Alan Levine (Franklin and Marshall College), "A Selection of Problems from A.A. Markov’s Calculus of Probabilities: Problem 4 – Solution 1," Convergence (November 2023)
