Bernoulli Processes

Dated Oct 2, 2017; last modified on Mon, 05 Sep 2022

Bernoulli Process

A Bernoulli Process is a sequence of independent \(\{0, 1\}\)-valued random variables \(X_1, X_2, X_3, \dots\), e.g. \(0, 0, 1, 0, 1, 1\)

A Bernoulli Process does not mandate that the probability distributions of the \(X_i\) be identical; that is up to the model that we choose. For instance, the Binomial Random Variable assumes \(\mathbb{P}\{X_i = 1\} = p \ \ \forall i\).
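A minimal sketch of the i.i.d. case, sampling the first \(n\) values of a Bernoulli(\(p\)) process (the function name and signature are mine, for illustration):

```python
import random

def bernoulli_process(p, n):
    """First n values of an i.i.d. Bernoulli(p) process: each X_i is 1
    with probability p and 0 otherwise, independently of the rest."""
    return [1 if random.random() < p else 0 for _ in range(n)]

print(bernoulli_process(p=0.5, n=10))  # e.g. [0, 0, 1, 0, 1, 1, 0, 1, 0, 0]
```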

Suppose you flip a coin repeatedly, and record \(0\) for tails and \(1\) for heads.

  • How many flips until the first \(1\)?
  • Is it possible to observe an infinite sequence of \(0\)’s?
  • What is the probability of getting 51 ones among 100 coin tosses?
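The second question has a direct answer: for any \(p > 0\), the probability of an all-zeros sequence is \(\lim_{n \to \infty} (1-p)^n = 0\). The other two can be probed with a quick simulation. The sketch below assumes a fair coin, i.e. \(p = \frac{1}{2}\), which the problem leaves unspecified:

```python
import random
from collections import Counter

def flips_until_first_one(p=0.5):
    """Flip a p-biased coin until the first 1 (head) appears."""
    flips = 1
    while random.random() >= p:
        flips += 1
    return flips

# Empirical distribution of "flips until the first 1".
trials = [flips_until_first_one() for _ in range(100_000)]
print(Counter(trials).most_common(5))  # 1 is most common, then 2, 3, ...

# Estimate P(exactly 51 ones in 100 tosses) by simulation.
hits = sum(
    sum(random.random() < 0.5 for _ in range(100)) == 51
    for _ in range(100_000)
)
print(hits / 100_000)  # ≈ 0.078 for a fair coin
```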

Sample Problems

Probability of Specific Sequences

Suppose you have a coin for which \(\mathbb{P}(H) = \frac{2}{3} \). If you toss it 5 times, what is the probability of getting three \(1\)’s and two \(0\)’s?

Our favorable outcomes are \((1,1,1,0,0), (0,1,1,1,0), \dots, (0,1,0,1,1)\): 10 favorable outcomes in all.

Counting can be tricky. In how many ways can we put 3 ones in 5 slots? \( {5 \choose 3} = \frac{5!}{2! \times 3!} = 10 \)

Each of the favorable outcomes has equal probability of occurring, e.g.

$$ \mathbb{P}\{(1,1,1,0,0)\} = \left( \frac{2}{3} \right)^3 \cdot \left( \frac{1}{3} \right)^2 $$

Because all of these outcomes are disjoint, we can sum their probabilities to get the final answer:

$$ \left( \frac{2}{3} \right)^3 \cdot \left( \frac{1}{3} \right)^2 \cdot 10 $$
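Numerically, this is \(\frac{8}{27} \cdot \frac{1}{9} \cdot 10 = \frac{80}{243} \approx 0.33\). A quick sketch that checks the closed form against a brute-force enumeration of all \(2^5\) length-5 sequences (same \(p = \frac{2}{3}\) as above):

```python
from itertools import product
from math import comb

p = 2 / 3  # P(H) from the problem

# Closed form: 10 disjoint sequences, each with probability (2/3)^3 * (1/3)^2.
closed_form = comb(5, 3) * p**3 * (1 - p)**2

# Brute force: sum the probabilities of every length-5 sequence with three 1's.
brute_force = sum(
    p**sum(seq) * (1 - p)**(5 - sum(seq))
    for seq in product([0, 1], repeat=5)
    if sum(seq) == 3
)

print(closed_form, brute_force)  # both 80/243 ≈ 0.329
```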

Probability of General Sequences

If \(\mathbb{P}\{H\} = p\), what is the probability of \(k\) heads in \(n\) experiments?

If \(x\) is the number of sequences of length \(n\) with \(k\) ones and \(n-k\) zeros, then the answer is:

$$ p^k \cdot (1 - p)^{n-k} \cdot x $$

Let’s find \(x\)…

$$ x = \frac{n (n-1) … (n - k + 1)}{k (k-1) … (1)} = \frac{n!}{k! (n-k)!} = { n \choose k } = C_{n}^{k} $$

For the numerator, we’re trying to place \(k\) ones in \(n\) slots. The first one can be placed in any of the \(n\) slots, the second one can be placed in any of the remaining \(n - 1\) slots, and so forth, until the last one which can be placed into any of the remaining \( n - (k - 1) = n - k + 1 \) slots.

For the denominator, because all the ones are identical, we don’t care about the ordering of the individual ones. We therefore divide by the number of possible orderings of \(k\) items.
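As a sanity check against the earlier problem, \(n = 5\) and \(k = 3\) gives the same 10 sequences counted before:

$$ x = \frac{5 \cdot 4 \cdot 3}{3 \cdot 2 \cdot 1} = \frac{60}{6} = 10 $$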

And therefore, the final answer is:

$$ { n \choose k } \cdot p^k \cdot (1 - p)^{n-k} $$

… which is the probability mass function of the Binomial Random Variable.
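A small sketch wrapping this PMF in a function (the name binomial_pmf is mine); it recovers the \(\frac{80}{243}\) from the earlier example and gives the probability of 51 ones in 100 fair-coin tosses:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(exactly k ones in n independent Bernoulli(p) trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(3, 5, 2 / 3))   # 80/243 ≈ 0.329, the earlier example
print(binomial_pmf(51, 100, 0.5))  # ≈ 0.078
# Sanity check: the probabilities over k = 0..n sum to 1.
print(sum(binomial_pmf(k, 100, 0.5) for k in range(101)))  # ≈ 1.0
```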

Alice and Bob each have a biased coin with \(\mathbb{P}(H) = 51\% \). They each start with $100, repeatedly flipping their coin and betting against the bank on the outcome. Alice calls heads every time; Bob calls tails. Given that they both go broke, who is more likely to have gone broke first?

Why is Alice likely to have gone broke first? There are a couple of Gambler’s Ruin problems in ORF 309 that can help me understand this.
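One way to build intuition is a Monte Carlo sketch. The version below is scaled down so it finishes quickly ($20 stakes and a 55%-heads coin rather than $100 and 51%), assumes the bank is infinitely rich, that Alice and Bob flip independent coins, and treats a player whose fortune drifts far above the starting stake as never going broke. All of those modelling choices are mine, not part of the original problem:

```python
import random

def time_to_ruin(p_win, stake, give_up):
    """Bet $1 per flip, winning each flip with probability p_win.
    Return the number of flips until broke, or None if the fortune
    reaches give_up first (from that high up, later ruin is so
    unlikely that we treat it as 'never goes broke')."""
    fortune, flips = stake, 0
    while 0 < fortune < give_up:
        fortune += 1 if random.random() < p_win else -1
        flips += 1
    return flips if fortune == 0 else None

ALICE_P, BOB_P = 0.55, 0.45   # Alice wins on heads, Bob on tails
STAKE, GIVE_UP = 20, 60

alice_first = both_broke = 0
while both_broke < 300:
    t_alice = time_to_ruin(ALICE_P, STAKE, GIVE_UP)
    if t_alice is None:
        continue                  # Alice escaped; discard this run
    t_bob = time_to_ruin(BOB_P, STAKE, GIVE_UP)
    if t_bob is None:
        continue                  # very rare for Bob
    both_broke += 1
    alice_first += t_alice < t_bob

print(f"P(Alice broke first | both broke) ≈ {alice_first / both_broke:.2f}")
```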

  1. Dinner! Drinks! Denominators! | The New Yorker. Dan Rockmore. www.newyorker.com. Jan 10, 2022.