3 Bernoulli Distribution

3.1 Probability Mass Function

A random variable \(X\) is said to have a Bernoulli Distribution with parameter \(\pi\) if its probability mass function is \[\Pr(X = x)=\left\{ \begin{array}{ll} \pi^x(1-\pi)^{1-x}, & x=0,1\\ 0, & \mathrm{otherwise} \end{array} \right. \] where \(\pi\) is the probability of a success.
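As a quick numerical sketch (assuming a Python environment with `scipy` is available; the value \(\pi=.4\) is purely illustrative), the PMF can be evaluated directly from this formula and compared against `scipy.stats.bernoulli`:

```python
from scipy.stats import bernoulli

def bernoulli_pmf(x, pi):
    """Direct evaluation of the PMF: pi^x * (1 - pi)^(1 - x) for x in {0, 1}."""
    if x not in (0, 1):
        return 0.0
    return pi**x * (1 - pi)**(1 - x)

pi = 0.4  # illustrative success probability
for x in (0, 1, 2):
    # the direct formula and the scipy value agree (the pmf is 0 outside {0, 1})
    print(x, bernoulli_pmf(x, pi), bernoulli.pmf(x, pi))
```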

3.2 Cumulative Distribution Function

\[\Pr(X \leq x)=\left\{ \begin{array}{ll} 0, & x<0\\ 1-\pi, & 0\leq x<1\\ 1, & x\geq 1 \end{array} \right. \]
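A matching sketch for the CDF (again assuming `scipy`; the grid of evaluation points is arbitrary), showing that the jumps occur at the support points \(0\) and \(1\):

```python
from scipy.stats import bernoulli

def bernoulli_cdf(x, pi):
    """Piecewise CDF: 0 for x < 0, 1 - pi for 0 <= x < 1, 1 for x >= 1."""
    if x < 0:
        return 0.0
    elif x < 1:
        return 1 - pi
    return 1.0

pi = 0.4
for x in (-0.5, 0, 0.5, 1, 2):
    # the direct piecewise form matches scipy's cdf at every point
    print(x, bernoulli_cdf(x, pi), bernoulli.cdf(x, pi))
```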

The graphs on the left and right show a Bernoulli Probability Distribution and Cumulative Distribution Function, respectively, with \(\pi=.4\). Note that this is identical to a Binomial Distribution with parameters \(n=1\) and \(\pi=.4\).

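A figure along these lines could be reproduced with, for example, `matplotlib` (a sketch of one possible way to draw the two panels, not the code used for the original figure):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import bernoulli

pi = 0.4
fig, (ax_pmf, ax_cdf) = plt.subplots(1, 2, figsize=(8, 3))

# Left panel: PMF with point masses at x = 0 and x = 1
xs = np.array([0, 1])
ax_pmf.vlines(xs, 0, bernoulli.pmf(xs, pi))
ax_pmf.plot(xs, bernoulli.pmf(xs, pi), "o")
ax_pmf.set_title("Bernoulli PMF, $\\pi = 0.4$")
ax_pmf.set_xlabel("x")

# Right panel: step-function CDF
grid = np.array([-1, 0, 1, 2])
ax_cdf.step(grid, bernoulli.cdf(grid, pi), where="post")
ax_cdf.set_title("Bernoulli CDF, $\\pi = 0.4$")
ax_cdf.set_xlabel("x")

plt.tight_layout()
plt.show()
```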

3.3 Expected Values

\[ \begin{aligned} {\mathrm{E}}(X) &= \sum\limits_{x=0}^{1} x\cdot \Pr(X = x) \\ &= \sum\limits_{x=0}^{1} x \cdot \pi^{x} (1-\pi)^{1-x}\\ &= 0 \cdot \pi^{0} (1-\pi)^{1-0} + 1 \cdot \pi^{1} (1-\pi)^{1-1}\\ &= 0 + \pi (1-\pi)^{0}\\ &= \pi\\ \\ \\ {\mathrm{E}}(X^{2}) &= \sum\limits_{x=0}^{1} x^2 \cdot \Pr(X = x)\\ &= \sum\limits_{x=0}^{1} x^{2} \cdot \pi^x (1-\pi)^{1-x}\\ &= 0^{2} \cdot \pi^0 (1-\pi)^{1-0} + 1^2 \cdot \pi^1 (1-\pi)^{1-1}\\ &= 0 \cdot 1 \cdot 1 + 1 \cdot \pi \cdot 1 \\ &= 0 + \pi\\ &= \pi\\ \\ \\ \mu &= {\mathrm{E}}(X) = \pi\\ \\ \\ \sigma^2 &= {\mathrm{E}}(X^2) - {\mathrm{E}}(X)^2 \\ &= \pi-\pi^2 \\ &= \pi(1-\pi) \end{aligned} \]
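These moments are easy to sanity-check by simulation (a sketch assuming `numpy`; the seed and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
pi = 0.4
# a Bernoulli(pi) draw is a Binomial(1, pi) draw
draws = rng.binomial(n=1, p=pi, size=200_000)

print(draws.mean())  # close to E(X) = pi = 0.4
print(draws.var())   # close to Var(X) = pi * (1 - pi) = 0.24
```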

3.4 Moment Generating Function

\[\begin{aligned} M_{X}(t) &= {\mathrm{E}}(e^{tX}) \\ &= \sum\limits_{x=0}^{1} e^{tx} \Pr(X = x) \\ &= \sum\limits_{x=0}^{1} e^{tx} \pi^{x} (1-\pi)^{1-x}\\ &= e^{t\cdot 0} \pi^0 (1-\pi)^{1-0} + e^{t} \pi^{1} (1-\pi)^{1-1}\\ &= (1-\pi) + e^t \pi\\ &= \pi e^t + (1-\pi) \\ \\ \\ M^{(1)}_X(t) &= \pi e^t\\ \\ \\ M^{(2)}_X(t) &= \pi e^t\\ \\ \\ {\mathrm{E}}(X) &=M^{(1)}_X(0)\\ &= \pi e^0\\ &= \pi\\ \\ \\ {\mathrm{E}}(X^2) &= M^{(2)}_X(0)\\ &= \pi e^0\\ &= \pi\\ \\ \\ \mu &= {\mathrm{E}}(X)\\ &= \pi\\ \\ \\ \sigma^2 &= {\mathrm{E}}(X^2) - {\mathrm{E}}(X)^2 \\ &= \pi - \pi^2 \\ &= \pi (1-\pi) \end{aligned} \]
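The same moment calculations can be checked symbolically by differentiating the MGF (a sketch assuming `sympy` is available):

```python
import sympy as sp

t, pi = sp.symbols("t pi", positive=True)
M = pi * sp.exp(t) + (1 - pi)      # MGF derived above

EX = sp.diff(M, t, 1).subs(t, 0)   # first derivative evaluated at t = 0
EX2 = sp.diff(M, t, 2).subs(t, 0)  # second derivative evaluated at t = 0

print(EX)                          # pi
print(EX2)                         # pi
print(sp.simplify(EX2 - EX**2))    # equals pi*(1 - pi)
```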

3.5 Theorems for the Bernoulli Distribution

3.5.1 Validity of the Distribution

\[\sum\limits_{x=0}^{1}\pi^x(1-\pi)^{1-x}=1\]

Proof:

\[\begin{aligned} \sum\limits_{x=0}^{1} \pi^x (1-\pi)^{1-x} &= \pi^0 (1-\pi)^1 + \pi^1 (1-\pi)^0 \\ &= (1-\pi) + \pi \\ &= 1 \end{aligned}\]
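A small numerical check of this identity over a few values of \(\pi\) (a sketch assuming `numpy`; the grid of \(\pi\) values is arbitrary):

```python
import numpy as np

for pi in np.linspace(0.05, 0.95, 5):
    total = sum(pi**x * (1 - pi)**(1 - x) for x in (0, 1))
    print(round(pi, 3), total)  # total is 1 (up to floating point) for every pi
```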

3.5.2 Sum of Bernoulli Random Variables

Let \(X_1,X_2,\ldots,X_n\) be independent and identically distributed random variables from a Bernoulli distribution with parameter \(\pi\), and let \(Y=\sum\limits_{i=1}^{n}X_i\). Then \(Y\sim\) Binomial\((n,\pi)\).

Proof: \[\begin{aligned} M_Y(t) &= {\mathrm{E}}(e^{tY}) \\ &= {\mathrm{E}}(e^{t(X_1+X_2+\cdots+X_n)}) \\ &= {\mathrm{E}}(e^{tX_1} e^{tX_2} \cdots e^{tX_n}) \\ &= {\mathrm{E}}(e^{tX_1}) {\mathrm{E}}(e^{tX_2}) \cdots {\mathrm{E}}(e^{tX_n}) \qquad \text{(by independence)} \\ &= (\pi e^t+(1-\pi)) (\pi e^t+(1-\pi)) \cdots (\pi e^t+(1-\pi)) \\ &= (\pi e^t+(1-\pi))^n \end{aligned}\]

This is the moment generating function of a Binomial random variable with parameters \(n\) and \(\pi\). Since the moment generating function uniquely determines the distribution, \(Y\sim\) Binomial\((n,\pi)\).
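The conclusion can also be checked empirically by simulating sums of independent Bernoulli draws and comparing the resulting frequencies with the Binomial PMF (a sketch assuming `numpy` and `scipy`; \(n\), \(\pi\), the seed, and the number of replications are arbitrary):

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(1)
n, pi, reps = 10, 0.4, 100_000

# each row holds n independent Bernoulli(pi) draws; Y is their sum
y = rng.binomial(n=1, p=pi, size=(reps, n)).sum(axis=1)

empirical = np.bincount(y, minlength=n + 1) / reps
theoretical = binom.pmf(np.arange(n + 1), n, pi)

print(np.round(empirical, 3))    # simulated distribution of Y
print(np.round(theoretical, 3))  # Binomial(n, pi) PMF; the rows agree closely
```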