
STT 861 Theory of Prob and STT I Lecture Note - 4


Expectation and its theorems, Geometric distribution, Negative Binomial distribution and their examples; theorem of linearity, PMF of a pair of random variables, Markov’s inequality; variance and its examples, uniform distribution.

Portal to all the other notes

Lecture 04 - Sept 27 2017



Definition: Let $X$ be a (discrete) r.v. with PMF $p_k=P[X=k],k\in \mathbb{Z}$. We say that the expectation of $X$ is

\[E[X]=\sum_{k} k\,p_k\]

Example 1

A game of dice. Throw 1 die, win \$ 1 if outcome is even, win \$ outcome/2 if outcome is odd.

\[E[X]=\frac{1+1+1}{6}+ \frac{0.5+1.5+2.5}{6} = \frac{7.5}{6} = 1.25\]
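As a quick sanity check (not part of the lecture), the expectation above can be verified both exactly and by simulation:

```python
import random

def winnings(die):
    """Payout rule from Example 1: $1 if the outcome is even, outcome/2 if odd."""
    return 1.0 if die % 2 == 0 else die / 2

# Exact expectation: average the payout over the six equally likely faces.
exact = sum(winnings(d) for d in range(1, 7)) / 6

# Monte Carlo estimate for comparison.
random.seed(0)
n = 100_000
estimate = sum(winnings(random.randint(1, 6)) for _ in range(n)) / n

print(exact)     # 1.25
print(estimate)  # close to 1.25
```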

Example 2

Let $X\sim Bern(p)$, thus $E(X)=p$.

Example 3

Let $X\sim Bin(n,p)$. Then $E(X)=\sum_{k=0}^{n}kC_n^kp^k(1-p)^{n-k}$; this is the brute-force (“dumb”) way to solve it.

Another way is to use linearity: write $X=\sum_{i=1}^{n} X_i$, where the $X_i$ are i.i.d. $Bern(p)$. Then

\[E[X]=\sum_{i=1}^{n} E[X_i]=n p\]
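A short check (my addition, not from class) that the brute-force sum agrees with $np$, for one arbitrary choice of $n$ and $p$:

```python
from math import comb

n, p = 10, 0.3

# "Dumb way": sum k * C(n,k) * p^k * (1-p)^(n-k) over all k.
direct = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# Linearity gives E[X] = n * p = 3.0.
print(direct)  # 3.0
```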

Example 4

Let $X\sim Geom(p)$, then $E[X]=\frac{1}{p}$ (prove this at home). Here is a link to it. I don’t want to type it again.
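A numerical check (not in the notes): truncating the defining series $E[X]=\sum_{k\geq 1} k\,p(1-p)^{k-1}$ at a large cutoff should give approximately $1/p$. Here $p=0.25$ is an arbitrary choice:

```python
p = 0.25

# Truncated series for E[X] = sum_{k>=1} k * p * (1-p)^(k-1);
# the tail beyond k = 500 is negligible for p = 0.25.
partial = sum(k * p * (1 - p)**(k - 1) for k in range(1, 500))
print(partial)  # approximately 4.0 = 1/p
```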

Example 5

Let $X\sim NegBin(n,p)$. Writing $X=\sum_{i=1}^{n} X_i$, where the $X_i$ are i.i.d. $Geom(p)$, then

\[E[X]=\sum_{i=1}^{n} E[X_i]=\frac{n}{p}\]
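A simulation sketch (my own, with arbitrary $n=3$, $p=0.5$): draw a $NegBin(n,p)$ variable as the total number of trials needed to collect $n$ successes, i.e. a sum of $n$ geometric draws, and compare the sample mean with $n/p$:

```python
import random

random.seed(1)
n, p = 3, 0.5

def negbin_draw():
    """Total trials needed to get n successes: a sum of n independent Geom(p) draws."""
    trials = 0
    for _ in range(n):
        while True:
            trials += 1
            if random.random() < p:
                break
    return trials

m = 50_000
est = sum(negbin_draw() for _ in range(m)) / m
print(est)  # close to n / p = 6.0
```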

Try at home

Find the definition, PMF and expectation for the “Multinomial” distribution.

Theorem of Linearity: Let $X$ and $Y$ be two r.v.’s and $\alpha,\beta\in \mathbb{R}$. Let $Z=\alpha X+\beta Y$. Then

\[E[Z]=E[\alpha X+\beta Y]=\alpha E[X] + \beta E[Y]\]

Note that this does not require $X$ and $Y$ to be independent.
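To illustrate that independence is not needed (my example, with a made-up joint PMF where $X$ and $Y$ are clearly dependent), linearity still holds exactly:

```python
# A joint PMF for a dependent pair (X, Y); values chosen only for illustration.
joint = {(0, 0): 0.4, (1, 1): 0.35, (1, 2): 0.25}

EX = sum(x * pr for (x, y), pr in joint.items())
EY = sum(y * pr for (x, y), pr in joint.items())

# E[2X + 3Y] computed directly from the joint PMF...
EZ = sum((2 * x + 3 * y) * pr for (x, y), pr in joint.items())

# ...matches 2*E[X] + 3*E[Y] even though X and Y are dependent.
print(EZ)
print(2 * EX + 3 * EY)
```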

Theorem: Let $X$ and $Y$ be two independent r.v.’s, then

\[E[XY]=E[X]E[Y]\]
(try to prove it at home)

PMF of $(X,Y)$

For $(X,Y)$ a pair of r.v.’s, we can define the PMF of $(X,Y)$:

\[p_{x,y}=P[X=x, Y=y]\]
Note: if $X$ and $Y$ are independent, then $p_{x,y}=p_xp_y=P[X=x]P[Y=y]$.

Example 6

Let $Y=X^2$ where $X$ is a r.v. Assume $P[X=k]=p_k$, then

\[E[Y]=E[X^2]=\sum_{k} k^2 p_k\]
Theorem: Let $X$ be a r.v. with PMF $P[X=x_k]=p_k$, let $F$ be a function from $\mathbb{R}$ to $\mathbb{R}$, and let $Y=F(X)$. Then

\[E[Y]=E[F(X)]=\sum_{k} F(x_k)\,p_k\]
Theorem: Let $X$ be a r.v. such that $X\geq 0$ ($P[X\geq 0]=1$). Then $E[X]\geq0$. This follows directly from the definition, since every term in the sum $\sum_k x_k p_k$ is non-negative.

Theorem (Markov’s inequality): Let $X$ be a non-negative r.v. Let $C$ be a fixed real number $>0$. Then

\[P[X>C]\leq \frac{E[X]}{C}\]

(Chebyshev inequality is related to Markov inequality).
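A small concrete check of Markov’s inequality (my example, using a fair die as the non-negative r.v.):

```python
# A simple non-negative r.v.: a fair die, P[X=k] = 1/6 for k = 1..6.
pmf = {k: 1 / 6 for k in range(1, 7)}
EX = sum(k * p for k, p in pmf.items())  # 3.5

C = 5.0
tail = sum(p for k, p in pmf.items() if k > C)  # P[X > 5] = 1/6
bound = EX / C                                  # Markov bound: 0.7

print(tail, bound)  # the tail probability is indeed below the bound
```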

Variance (Chapter 1.6)

Empirically the variance is the average squared deviation from the mean.

With data $x_1, x_2,\ldots,x_n$, let

\[\bar{x}=\frac{1}{n}\sum_{i=1}^{n}x_i, \qquad s^2=\frac{1}{n}\sum_{i=1}^{n}(x_i-\bar{x})^2\]


Mathematically, variance is defined as:

Let $X$ be a r.v. Then

\[Var(X)=E[(X-E[X])^2]\]
(General formula)

Here we used the correspondence $\frac{1}{n}\sum_{i}\leftrightarrow E[\cdot]$ and $\bar{x}\leftrightarrow \mu=E[X]$.

Proposition: $Var(X)=E[X^2]-(E[X])^2$. This is extremely important, especially in doing homework. :-)


\[\begin{align*} Var(X)&=E[(X-E[X])^2] \\ &=E[X^2-2XE[X]+(E[X])^2] \\ &=E[X^2]-E[2XE[X]]+E[(E[X])^2] \\ &=E[X^2]-2(E[X])^2+(E[X])^2 \\ &=E[X^2]-(E[X])^2 \end{align*}\]

Usually, if the PMF of $X$ is given, it is easier to compute $Var(X)$ using this second formula than the original definition.

Usual Notation

Let $X$ be a r.v. We write $\mu=E[X]$ for the mean, $\sigma^2=Var(X)$ for the variance, and $\sigma=\sqrt{Var(X)}$ for the standard deviation.

Example 7

Let $X$ have this PMF: for $k=1,2,3,4$, $p_k=P[X=k]=\frac{k}{10}$.

Then \(\mu=E[X]=\sum_{k=1}^{4}k\cdot\frac{k}{10}=\frac{1+4+9+16}{10} = 3,\)

\[E[X^2]=\sum_{k=1}^{4}k^2\cdot\frac{k}{10}=\frac{1+8+27+64}{10}=10\]

Finally, $Var[X] = E[X^2]-(E[X])^2 = 10-9=1$
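The same arithmetic, checked in a few lines of Python (my addition):

```python
# PMF of Example 7: P[X=k] = k/10 for k = 1..4.
pmf = {k: k / 10 for k in (1, 2, 3, 4)}

mu = sum(k * p for k, p in pmf.items())       # E[X]   = 3.0
EX2 = sum(k**2 * p for k, p in pmf.items())   # E[X^2] = 10.0
var = EX2 - mu**2                             # Var(X) = 1.0
print(mu, EX2, var)
```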

Properties of the variance: Let $c\in \mathbb{R}$ and let $X$ & $Y$ be independent r.v.’s. Then

\[Var(cX)=c^2Var(X), \qquad Var(X+c)=Var(X), \qquad Var(X+Y)=Var(X)+Var(Y)\]

Example 8

Let $X\sim Binom(n,p)$. As before, write

\[X=\sum_{i=1}^{n}X_i\]

where the $X_i$ are i.i.d. $Bernoulli(p)$. Thus, by independence,

\[Var(X)=\sum_{i=1}^{n}Var(X_i)=n\,Var(X_1)\]

Variance of Bernoulli: since $X_1^2=X_1$, we have $E[X_1^2]=E[X_1]=p$, so

\[Var(X_1)=E[X_1^2]-(E[X_1])^2=p-p^2=p(1-p)\]

Therefore $Var(X)=np(1-p)$.
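A quick check (not from class, with arbitrary $n=12$, $p=0.4$) that computing the variance directly from the binomial PMF matches $np(1-p)$:

```python
from math import comb

n, p = 12, 0.4

# Variance computed directly from the binomial PMF via Var = E[X^2] - (E[X])^2.
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
EX = sum(k * pr for k, pr in enumerate(pmf))
EX2 = sum(k**2 * pr for k, pr in enumerate(pmf))
var = EX2 - EX**2

print(var)  # approximately 2.88 = n * p * (1 - p)
```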


In Ch. 2, we will show that if $X\sim Geom(p)$, then $Var(X)=\frac{1-p}{p^2}$.

Let $X\sim NegBin(n,p)$, then $Var(X)=\frac{n(1-p)}{p^2}$.

Proposition: Let $X$ be a r.v. and let $c\in \mathbb{R}$. Then

\[E[(X-c)^2]=Var(X)+(\mu-c)^2\]

where $\mu=E[X]$.


\[\begin{align*} E[(X-c)^2]&=E[(X-\mu+\mu-c)^2] \\ &=E[(X-\mu)^2+2(X-\mu)(\mu-c)+(\mu-c)^2] \\ &=Var(X)+2(\mu-c)\underbrace{E[X-\mu]}_{=0}+(\mu-c)^2 \\ &=Var(X)+(\mu-c)^2 \end{align*}\]

Indeed, for $c=\mu$, we recover the definition of $Var(X)$. For $c\neq\mu$ we get

\[E[(X-c)^2]=Var(X)+(\mu-c)^2>Var(X),\]

so $c=\mu$ minimizes $E[(X-c)^2]$.

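A small numerical illustration of this proposition (my own, with a hypothetical two-point PMF of mean $\mu=3$):

```python
# Hypothetical PMF with mean mu = 3: P[X=2] = P[X=4] = 0.5.
pmf = {2: 0.5, 4: 0.5}
mu = sum(k * p for k, p in pmf.items())             # 3.0
var = sum((k - mu)**2 * p for k, p in pmf.items())  # 1.0

# E[(X-c)^2] equals Var(X) + (mu - c)^2 for every c, and is smallest at c = mu.
for c in (0.0, 2.5, mu, 5.0):
    lhs = sum((k - c)**2 * p for k, p in pmf.items())
    rhs = var + (mu - c)**2
    print(c, lhs, rhs)
```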
Example 9

Let $X$ be a uniform r.v. on the set of integers from 1 to $N$, i.e. $P[X=k]=\frac{1}{N}$ for $k=1,2,3,\ldots,N$ (definition).


\[E[X] = \frac{N+1}{2},\] \[Var(X)=\frac{N^2-1}{12}\]

End note

Sometimes people (the professor) use different notation for expectation and variance, e.g. $E(X)$ vs. $E[X]$, and $Var(X)$ vs. $Var[X]$.

Just choose whichever you like. I prefer ( ) to [ ].
