
STT 861 Theory of Probability and Statistics I Lecture Note - 4

2017-09-27

Expectation and its theorems; Geometric distribution, Negative Binomial distribution and their examples; theorem of linearity, PMF of a pair of random variables, Markov's inequality; variance and its examples; uniform distribution.

Portal to all the other notes

Lecture 04 - Sept 27 2017

Expectation

Definition: Let $X$ be a (discrete) r.v. with PMF $p_k=P[X=k],k\in \mathbb{Z}$. We say that the expectation of $X$ is

$$E[X]=\sum_{k}k\,p_k.$$

Example 1

A game of dice: throw 1 die; win \$1 if the outcome is even, win \$(outcome/2) if the outcome is odd. Let $X$ be the amount won. Then

$$E[X]=\frac{1}{6}\left(\frac{1}{2}+1+\frac{3}{2}+1+\frac{5}{2}+1\right)=\frac{7.5}{6}=1.25.$$
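Here is a quick Python check of this number (my addition, not from the lecture); the payoff rule above is the only input.

```python
import random

def winnings(outcome):
    """Payoff of the dice game: $1 on an even face, outcome/2 on an odd face."""
    return 1.0 if outcome % 2 == 0 else outcome / 2

# Exact expectation: average the payoff over the 6 equally likely faces.
exact = sum(winnings(k) for k in range(1, 7)) / 6

# Monte Carlo estimate of E[X].
n = 100_000
estimate = sum(winnings(random.randint(1, 6)) for _ in range(n)) / n

print(exact, estimate)  # 1.25, and something close to it
```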

Example 2

Let $X\sim Bern(p)$; then $E[X]=0\cdot(1-p)+1\cdot p=p$.

Example 3

Let $X\sim Bin(n,p)$. Then $E[X]=\sum_{k=0}^{n}kC_n^kp^k(1-p)^{n-k}$; evaluating this sum directly is the dumb way to solve it.

Another way: use linearity. Write $X=X_1+X_2+\dots+X_n$ with $X_i$ i.i.d. $Bern(p)$; then

$$E[X]=\sum_{i=1}^{n}E[X_i]=np.$$
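Both routes are easy to compare numerically; here is a minimal sketch (my addition, with $n$ and $p$ chosen arbitrarily):

```python
from math import comb

n, p = 12, 0.3

# The "dumb way": evaluate the defining sum E[X] = sum_k k * C(n,k) * p^k * (1-p)^(n-k).
direct = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# Linearity: X is a sum of n i.i.d. Bernoulli(p), so E[X] = n * p.
print(direct, n * p)  # both 3.6, up to floating-point error
```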

Example 4

Let $X\sim Geom(p)$, then $E[X]=\frac{1}{p}$ (prove this at home). Here is a link to it. I don’t want to type it again.
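If you would rather trust a simulation than the proof, here is a sketch (my addition; it takes $Geom(p)$ to count the trials up to and including the first success, matching $E[X]=1/p$):

```python
import random

def geom_sample(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    k = 1
    while random.random() >= p:
        k += 1
    return k

p = 0.25
n = 100_000
print(sum(geom_sample(p) for _ in range(n)) / n)  # close to 1/p = 4.0
```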

Example 5

Let $X\sim NegBin(n,p)$. Writing $X=X_1+X_2+\dots+X_n$, where the $X_i$ are i.i.d. $Geom(p)$ waiting times between successes, linearity gives

$$E[X]=\frac{n}{p}.$$
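A simulation sketch of this, using the sum-of-geometrics representation (my addition; the parameters are arbitrary):

```python
import random

def negbin_sample(n, p):
    """Number of Bernoulli(p) trials needed to collect n successes."""
    trials, successes = 0, 0
    while successes < n:
        trials += 1
        if random.random() < p:
            successes += 1
    return trials

n, p = 5, 0.4
reps = 100_000
print(sum(negbin_sample(n, p) for _ in range(reps)) / reps)  # close to n/p = 12.5
```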

Try at home

Find the definition, PMF and expectation for the “Multinomial” distribution.

Theorem of Linearity: Let $X$ and $Y$ be two r.v.’s and $\alpha,\beta\in \mathbb{R}$. Let $Z=\alpha X+\beta Y$. Then

$$E[Z]=\alpha E[X]+\beta E[Y].$$

This does not require $X$ and $Y$ to be independent.

Theorem: Let $X$ and $Y$ be two independent r.v.’s. Then

$$E[XY]=E[X]E[Y].$$

(Try to prove it at home.)
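A quick empirical check of the product rule with two independent fair dice (my addition):

```python
import random

# Two independent rolls of a fair die; compare E[XY] with E[X]E[Y].
n = 100_000
xs = [random.randint(1, 6) for _ in range(n)]
ys = [random.randint(1, 6) for _ in range(n)]

e_x = sum(xs) / n
e_y = sum(ys) / n
e_xy = sum(x * y for x, y in zip(xs, ys)) / n

print(e_xy, e_x * e_y)  # both close to 3.5 * 3.5 = 12.25
```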

PMF of $(X,Y)$

For $(X,Y)$ a pair of r.v.’s, we can define the (joint) PMF of $(X,Y)$:

$$p_{x,y}=P[X=x, Y=y].$$

Note: if $X$ and $Y$ are independent, then $p_{x,y}=p_xp_y=P[X=x]P[Y=y]$.

Example 6

Let $Y=X^2$ where $X$ is a r.v. Assume $P[X=k]=p_k$; then

$$E[Y]=E[X^2]=\sum_{k}k^2p_k.$$

Theorem: Let $X$ be a r.v. with PMF $P[X=x_k]=p_k$, let $F$ be a function from $\mathbb{R}$ to $\mathbb{R}$, and let $Y=F(X)$. Then

$$E[Y]=E[F(X)]=\sum_{k}F(x_k)p_k.$$
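This is handy because you never need the PMF of $Y$ itself. A tiny illustration with a made-up PMF (my addition):

```python
# Made-up PMF on {-1, 0, 1, 2} and F(x) = x^2.
pmf = {-1: 0.1, 0: 0.2, 1: 0.3, 2: 0.4}

def F(x):
    return x ** 2

# E[F(X)] computed straight from the PMF of X.
e_fx = sum(F(x) * p for x, p in pmf.items())
print(e_fx)  # 0.1*1 + 0.2*0 + 0.3*1 + 0.4*4 = 2.0
```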

Theorem: Let $X$ be a r.v. such that $X\geq 0$ (i.e. $P[X\geq 0]=1$). Then $E[X]\geq0$. This follows directly from the definition.

Theorem (Markov’s inequality): Let $X$ be a non-negative r.v. and let $a$ be a fixed real number $>0$. Then

$$P[X\geq a]\leq \frac{E[X]}{a}.$$

(Chebyshev’s inequality is related to Markov’s inequality.)
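The bound is usually loose but always valid. A numeric check on a binomial (my addition; $n$, $p$ and $a$ are arbitrary):

```python
from math import comb

# Check P[X >= a] <= E[X]/a for X ~ Bin(10, 0.5) and a = 8.
n, p, a = 10, 0.5, 8

pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
tail = sum(pmf[a:])   # P[X >= 8]
bound = (n * p) / a   # E[X]/a = 5/8

print(tail, bound)  # 0.0546875 <= 0.625
```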

Variance (Chapter 1.6)

Empirically the variance is the average squared deviation from the mean.

With data $x_1, x_2,\dots,x_n$, let

$$\bar{x}=\frac{1}{n}\sum_{i=1}^{n}x_i$$

and

$$s^2=\frac{1}{n}\sum_{i=1}^{n}(x_i-\bar{x})^2.$$

Mathematically, the variance is defined as follows. Let $X$ be a r.v.; then

$$Var(X)=E\big[(X-E[X])^2\big]$$

(general formula). Here we use the approximate correspondence $\frac{1}{n}\sum_{i=1}^{n}(\cdot)\leftrightarrow E[\cdot]$ and $\bar{x}\leftrightarrow E[X]$.

Proposition: $Var(X)=E[X^2]-(E[X])^2$. This is extremely important, especially in doing homework. :-)

Proof: write $\mu=E[X]$. Then

$$Var(X)=E[(X-\mu)^2]=E[X^2-2\mu X+\mu^2]=E[X^2]-2\mu E[X]+\mu^2=E[X^2]-\mu^2.$$

Usually, if the PMF of $X$ is given, it is easier to compute $Var(X)$ using the second formula than the original definition.

Usual Notation

Let $X$ be a r.v. Write $\mu=E[X]$ for the mean, $\sigma^2=Var(X)$ for the variance, and $\sigma=\sqrt{Var(X)}$ for the standard deviation.

Example 7

Let $X$ have this PMF: for $k=1,2,3,4$, $p_k=P[X=k]=\frac{k}{10}$.

Then

$$E[X]=\sum_{k=1}^{4}k\cdot\frac{k}{10}=\frac{1+4+9+16}{10}=3,$$

$$E[X^2]=\sum_{k=1}^{4}k^2\cdot\frac{k}{10}=\frac{1+8+27+64}{10}=10.$$

Finally, $Var(X) = E[X^2]-(E[X])^2 = 10-9=1$.
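The same computation in a few lines of Python (my addition):

```python
# PMF from Example 7: p_k = k/10 for k = 1..4.
pmf = {k: k / 10 for k in range(1, 5)}

e_x = sum(k * p for k, p in pmf.items())      # E[X]
e_x2 = sum(k**2 * p for k, p in pmf.items())  # E[X^2]

print(e_x, e_x2, e_x2 - e_x**2)  # 3.0, 10.0, 1.0
```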

Properties of the variance: Let $c\in \mathbb{R}$ and let $X$ and $Y$ be independent r.v.’s. Then

$$Var(cX)=c^2Var(X),\qquad Var(X+c)=Var(X),\qquad Var(X+Y)=Var(X)+Var(Y).$$

Example 8

Let $X\sim Binom(n,p)$; therefore

$$X=X_1+X_2+\dots+X_n,$$

where the $X_i$ are i.i.d. $Bernoulli(p)$. Thus

$$Var(X)=\sum_{i=1}^{n}Var(X_i).$$

Variance of a Bernoulli:

$$Var(X_i)=E[X_i^2]-(E[X_i])^2=p-p^2=p(1-p).$$

Thus,

$$Var(X)=np(1-p).$$
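Here is a check of $Var(X)=np(1-p)$ against the second variance formula (my addition; parameters arbitrary):

```python
from math import comb

n, p = 10, 0.3

# Var(X) from the PMF: E[X^2] - (E[X])^2.
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
e_x = sum(k * q for k, q in enumerate(pmf))
e_x2 = sum(k**2 * q for k, q in enumerate(pmf))

print(e_x2 - e_x**2, n * p * (1 - p))  # both 2.1
```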

In Ch. 2, we will calculate that $Var(X)=\frac{1-p}{p^2}$ for $X\sim Geom(p)$.

Let $X\sim NegBin(n,p)$, then $Var(X)=\frac{n(1-p)}{p^2}$.

Proposition: Let $X$ be a r.v. and let $c\in \mathbb{R}$. Then

$$E[(X-c)^2]=Var(X)+(\mu-c)^2,$$

where $\mu=E[X]$.

Proof:

$$E[(X-c)^2]=E[(X-\mu+\mu-c)^2]=E[(X-\mu)^2]+2(\mu-c)E[X-\mu]+(\mu-c)^2=Var(X)+(\mu-c)^2,$$

since $E[X-\mu]=0$.

Indeed, for $c=\mu$, we get the definition of $Var(X)$. For $c\neq\mu$ we get $E[(X-c)^2]>Var(X)$, so $c=\mu$ minimizes the expected squared deviation.
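A small numeric illustration using the PMF of Example 7 (my addition):

```python
# Check E[(X-c)^2] = Var(X) + (mu - c)^2 on the PMF of Example 7.
pmf = {k: k / 10 for k in range(1, 5)}
mu = sum(k * p for k, p in pmf.items())             # 3.0
var = sum((k - mu)**2 * p for k, p in pmf.items())  # 1.0

for c in [2.0, 3.0, 4.5]:
    lhs = sum((k - c)**2 * p for k, p in pmf.items())
    print(c, lhs, var + (mu - c)**2)  # last two columns agree; minimum at c = mu
```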

Example 9

Let $X$ be a uniform r.v. on the set of integers from 1 to $N$: $P[X=k]=\frac{1}{N}$ for $k=1,2,3,\dots,N$ (definition).

Then

$$E[X]=\frac{1}{N}\sum_{k=1}^{N}k=\frac{N+1}{2},\qquad E[X^2]=\frac{1}{N}\sum_{k=1}^{N}k^2=\frac{(N+1)(2N+1)}{6},$$

so

$$Var(X)=E[X^2]-(E[X])^2=\frac{(N+1)(2N+1)}{6}-\left(\frac{N+1}{2}\right)^2=\frac{N^2-1}{12}.$$
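And the corresponding check (my addition, with $N=10$):

```python
# Discrete uniform on {1, ..., N}: check E[X] and Var(X) against the formulas.
N = 10
ks = range(1, N + 1)

e_x = sum(ks) / N
e_x2 = sum(k**2 for k in ks) / N

print(e_x, (N + 1) / 2)                # 5.5, 5.5
print(e_x2 - e_x**2, (N**2 - 1) / 12)  # 8.25, 8.25
```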


End note

Sometimes people (the professor) use different notation for expectation and variance, such as $E(X)$ versus $E[X]$ and $Var(X)$ versus $Var[X]$.

Just choose whichever you like. I prefer ( ) to [ ].


