
STT 861 Theory of Probability and Statistics I Lecture Note - 10

2017-11-08

Solving a problem from the midterm exam; conditional distribution: definition, examples, and its expectation and variance.

Portal to all the other notes

Lecture 10 - Nov 08 2017

Exam question

Let $X\sim U(50,90)$ and $Z\sim U(70,90)$; show that $\mathrm{Law}(Z) = \mathrm{Law}(X \mid X\geq 70)$.

Q: Let $X\sim U(a,b)$ and $c\in (a,b)$; find $\mathrm{Law}(X \mid X>c)$.

A: Let $F$ be the CDF of $X \mid X>c$, and let $x\in [c,b]$. Then

$$F(x) = P(X\leq x \mid X>c) = \frac{P(c < X \leq x)}{P(X>c)} = \frac{(x-c)/(b-a)}{(b-c)/(b-a)} = \frac{x-c}{b-c},$$

which is the CDF of $U(c,b)$. So $X \mid X>c$ is uniform on $(c, b)$; in particular, $\mathrm{Law}(X \mid X\geq 70) = U(70,90)$ in the exam question.
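As a quick sanity check (my own addition, not from the lecture), here is a minimal Monte Carlo sketch in Python: it samples $X\sim U(a,b)$ with the exam's values $a=50$, $b=90$, $c=70$, keeps the draws with $X>c$, and compares the empirical conditional CDF against the $U(c,b)$ CDF.

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = 50, 90, 70

# Sample X ~ U(a, b) and keep only the draws with X > c
x = rng.uniform(a, b, size=1_000_000)
x_cond = x[x > c]

# Compare the empirical conditional CDF with the U(c, b) CDF at a few points
for t in [72, 75, 80, 85, 88]:
    empirical = np.mean(x_cond <= t)
    theoretical = (t - c) / (b - c)
    print(f"t={t}: empirical={empirical:.4f}, U(c,b) CDF={theoretical:.4f}")
```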


Conditional distribution - Chapter 5

Recall that the joint density of a random vector $(X,Y)$ is $f_{X,Y}(x,y)$, where $x$ and $y$ are real numbers.

Definition (Theorem): The conditional law of $X$ given $Y=y$ has density

$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x,y)}{f_Y(y)},$$

where $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx$ is the marginal density of $Y$ (assuming $f_Y(y)>0$).

Consequently, the CDF of $X$ given $Y=y$ is

$$F_{X|Y}(x \mid y) = \int_{-\infty}^{x} f_{X|Y}(u \mid y)\,du = \frac{\int_{-\infty}^{x} f_{X,Y}(u,y)\,du}{f_Y(y)}.$$

The discrete case uses the same idea:

$$P(X=x \mid Y=y) = \frac{P(X=x,\,Y=y)}{P(Y=y)}.$$

For $Y$, there is no change in notation; for the CDF of $X$, use sums $\sum$ instead of integrals.
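To make the discrete definition concrete, here is a small sketch (the joint pmf table is made up for illustration, not from the lecture) that computes $P(X=x \mid Y=y)$ by dividing the joint pmf by the marginal of $Y$.

```python
import numpy as np

# A made-up joint pmf P(X=x, Y=y) with x in {0,1,2} (rows)
# and y in {0,1} (columns); the entries sum to 1.
joint = np.array([[0.10, 0.20],
                  [0.15, 0.25],
                  [0.05, 0.25]])

p_y = joint.sum(axis=0)   # marginal P(Y=y), summing over x
cond = joint / p_y        # P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y)

print("P(Y=y):", p_y)
print("P(X=x | Y=0):", cond[:, 0])  # each column sums to 1
print("P(X=x | Y=1):", cond[:, 1])
```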

Example 1

Let $N\sim \mathrm{Poi}(\lambda)$ and $X \mid N \sim \mathrm{Bin}(N,\theta)$; that is, $\mathrm{Law}(X \mid N=k) = \mathrm{Bin}(k,\theta)$.

This type of example is called a mixture: the parameter of one random variable is determined by the value of another random variable.

In our example, we would like to find $\mathrm{Law}(X)$ unconditionally on $N$; that is, we want $P(X=i)$ for every $i=0,1,2,\dots$ Conditioning on $N$,

$$P(X=i) = \sum_{k=i}^{\infty} P(X=i \mid N=k)\,P(N=k) = \sum_{k=i}^{\infty} \binom{k}{i}\theta^i(1-\theta)^{k-i}\,e^{-\lambda}\frac{\lambda^k}{k!}.$$

Substituting $j = k-i$ and factoring out the terms that do not depend on $j$,

$$P(X=i) = e^{-\lambda}\frac{(\lambda\theta)^i}{i!}\sum_{j=0}^{\infty}\frac{(\lambda(1-\theta))^j}{j!} = e^{-\lambda}\frac{(\lambda\theta)^i}{i!}\,e^{\lambda(1-\theta)} = e^{-\lambda\theta}\frac{(\lambda\theta)^i}{i!}.$$

It is a Poisson distribution with parameter $\lambda'=\lambda\theta$.
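A minimal simulation sketch of this result (my own check; the values $\lambda=10$, $\theta=0.3$ are assumptions for illustration): draw $N\sim\mathrm{Poi}(\lambda)$, then $X \mid N \sim\mathrm{Bin}(N,\theta)$, and compare the empirical pmf of $X$ with $\mathrm{Poi}(\lambda\theta)$.

```python
import math
import numpy as np

rng = np.random.default_rng(1)
lam, theta = 10.0, 0.3   # assumed values for illustration

# Draw N ~ Poi(lambda), then X | N ~ Bin(N, theta)
n = rng.poisson(lam, size=500_000)
x = rng.binomial(n, theta)

# Compare the empirical pmf of X with Poi(lambda * theta)
mu = lam * theta
for i in range(6):
    pmf = math.exp(-mu) * mu**i / math.factorial(i)
    print(f"P(X={i}): empirical={np.mean(x == i):.4f}, Poi(mu)={pmf:.4f}")
```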

Home exercise

Read and understand the section on Markov chains in Chapter 5 of the book.

Conditional expectations - Chapter 5.2

For the discrete case,

$$E(X \mid Y=y) = \sum_{x} x\,P(X=x \mid Y=y).$$

Super important theorem (definition)

Let $X$ and $Y$ be given. Let

$$g(x) = E(Y \mid X=x), \qquad v(x) = \mathrm{Var}(Y \mid X=x).$$

Theorem: $E(XY) = E(X\,g(X))$.

(The idea is the "tower property": $E(E(Z \mid X)) = E(Z)$. "Given $X$" means that $X$ is treated as not random.)

Proof: use the tower property to prove the theorem:

$$E(XY) = E\big(E(XY \mid X)\big) = E\big(X\,E(Y \mid X)\big) = E(X\,g(X)),$$

where the middle step pulls $X$ out of the conditional expectation because, given $X$, it is not random.

Notice: in particular, $E(g(X)) = E(E(Y \mid X)) = E(Y)$.

Now recall $v(x)$. It turns out that

$$\mathrm{Var}(Y) = E(v(X)) + \mathrm{Var}(g(X)).$$

Explanation: the unconditional variance of $Y$ is the expected conditional variance plus the variance of the conditional expectation.
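Here is a minimal numerical sketch of this decomposition (my own toy model, not from the lecture): take $X\sim U(0,1)$ and $Y \mid X \sim \mathcal{N}(X, 1)$, so $g(x)=x$, $v(x)=1$, and the formula predicts $\mathrm{Var}(Y) = 1 + 1/12$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# Toy model (assumed): X ~ U(0,1), Y | X ~ Normal(X, 1).
# Then g(x) = E(Y|X=x) = x and v(x) = Var(Y|X=x) = 1.
x = rng.uniform(0, 1, size=n)
y = rng.normal(loc=x, scale=1.0)

# Tower property: E(Y) = E(g(X)) = E(X) = 0.5
print("E(Y) empirical:", y.mean())

# Total variance: Var(Y) = E(v(X)) + Var(g(X)) = 1 + 1/12
print("Var(Y) empirical:", y.var())
print("E(v(X)) + Var(g(X)):", 1 + 1/12)
```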

Example 2

Let $X \mid N \sim \mathrm{Bin}(N,\theta)$ and $N\sim \mathrm{Poi}(\lambda)$.

(We found in Example 1 that $X\sim \mathrm{Poi}(\lambda\theta)$.)

From the theorem, ignoring the fact above, we apply the tower property:

$$E(X) = E(E(X \mid N)) = E(\theta N) = \theta\,E(N) = \lambda\theta.$$

If we give some information and then take it away (by averaging over it), it is equivalent to not giving the information in the first place.

We verify the theorem above: $E(X) = \lambda\theta$ is exactly the mean of $\mathrm{Poi}(\lambda\theta)$.

For the variance, apply the total-variance formula:

$$\mathrm{Var}(X) = E(\mathrm{Var}(X \mid N)) + \mathrm{Var}(E(X \mid N)) = E(N\theta(1-\theta)) + \mathrm{Var}(\theta N) = \lambda\theta(1-\theta) + \theta^2\lambda = \lambda\theta.$$

This is also correct: it equals the variance of $\mathrm{Poi}(\lambda\theta)$.


