STT 861 Theory of Probability and Statistics I Lecture Note - 10
2017-11-08
Solving a problem from the midterm exam; the conditional distribution: definition, an example, and its expectation and variance.
Portal to all the other notes
- Lecture 01 - 2017.09.06
- Lecture 02 - 2017.09.13
- Lecture 03 - 2017.09.20
- Lecture 04 - 2017.09.27
- Lecture 05 - 2017.10.04
- Lecture 06 - 2017.10.11
- Lecture 07 - 2017.10.18
- Lecture 08 - 2017.10.25
- Lecture 09 - 2017.11.01
- Lecture 10 - 2017.11.08 -> This post
- Lecture 11 - 2017.11.15
- Lecture 12 - 2017.11.20
- Lecture 13 - 2017.11.29
- Lecture 14 - 2017.12.06
Lecture 10 - Nov 08 2017
Exam question
Let $X$ be a continuous random variable with a strictly increasing CDF $F$.
Q: Let $Y = F(X)$; find law$(Y)$.
A: Let $F$ be the CDF of $X$. Let $y \in (0, 1)$ and compute $P(Y \le y)$.
It turns out $Y$ is uniform in $(0, 1)$.
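A sketch of the computation behind this answer (the probability integral transform), using that $F$ is continuous and strictly increasing so that $F^{-1}$ exists:

```latex
% For y in (0,1), the event {F(X) <= y} equals {X <= F^{-1}(y)}
% because F is strictly increasing.
\[
\begin{aligned}
F_Y(y) = P(Y \le y) &= P\bigl(F(X) \le y\bigr) \\
                    &= P\bigl(X \le F^{-1}(y)\bigr) \\
                    &= F\bigl(F^{-1}(y)\bigr) = y,
\end{aligned}
\]
```

which is exactly the CDF of the Uniform$(0, 1)$ distribution.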
Conditional distribution - Chapter 5
Recall that the joint density of a vector $(X, Y)$ is $f(x, y)$, where $x$ and $y$ are real numbers.
Definition (Theorem): The conditional law of $Y$ given $X = x$ has density $f_{Y \mid X = x}(y) = \dfrac{f(x, y)}{f_X(x)}$, where $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$ is the marginal density of $X$ (for $x$ with $f_X(x) > 0$).
Consequently, the CDF of $Y$ given $X = x$ is $F_{Y \mid X = x}(y) = \int_{-\infty}^{y} f_{Y \mid X = x}(t)\, dt$.
The discrete case uses the same idea.
For the conditional law $P(Y = y \mid X = x) = \dfrac{P(X = x, Y = y)}{P(X = x)}$, there is no change in notation.
For the conditional CDF, use sums instead of integrals.
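As a small illustration (a hypothetical joint pmf, not from the lecture), suppose $X$ and $Y$ each take values in $\{0, 1\}$ with $P(X=0, Y=0) = 0.1$, $P(X=0, Y=1) = 0.3$, $P(X=1, Y=0) = 0.2$, $P(X=1, Y=1) = 0.4$. Then:

```latex
% Marginal of X at x = 1, then the conditional pmf of Y given X = 1.
\[
\begin{aligned}
P(X = 1) &= 0.2 + 0.4 = 0.6, \\
P(Y = 1 \mid X = 1) &= \frac{P(X = 1, Y = 1)}{P(X = 1)} = \frac{0.4}{0.6} = \frac{2}{3}, \qquad
P(Y = 0 \mid X = 1) = \frac{0.2}{0.6} = \frac{1}{3}.
\end{aligned}
\]
```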
Example 1
Let $X \sim \mathrm{Poisson}(\lambda)$ and $(Y \mid X = x) \sim \mathrm{Binomial}(x, p)$; here we seek law$(Y)$.
This type of example is called a mixture. A mixture is when the parameter of one random variable is determined by the value of another random variable.
In our example, we would like to find law$(Y)$ unconditional on $X$. We want to find $P(Y = k)$ for any integer $k \ge 0$.
It is the Poisson distribution with parameter $\lambda p$.
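A sketch of the computation, conditioning on the value of $X$ and using the law of total probability (under the Poisson-Binomial setup above):

```latex
% Sum over all values n >= k that X can take.
\[
\begin{aligned}
P(Y = k) &= \sum_{n \ge k} P(Y = k \mid X = n)\, P(X = n)
          = \sum_{n \ge k} \binom{n}{k} p^k (1-p)^{n-k}\, e^{-\lambda} \frac{\lambda^n}{n!} \\
         &= e^{-\lambda} \frac{(\lambda p)^k}{k!} \sum_{m \ge 0} \frac{\bigl(\lambda (1-p)\bigr)^m}{m!}
          = e^{-\lambda} \frac{(\lambda p)^k}{k!}\, e^{\lambda (1-p)}
          = e^{-\lambda p} \frac{(\lambda p)^k}{k!},
\end{aligned}
\]
```

which is the pmf of Poisson$(\lambda p)$.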
Home exercise
Read and understand the section of the book in Chapter 5 on Markov Chains.
Conditional expectations - Chapter 5.2
For the discrete case, $E(Y \mid X = x) = \sum_{y} y\, P(Y = y \mid X = x)$.
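For instance, with the mixture from Example 1, where $(Y \mid X = x) \sim \mathrm{Binomial}(x, p)$, the conditional expectation is just the binomial mean:

```latex
\[
E(Y \mid X = x) = \sum_{k = 0}^{x} k \binom{x}{k} p^k (1-p)^{x-k} = x p .
\]
```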
Super important theorem (definition)
Let $X$ and $Y$ be given. Let $g(x) = E(Y \mid X = x)$, and define $E(Y \mid X) = g(X)$.
Theorem: $E(E(Y \mid X)) = E(Y)$.
(This is the idea of the “tower property”: first take the expectation given $X$, then average over $X$. “Given $X$” means that $X$ is not random.)
Proof: use the law of total probability, $P(Y = y) = \sum_x P(Y = y \mid X = x)\, P(X = x)$, to prove the theorem.
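A sketch of the computation in the discrete case:

```latex
\[
\begin{aligned}
E\bigl(E(Y \mid X)\bigr) = E(g(X)) &= \sum_x g(x)\, P(X = x)
  = \sum_x \Bigl( \sum_y y\, P(Y = y \mid X = x) \Bigr) P(X = x) \\
  &= \sum_y y \sum_x P(Y = y \mid X = x)\, P(X = x)
  = \sum_y y\, P(Y = y) = E(Y).
\end{aligned}
\]
```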
Notice:
- $E(Y \mid X) = g(X)$: this is the function $g(x)$ with $x$ replaced by $X$.
- $E(Y \mid X)$ is therefore a random variable. If we put in another function, e.g. $Y^2$, then $E(Y^2 \mid X)$ is again a random variable (a function of $X$).
Now recall $\mathrm{Var}(Y \mid X) = E(Y^2 \mid X) - \bigl(E(Y \mid X)\bigr)^2$. It turns out that $\mathrm{Var}(Y) = E\bigl(\mathrm{Var}(Y \mid X)\bigr) + \mathrm{Var}\bigl(E(Y \mid X)\bigr)$.
Explanation: The unconditional variance of $Y$ is the expected conditional variance plus the variance of the conditional expectation.
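A sketch of why this variance decomposition holds, applying the tower property to $Y^2$ and to $Y$:

```latex
\[
\begin{aligned}
\mathrm{Var}(Y) &= E(Y^2) - \bigl(E(Y)\bigr)^2
  = E\bigl(E(Y^2 \mid X)\bigr) - \bigl(E(E(Y \mid X))\bigr)^2 \\
  &= E\Bigl(\mathrm{Var}(Y \mid X) + \bigl(E(Y \mid X)\bigr)^2\Bigr) - \bigl(E(E(Y \mid X))\bigr)^2 \\
  &= E\bigl(\mathrm{Var}(Y \mid X)\bigr)
   + \Bigl[\, E\Bigl(\bigl(E(Y \mid X)\bigr)^2\Bigr) - \bigl(E(E(Y \mid X))\bigr)^2 \Bigr]
  = E\bigl(\mathrm{Var}(Y \mid X)\bigr) + \mathrm{Var}\bigl(E(Y \mid X)\bigr).
\end{aligned}
\]
```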
Example 2
Let $X \sim \mathrm{Poisson}(\lambda)$ and $(Y \mid X = x) \sim \mathrm{Binomial}(x, p)$, as in Example 1.
(We found that law$(Y) = \mathrm{Poisson}(\lambda p)$.)
From the theorem, ignoring the fact above, we apply $E(Y) = E\bigl(E(Y \mid X)\bigr) = E(Xp) = p\, E(X) = \lambda p$.
If we give some information and then take the information away, it is equivalent to not giving the information in the first place.
We have verified the theorem above: both computations give $E(Y) = \lambda p$.
For the variance: $\mathrm{Var}(Y) = E\bigl(\mathrm{Var}(Y \mid X)\bigr) + \mathrm{Var}\bigl(E(Y \mid X)\bigr) = E\bigl(Xp(1-p)\bigr) + \mathrm{Var}(Xp) = \lambda p (1 - p) + p^2 \lambda = \lambda p$.
This is also correct: it matches the variance $\lambda p$ of the $\mathrm{Poisson}(\lambda p)$ distribution.
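As a quick numerical sanity check (not from the lecture), here is a short simulation sketch of the mixture above, with illustrative values $\lambda = 4$ and $p = 0.3$ chosen only for the demo; the sample mean and sample variance of $Y$ should both be close to $\lambda p = 1.2$.

```python
import numpy as np

# Illustrative parameter values (assumptions for this demo, not from the lecture).
lam, p = 4.0, 0.3
n_samples = 1_000_000

rng = np.random.default_rng(seed=0)

# Step 1: draw X ~ Poisson(lambda).
x = rng.poisson(lam, size=n_samples)

# Step 2: given X = x, draw Y ~ Binomial(x, p) -- the mixture.
y = rng.binomial(x, p)

# The theorems predict E(Y) = lambda * p and Var(Y) = lambda * p,
# since law(Y) = Poisson(lambda * p).
print("sample mean of Y:    ", y.mean())  # close to 1.2
print("sample variance of Y:", y.var())   # close to 1.2
```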