STT 861 Theory of Probability and Statistics I Lecture Note - 8
2017-10-25
Proof that the sample variance is biased; Example 3.5.3 in the textbook; normal distribution, joint normal distribution (multivariate), Gamma distribution.
Portal to all the other notes
- Lecture 01 - 2017.09.06
- Lecture 02 - 2017.09.13
- Lecture 03 - 2017.09.20
- Lecture 04 - 2017.09.27
- Lecture 05 - 2017.10.04
- Lecture 06 - 2017.10.11
- Lecture 07 - 2017.10.18
- Lecture 08 - 2017.10.25 -> This post
- Lecture 09 - 2017.11.01
- Lecture 10 - 2017.11.08
- Lecture 11 - 2017.11.15
- Lecture 12 - 2017.11.20
- Lecture 13 - 2017.11.29
- Lecture 14 - 2017.12.06
Lecture 08 - Oct 25 2017
For video record
(This part is similar to the previous note. We recorded a video for it, so the professor talked about it once again.)
Basically it is the proof that the sample variance is biased.
Recall the sample variance
$$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^n (x_i - \bar{x})^2, \qquad \bar{x} = \frac{1}{n}\sum_{i=1}^n x_i.$$
Note that $\bar{x}$ is the expectation of a r.v. $Y$ which takes the value $x_i$ with probability $= \frac{1}{n}$.
Therefore $E(Y) = \bar{x}$ and $\mathrm{Var}(Y) = \hat{\sigma}^2$.
Now recall the formula:
$$\mathrm{Var}(Y) = E(Y^2) - \left(E(Y)\right)^2.$$
Use this with the $Y$ defined above:
$$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^n x_i^2 - \bar{x}^2.$$
Now the question is: what is the bias of $\hat{\sigma}^2$? For this we replace each $x_i$ by $X_i$, where the $X_i$'s are i.i.d. with mean $\mu$ and variance $\sigma^2$. The resulting expression is
$$\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^n X_i^2 - \bar{X}^2.$$
We are interested in $E(\hat{\sigma}^2)$. We want to know if this $= \sigma^2$ or not.
Now use the formula above with $Y = X_i$ and $Y = \bar{X}$, and take $E$ on both sides:
$$E(X_i^2) = \sigma^2 + \mu^2, \qquad E(\bar{X}^2) = \mathrm{Var}(\bar{X}) + \mu^2 = \frac{\sigma^2}{n} + \mu^2,$$
so
$$E(\hat{\sigma}^2) = \sigma^2 + \mu^2 - \frac{\sigma^2}{n} - \mu^2 = \frac{n-1}{n}\sigma^2.$$
We proved that $E(\hat{\sigma}^2) = \frac{n-1}{n}\sigma^2 \neq \sigma^2$. It is biased, with bias $-\frac{\sigma^2}{n}$.
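The bias can be checked numerically. Here is a small Monte Carlo sketch (my addition, not from the lecture): with $n = 5$ samples from $N(0, 1)$, the average of $\hat{\sigma}^2$ over many trials should be close to $\frac{n-1}{n}\sigma^2 = 0.8$, not $1$.

```python
import random
import statistics

# Monte Carlo check of the bias: for n = 5 draws from N(0, 1),
# theory says E[sigma_hat^2] = (n-1)/n * sigma^2 = 0.8.
random.seed(861)
n, trials = 5, 200_000

def sigma_hat_sq(xs):
    """Biased sample variance: (1/n) * sum of (x_i - x_bar)^2."""
    xbar = sum(xs) / len(xs)
    return sum((x - xbar) ** 2 for x in xs) / len(xs)

avg = statistics.fmean(
    sigma_hat_sq([random.gauss(0, 1) for _ in range(n)]) for _ in range(trials)
)
print(round(avg, 2))  # close to 0.8, not 1
```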
Example 3.5.3, Page 100
$(X, Y)$ has density $f(x, y) = 2$ if $0 < x < y < 1$, and $0$ otherwise.
First compute the joint CDF of $(X, Y)$.
General definition:
$$F(x, y) = P(X \le x, Y \le y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(u, v)\, dv\, du.$$
If $0 \le x \le y \le 1$, we integrate between $0$ and $x$ with respect to $u$:
$$F(x, y) = \int_0^x \left( \int_u^y 2\, dv \right) du.$$
Now we integrate between $u$ and $y$ with respect to $v$:
$$F(x, y) = \int_0^x 2(y - u)\, du = 2xy - x^2.$$
So we have proved that when $0 \le x \le y \le 1$, $F(x, y) = 2xy - x^2$.
If $x > y$, we still compute the same integral, but now $u$ only ranges from $0$ to $y$:
$$F(x, y) = \int_0^y 2(y - u)\, du = y^2.$$
($F(x, y)$ no longer depends on $x$ here.)
This proves $F(x, y) = y^2$ when $0 \le y \le x$.
Compute the marginal densities of $X$ and $Y$.
In general, the marginal density is $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$, where $x$ is fixed. Similarly, $f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx$, where $y$ is fixed.
If $0 \le x \le 1$,
$$f_X(x) = \int_x^1 2\, dy = 2(1 - x).$$
If $0 \le y \le 1$, then $f_Y(y) = \int_0^y 2\, dx = 2y$.
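As a quick sanity check on these marginals (my addition, not in the lecture), we can sample from the density by rejection, keeping uniform points of the unit square with $x < y$, and compare the empirical means against $E(X) = \int_0^1 x \cdot 2(1 - x)\, dx = \frac{1}{3}$ and $E(Y) = \int_0^1 y \cdot 2y\, dy = \frac{2}{3}$.

```python
import random

# Sample (X, Y) uniformly from the triangle 0 < x < y < 1 (where the
# joint density equals 2) by rejection from the unit square, then compare
# sample means against E[X] = 1/3 and E[Y] = 2/3 from the marginals.
random.seed(0)
pts = []
while len(pts) < 100_000:
    x, y = random.random(), random.random()
    if x < y:                      # keep only points inside the triangle
        pts.append((x, y))

ex = sum(p[0] for p in pts) / len(pts)
ey = sum(p[1] for p in pts) / len(pts)
print(round(ex, 2), round(ey, 2))  # near 1/3 and 2/3
```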
Special continuous distributions - Chapter 4
Definition: $Z$ is standard normal, with parameters $0$ and $1$, if it has this density:
$$\varphi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}.$$
Facts:
- $E(Z) = 0$, because the density is symmetric about $0$;
- $\mathrm{Var}(Z) = 1$ (prove this using one integration by parts)
Notation: $Z \sim N(0, 1)$.
Definition: $X$ is normal with parameters $\mu$ and $\sigma^2$ if it has the density
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{-(x - \mu)^2 / (2\sigma^2)}.$$
Facts: $E(X) = \mu$, $\mathrm{Var}(X) = \sigma^2$.
Notation: $X \sim N(\mu, \sigma^2)$.
Proof: comes from the $N(0, 1)$ case by using the change of variables $X = \mu + \sigma Z$.
Fact: let $X \sim N(\mu, \sigma^2)$, let $a$ and $b$ be fixed. Let $Y = aX + b$. Then $Y \sim N(a\mu + b, a^2\sigma^2)$.
Fact: let $X$ and $Y$ be two independent normal r.v.'s, respectively $X \sim N(\mu_X, \sigma_X^2)$ and $Y \sim N(\mu_Y, \sigma_Y^2)$; then $X + Y \sim N(\mu_X + \mu_Y, \sigma_X^2 + \sigma_Y^2)$.
Moral of the story: the class of normal r.v.'s is stable under linear combinations.
Example 1
Let $X_1 \sim N(\mu_1, \sigma_1^2)$, $X_2 \sim N(\mu_2, \sigma_2^2)$, $X_3 \sim N(\mu_3, \sigma_3^2)$. Assume they are independent. Let $W = aX_1 + bX_2 + cX_3$. Find $E(W)$, $\mathrm{Var}(W)$.
A: By the above moral, $W$ should be normal.
$E(W) = a\mu_1 + b\mu_2 + c\mu_3$, $\mathrm{Var}(W) = a^2\sigma_1^2 + b^2\sigma_2^2 + c^2\sigma_3^2$.
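A quick simulation illustrating the moral, with made-up parameters (these are my numbers, not the lecture's): take independent $X_1 \sim N(1, 4)$ and $X_2 \sim N(-2, 9)$, and let $W = 3X_1 - 2X_2 + 5$. The facts above predict $W \sim N(3 \cdot 1 - 2 \cdot (-2) + 5,\; 9 \cdot 4 + 4 \cdot 9) = N(12, 72)$.

```python
import random
import statistics

# Simulate W = 3*X1 - 2*X2 + 5 with X1 ~ N(1, 4), X2 ~ N(-2, 9) independent;
# stability under linear combinations predicts W ~ N(12, 72).
# Note: random.gauss takes the standard deviation, not the variance.
random.seed(1)
ws = [3 * random.gauss(1, 2) - 2 * random.gauss(-2, 3) + 5 for _ in range(200_000)]
print(round(statistics.fmean(ws), 1), round(statistics.pvariance(ws), 0))
# mean near 12, variance near 72
```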
Example 2
Let $X \sim N(\mu, \sigma^2)$, let
$$Z = \frac{X - \mu}{\sigma},$$
so $E(Z) = 0$, $\mathrm{Var}(Z) = 1$, and $Z \sim N(0, 1)$, because $Z$ is a linear transformation of $X$.
Joint normal distribution (multivariate normal)
Definition: The vector $X = (X_1, \ldots, X_n)$ is normal with mean $\mu$ and covariance matrix $\Sigma$ (here $\mu \in \mathbb{R}^n$, $\Sigma \in \mathbb{R}^{n \times n}$ is symmetric and positive definite, and sometimes people use the letter $C$ or $K$ instead of $\Sigma$), if its joint density is
$$f(x) = \frac{1}{(2\pi)^{n/2} \sqrt{\det \Sigma}} \exp\left( -\frac{1}{2} (x - \mu)^T \Sigma^{-1} (x - \mu) \right).$$
(Notice: $x$ and $\mu$ are column vectors.)
The stuff in $\exp$ is $-\frac{1}{2} (x - \mu)^T \Sigma^{-1} (x - \mu)$.
Simple case: $n = 2$ with $\Sigma$ diagonal,
$$\Sigma = \begin{pmatrix} \sigma_1^2 & 0 \\ 0 & \sigma_2^2 \end{pmatrix}, \quad \text{so} \quad \Sigma^{-1} = \begin{pmatrix} 1/\sigma_1^2 & 0 \\ 0 & 1/\sigma_2^2 \end{pmatrix} \quad \text{and} \quad \det \Sigma = \sigma_1^2 \sigma_2^2.$$
So
$$f(x_1, x_2) = \frac{1}{2\pi \sigma_1 \sigma_2} \exp\left( -\frac{(x_1 - \mu_1)^2}{2\sigma_1^2} - \frac{(x_2 - \mu_2)^2}{2\sigma_2^2} \right).$$
This matches $f_{X_1}(x_1) f_{X_2}(x_2)$, the joint density when $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$ are independent.
In general, if $X_1, \ldots, X_n$ are independent normals, $X_i \sim N(\mu_i, \sigma_i^2)$, then $\Sigma = \mathrm{diag}(\sigma_1^2, \ldots, \sigma_n^2)$.
We recognize that if $(X, Y)$ is bivariate normal and $\mathrm{Cov}(X, Y) = 0$, then $X$ and $Y$ are independent.
Normally, one cannot infer from $\mathrm{Cov}(X, Y) = 0$ that $X$ and $Y$ are independent. But it does follow when $(X, Y)$ is bivariate normal.
Question: let's create a multivariate normal vector $X$ with covariance matrix $\Sigma$, starting from a vector $Z = (Z_1, \ldots, Z_n)$ where all the $Z_i$'s are i.i.d. "standard normals".
(It's easy to use the Box-Muller transformation to generate such $Z_i$'s.)
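For reference, here is a minimal sketch of the Box-Muller transformation (my addition): two independent Uniform$(0, 1)$ draws become two independent $N(0, 1)$ draws.

```python
import math
import random
import statistics

def box_muller(u1, u2):
    """Map two independent Uniform(0,1) draws to two independent N(0,1) draws."""
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

random.seed(2)
zs = []
for _ in range(100_000):
    # 1 - random() keeps u1 strictly positive, so log(u1) is safe
    z1, z2 = box_muller(1.0 - random.random(), random.random())
    zs.append(z1)
    zs.append(z2)
print(round(statistics.fmean(zs), 2), round(statistics.pvariance(zs), 2))
# mean near 0, variance near 1
```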
How to create such an $X$ from this $Z$?
Answer: Recall the notion of the square root of a matrix ($\Sigma$ is positive definite).
(Linear algebra: $\Sigma^{1/2} \Sigma^{1/2} = \Sigma$.)
There exists a matrix $\Sigma^{1/2}$ which is the "square root of $\Sigma$".
Consider
$$X = \mu + \Sigma^{1/2} Z.$$
We know (by stability under linear combinations) that $X$ is multivariate normal.
Note $E(X) = \mu$.
What about the covariance matrix of $X$?
$$\mathrm{Cov}(X) = E\left[ (X - \mu)(X - \mu)^T \right] = \Sigma^{1/2}\, E[Z Z^T]\, (\Sigma^{1/2})^T = \Sigma^{1/2}\, I\, (\Sigma^{1/2})^T = \Sigma.$$
So we see that $X \sim N(\mu, \Sigma)$.
Exercise: For $n = 2$, find a square root for the matrix $\Sigma$. There should be a place in the book where the author does that in hiding for the purpose of creating a bivariate normal.
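For the $n = 2$ exercise, one concrete square root is the lower-triangular Cholesky factor $L$ with $L L^T = \Sigma$. A sketch with made-up numbers (my construction, not the book's):

```python
import math
import random
import statistics

def chol2(s11, s12, s22):
    """Cholesky factor of the 2x2 covariance [[s11, s12], [s12, s22]]:
    returns (l11, l21, l22) with L = [[l11, 0], [l21, l22]] and L L^T = Sigma."""
    l11 = math.sqrt(s11)
    l21 = s12 / l11
    l22 = math.sqrt(s22 - l21 ** 2)
    return l11, l21, l22

random.seed(3)
s11, s12, s22 = 4.0, 1.2, 1.0       # a positive-definite Sigma
l11, l21, l22 = chol2(s11, s12, s22)
xs, ys = [], []
for _ in range(200_000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(l11 * z1)              # X1 = l11 * Z1
    ys.append(l21 * z1 + l22 * z2)   # X2 = l21 * Z1 + l22 * Z2
# means are 0 here, so the empirical E[X1 * X2] estimates Cov(X1, X2)
cov = statistics.fmean(x * y for x, y in zip(xs, ys))
print(round(cov, 1))  # near s12 = 1.2
```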
Gamma distribution - Chapter 4.3
Definition: A r.v. $X$ is Gamma with shape parameter $\alpha > 0$ and rate parameter $\lambda > 0$ if it has this density:
$$f(x) = \frac{\lambda^\alpha}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\lambda x}, \quad x > 0,$$
where the constant $\Gamma(\alpha) = \int_0^\infty x^{\alpha - 1} e^{-x}\, dx$; if $\alpha = n$ is an integer, $\Gamma(n) = (n - 1)!$.
Notation: $X \sim \Gamma(\alpha, \lambda)$.
Facts:
- when $\alpha = 1$: $\Gamma(1, \lambda) = \mathrm{Exp}(\lambda)$.
- Consider a sequence $X_1, \ldots, X_n$, which are $\Gamma(\alpha_i, \lambda)$. Assume they are independent. Then, with $S_n = X_1 + \cdots + X_n$, we get $S_n \sim \Gamma(\alpha_1 + \cdots + \alpha_n, \lambda)$.
  Consequently, applying the above with each $\alpha_i = 1$, the sum of $n$ i.i.d. $\mathrm{Exp}(\lambda)$ r.v.'s is a $\Gamma(n, \lambda)$ r.v.
  Specifically, this proves exactly that the "$n$th arrival time" of a Poisson process with intensity $\lambda$ is a $\Gamma(n, \lambda)$ r.v.
- $E(X) = \frac{\alpha}{\lambda}$, because
  $$E(X) = \int_0^\infty x \cdot \frac{\lambda^\alpha}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\lambda x}\, dx = \frac{\Gamma(\alpha + 1)}{\lambda\, \Gamma(\alpha)} = \frac{\alpha}{\lambda}.$$
- When $\alpha$ is not an integer, $\Gamma(\alpha)$ is still well defined; for example $\Gamma(\frac{1}{2}) = \sqrt{\pi}$. Let $Z \sim N(0, 1)$, $\alpha = \frac{1}{2}$, $\lambda = \frac{1}{2}$: then $Z^2 \sim \Gamma(\frac{1}{2}, \frac{1}{2})$. (Exercise: prove this at home.)
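Two of the facts above are easy to check numerically (my addition): Python's `math.gamma` confirms $\Gamma(\frac{1}{2}) = \sqrt{\pi}$, and a sum of $n$ i.i.d. $\mathrm{Exp}(\lambda)$ draws has mean close to $\frac{n}{\lambda}$, consistent with $\Gamma(n, \lambda)$.

```python
import math
import random
import statistics

# Check Gamma(1/2) = sqrt(pi) using the stdlib gamma function.
print(math.isclose(math.gamma(0.5), math.sqrt(math.pi)))  # True

# Sum of n i.i.d. Exp(lam) draws is Gamma(n, lam); its mean is n / lam.
random.seed(4)
n, lam = 3, 2.0
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(100_000)]
print(round(statistics.fmean(sums), 2))  # near n / lam = 1.5
```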