STT 861 Theory of Prob and STT I Lecture Note - 6
2017-10-11
Hypergeometric distribution; Poisson Law; Brownian motion; continuous random variables, exponential distribution, cumulative distribution function, uniform distribution; expectation and variance of continuous random variables.
Portal to all the other notes
- Lecture 01 - 2017.09.06
- Lecture 02 - 2017.09.13
- Lecture 03 - 2017.09.20
- Lecture 04 - 2017.09.27
- Lecture 05 - 2017.10.04
- Lecture 06 - 2017.10.11 -> This post
- Lecture 07 - 2017.10.18
- Lecture 08 - 2017.10.25
- Lecture 09 - 2017.11.01
- Lecture 10 - 2017.11.08
- Lecture 11 - 2017.11.15
- Lecture 12 - 2017.11.20
- Lecture 13 - 2017.11.29
- Lecture 14 - 2017.12.06
Lecture 06 - Oct 11 2017
Hypergeometric distribution (continued)
(See Page 66)
A hypergeometric r.v. $X$ with parameters $(N, m, n)$:
Have a set of size $N$ and a subset of size $m$ ('red'). Pick a sample of size $n$ without replacement. Then $X$ is the number of elements of type ('red') in the sample.
Then for $\max(0, n-(N-m)) \le k \le \min(m, n)$,
$$P(X = k) = \frac{\binom{m}{k}\binom{N-m}{n-k}}{\binom{N}{n}}.$$
Important: let $p = \frac{m}{N}$. Then $E(X) = np$ and
$$\mathrm{Var}(X) = np(1-p)\,\frac{N-n}{N-1}.$$
The correction factor $\frac{N-n}{N-1}$ is called the finite sample correction factor.
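Here is a quick numerical sketch (not from the lecture) checking these mean/variance formulas; the values $N = 20$, $m = 7$, $n = 5$ are illustrative, and note that scipy's `hypergeom` takes its parameters in a different order (population size, number of successes, sample size).

```python
# Sketch: check the hypergeometric mean/variance formulas numerically.
# Illustrative values: N = 20 total, m = 7 red, sample size n = 5.
from scipy.stats import hypergeom

N, m, n = 20, 7, 5
p = m / N

rv = hypergeom(N, m, n)   # scipy order: (population size, #successes, sample size)

mean_formula = n * p
var_formula = n * p * (1 - p) * (N - n) / (N - 1)   # finite sample correction

print(rv.mean(), mean_formula)   # both 1.75
print(rv.var(), var_formula)     # both ~0.899
```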
Poisson Law
Let $X$ and $Y$ be Poisson($\lambda$) and Poisson($\mu$), and independent. Then $X + Y$ is Poisson($\lambda + \mu$).
(Think of it as arrivals in a store on 2 different days.)
Therefore, the number of arrivals over 2 days is Poisson($2\lambda$). By the same construction, the number of arrivals $N(t)$ up to time $t$ is Poisson($\lambda t$).
Now as $t$ runs from 0 to 1, we have a collection of r.v.'s $\{N(t)\}$. This is the Poisson process with parameter $\lambda$.
These are the properties of this process. [A stochastic process, like $\{N(t)\}$, is a random function where the rules of randomness are specified.]
The word stochastic comes from the Greek word “stochos” (στόχος). It means “target”.
It comes from the story of Aristotle and the target distribution.
Let $0 \le s < t$. Then:
1. $N(t) - N(s)$ is Poisson($\lambda(t-s)$);
2. $N(t) - N(s)$ and $N(s)$ are independent.
Vocabulary: (2) is called independence of increments. And (1) is usually called stationarity.
Remark: $N(t) + N(s)$ does not equal $N(t+s)$.
What is $N(t) + N(s)$?
$N(t) + N(s)$ is not Poisson($\lambda(t+s)$): the two have different variances.
Remark: (1) and (2) define the prob distribution of the process uniquely.
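Here is a small simulation sketch (not from the lecture) of properties (1) and (2); the rate $\lambda = 3$ and the times $s$, $t$ are illustrative choices.

```python
# Sketch: simulate a Poisson process and check stationarity (1) and
# independence of increments (2) empirically. Values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
lam, reps = 3.0, 50_000
s, t = 0.5, 1.5

# arrival times: cumulative sums of i.i.d. Exponential(1/lam) inter-arrival times
arrivals = np.cumsum(rng.exponential(1 / lam, size=(reps, 60)), axis=1)

N_s = (arrivals <= s).sum(axis=1)            # N(s)
incr = (arrivals <= t).sum(axis=1) - N_s     # N(t) - N(s)

# (1) stationarity: N(t) - N(s) ~ Poisson(lam*(t - s)), so mean and variance ~ 3
print(incr.mean(), incr.var(), lam * (t - s))
# (2) independent increments: correlation of N(s) and N(t) - N(s) should be ~ 0
print(np.corrcoef(N_s, incr)[0, 1])
```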
Example 2.5.2 (Page 77)
Murder rate $\lambda$ per day. Assume the number of murders in the interval $[s, t]$ is $N(t) - N(s)$, where $N$ is a Poisson process with rate $\lambda$.
Q1: What is the prob of two or more murders within 1 day?
A1: $P(N(1) \ge 2) = 1 - P(N(1) = 0) - P(N(1) = 1) = 1 - e^{-\lambda} - \lambda e^{-\lambda}$.
Q2: What is the prob of “2 or more murders for each of 3 consecutive days”?
A2: By independence of increments, the three days are independent, so the answer is $\left(1 - e^{-\lambda} - \lambda e^{-\lambda}\right)^3$.
Q3: What is the prob of “no murders for 5 days”?
A3: $P(N(5) = 0) = e^{-5\lambda}$.
Q4: The rate during weekdays is 1.2/day, during weekends it is 2.5/day. What is the prob of “10 or more murders in 1 week”?
A4: The total number of murders in a week is Poisson with mean $5 \times 1.2 + 2 \times 2.5 = 11$, so the answer is $1 - \sum_{k=0}^{9} e^{-11}\frac{11^k}{k!}$.
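These probabilities are easy to evaluate with scipy (a sketch, not from the lecture). The weekday/weekend rates come from Q4; the daily rate `lam` used for Q1 to Q3 is an assumed value for illustration.

```python
# Sketch: evaluate the probabilities above numerically.
from scipy.stats import poisson

lam = 1.5  # assumed daily murder rate for Q1 to Q3 (illustrative)

p_q1 = 1 - poisson.cdf(1, lam)       # P(2 or more murders in one day)
p_q2 = p_q1 ** 3                     # independent increments over 3 days
p_q3 = poisson.pmf(0, 5 * lam)       # no murders in 5 days = exp(-5*lam)

mu_week = 5 * 1.2 + 2 * 2.5          # mean number of murders in one week = 11
p_q4 = 1 - poisson.cdf(9, mu_week)   # P(10 or more murders in one week)

print(p_q1, p_q2, p_q3, p_q4)
```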
HW discussion (Problem 2.4.4 (a))
$X \sim \mathrm{Binomial}(n, p)$, $Y \sim \mathrm{Binomial}(m, p)$, with $X$ and $Y$ independent. Find the distribution of $X + Y$.
We know that $X = \sum_{i=1}^{n} B_i$, where the $B_i$'s are i.i.d. $\mathrm{Bernoulli}(p)$. Similarly we have $Y = \sum_{j=1}^{m} B'_j$, where the $B'_j$'s are i.i.d. $\mathrm{Bernoulli}(p)$.
Now, since $X$ and $Y$ are independent, all the $B_i$'s and $B'_j$'s are independent. Then $B_1, \dots, B_n, B'_1, \dots, B'_m$ are i.i.d. $\mathrm{Bernoulli}(p)$. Therefore, by definition, $X + Y$ is $\mathrm{Binomial}(n + m, p)$.
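A quick Monte Carlo check of this result (a sketch, not from the homework); the values of $n$, $m$, $p$ are illustrative.

```python
# Sketch: sum of independent Binomial(n, p) and Binomial(m, p) draws should
# behave like Binomial(n + m, p).
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(1)
n, m, p, reps = 6, 9, 0.3, 200_000

s = rng.binomial(n, p, reps) + rng.binomial(m, p, reps)

# compare empirical frequencies with the Binomial(n + m, p) pmf
for k in range(6):
    print(k, (s == k).mean(), binom.pmf(k, n + m, p))
```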
Preview to Chapter 3 and Brownian motion
Let $X_1, X_2, \dots$ be a sequence of i.i.d. r.v.'s with $E(X_i) = \mu$ and $\mathrm{Var}(X_i) = \sigma^2$.
Let $\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i$. Then $E(\bar{X}_n) = \mu$ and $\mathrm{Var}(\bar{X}_n) = \frac{\sigma^2}{n}$, and it turns out $\mathrm{Var}(\bar{X}_n)$ gets really small as $n \to \infty$.
So $\bar{X}_n$ concentrates around $\mu$ as $n \to \infty$.
In fact we have the weak law of large numbers:
$\forall \epsilon > 0$, $P\left(|\bar{X}_n - \mu| > \epsilon\right) \to 0$ as $n \to \infty$.
Proof: By Chebyshev's inequality,
$$P\left(|\bar{X}_n - \mu| > \epsilon\right) \le \frac{\mathrm{Var}(\bar{X}_n)}{\epsilon^2} = \frac{\sigma^2}{n\epsilon^2} \to 0 \quad \text{as } n \to \infty.$$
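Here is a small numerical sketch (not from the lecture) of the weak law and the Chebyshev bound; the Exponential(1) samples ($\mu = 1$, $\sigma^2 = 1$) and the values of $\epsilon$ and $n$ are illustrative choices.

```python
# Sketch: compare the empirical probability P(|Xbar_n - mu| > eps) with the
# Chebyshev bound sigma^2 / (n * eps^2) as n grows.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma2, eps, reps = 1.0, 1.0, 0.1, 1_000

for n in (100, 1_000, 10_000):
    xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
    empirical = np.mean(np.abs(xbar - mu) > eps)
    chebyshev = sigma2 / (n * eps ** 2)
    print(n, empirical, min(chebyshev, 1.0))
```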
What about dividing the sum by something much smaller than $n$?
Let $Z_n = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} (X_i - \mu)$. Then $E(Z_n) = 0$ and $\mathrm{Var}(Z_n) = \sigma^2$, even when $n$ is large.
So do we maybe have a limiting behavior? Yes. The distribution of $Z_n$ tends to the bell curve (the Normal distribution with variance $\sigma^2$).
Pick $t \in [0, 1]$. Roughly speaking, keep only the first $\lfloor nt \rfloor$ terms of the sum.
Let
$$Z_n(t) = \frac{1}{\sqrt{n}} \sum_{i=1}^{\lfloor nt \rfloor} (X_i - \mu);$$
we find $E(Z_n(t)) = 0$ and $\mathrm{Var}(Z_n(t)) \approx \sigma^2 t$.
The distribution of $Z_n(t)$ converges to a bell curve with variance $\sigma^2 t$.
The whole collection $\{Z_n(t) : 0 \le t \le 1\}$ converges to a stochastic process called Brownian motion, $\{B(t)\}$. It has these properties:
- $B(0) = 0$;
- $B(t) - B(s)$ has the same distribution as $B(t - s)$;
- $B(t) - B(s)$ and $B(s)$ are independent;
- $B(t)$ is Normal (bell curve) with mean $0$ and variance $\sigma^2 t$.
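Here is a simulation sketch (not from the lecture) of this construction: scaled partial sums of i.i.d. Bernoulli steps, checked against the Normal limit and the independence of increments. All numeric choices are illustrative.

```python
# Sketch: build Z_n(t) = (1/sqrt(n)) * sum_{i <= nt} (X_i - mu) from i.i.d.
# Bernoulli(0.5) steps and check the limiting behaviour at t = 1.
import numpy as np

rng = np.random.default_rng(3)
n, reps = 4_000, 2_000
p = 0.5
mu, sigma2 = p, p * (1 - p)                  # mean 0.5, variance 0.25

x = rng.binomial(1, p, size=(reps, n))
z = np.cumsum(x - mu, axis=1) / np.sqrt(n)   # row = one path t -> Z_n(t), t = k/n

# Z_n(1) should look Normal(0, sigma^2); the increment Z_n(1) - Z_n(1/2)
# should be roughly independent of Z_n(1/2)
print(z[:, -1].mean(), z[:, -1].var(), sigma2)
half = z[:, n // 2 - 1]
print(np.corrcoef(half, z[:, -1] - half)[0, 1])
```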
Continuous Random Variables (Chapter 3)
Definition: A r.v. $X$ is said to be continuous with density $f$ if, for all $a \le b$,
$$P(a \le X \le b) = \int_a^b f(x)\,dx.$$
Properties of densities: $f(x) \ge 0$ for all $x$, and $\int_{-\infty}^{+\infty} f(x)\,dx = 1$.
Example 1
Let $f(x) = c\,e^{-\lambda x}$ for $x \ge 0$ (and $f(x) = 0$ for $x < 0$), where $c$ and $\lambda$ are positive constants. How should we choose $c$ for $f$ to be a density?
Let's say $\int_0^{\infty} c\,e^{-\lambda x}\,dx = 1$. Do the integration: the integral equals $\frac{c}{\lambda}$, so $c = \lambda$.
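As a quick check of this normalization step, here is a small sympy sketch (not from the lecture); it uses the candidate density $c\,e^{-\lambda x}$ written above and solves for the constant that makes it integrate to 1.

```python
# Sketch: solve for the constant that makes c * exp(-lam * x) on [0, oo)
# integrate to 1.
import sympy as sp

x, c, lam = sp.symbols('x c lam', positive=True)
total = sp.integrate(c * sp.exp(-lam * x), (x, 0, sp.oo))   # = c / lam
print(sp.solve(sp.Eq(total, 1), c))                          # [lam]
```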
Exponential distribution
Definition: The r.v. $X$ whose density is
$$f(x) = \lambda e^{-\lambda x} \text{ for } x \ge 0, \qquad f(x) = 0 \text{ for } x < 0,$$
is called Exponential with parameter $\lambda$.
Notation: $X \sim \mathrm{Exp}(\lambda)$.
Cumulative distribution function (CDF)
Definition: The cumulative distribution function (CDF) of $X$ with density $f$ is defined as before:
$$F(x) = P(X \le x) = \int_{-\infty}^{x} f(t)\,dt.$$
- The fundamental theorem of calculus (Leibniz) says $F$ has a derivative, and it is $F'(x) = f(x)$.
- Also we see: $F(x) \to 0$ as $x \to -\infty$ and $F(x) \to 1$ as $x \to +\infty$.
- $F$ is non-decreasing.
Example 2
If $X \sim \mathrm{Exp}(\lambda)$, then we find $F(x) = 1 - e^{-\lambda x}$ for $x \ge 0$ (and $F(x) = 0$ for $x < 0$).
Note: the tail of $X$ is defined as $\bar{F}(x) = P(X > x) = 1 - F(x)$.
For the Exponential($\lambda$), $\bar{F}(x) = e^{-\lambda x}$.
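A quick numerical check (not from the lecture) of the Exponential CDF and tail formulas against direct integration of the density; $\lambda = 2$ and $x = 0.7$ are illustrative values.

```python
# Sketch: compare F(x) = 1 - exp(-lam*x) and the tail exp(-lam*x) with the
# numerical integral of the Exponential(lam) density.
import numpy as np
from scipy.integrate import quad

lam, x = 2.0, 0.7
f = lambda t: lam * np.exp(-lam * t)          # Exponential(lam) density

cdf_numeric, _ = quad(f, 0, x)
print(cdf_numeric, 1 - np.exp(-lam * x))      # CDF
print(1 - cdf_numeric, np.exp(-lam * x))      # tail P(X > x)
```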
Remark: If $X$ has density $f$ and $a$ is any single value, then $P(X = a) = 0$.
Uniform distribution
The r.v. $X$ is said to be uniform between 2 fixed values $a$ and $b$ ($a < b$) if its density is the constant $f(x) = \frac{1}{b-a}$ for $a \le x \le b$ (and $0$ elsewhere).
What about the CDF? It equals $0$ at $a$ and $1$ at $b$, and rises linearly in between: $F(x) = \frac{x-a}{b-a}$ for $a \le x \le b$.
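A tiny numerical sketch (not from the lecture) of this CDF; $a = 2$ and $b = 5$ are illustrative values.

```python
# Sketch: the Uniform(a, b) CDF rises linearly from 0 at a to 1 at b.
import numpy as np

a, b = 2.0, 5.0
xs = np.linspace(a, b, 7)
F = (xs - a) / (b - a)        # CDF on [a, b]; it is 0 below a and 1 above b
print(list(zip(xs.round(2), F.round(3))))   # starts at 0, ends at 1
```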
Expectation and variance of continuous random variable (Chapter 3.3)
Definition: let $X$ have density $f$; then its expectation is
$$E(X) = \int_{-\infty}^{+\infty} x\,f(x)\,dx.$$
Example 3
$X \sim \mathrm{Exp}(\lambda)$, $E(X) = \int_0^{\infty} x\,\lambda e^{-\lambda x}\,dx$.
Integration by parts: $\int u\,dv = uv - \int v\,du$.
Best choice: use a $dv$ whose antiderivative $v$ is known, and if $u$ is a polynomial that's good because $du$ has a degree one less than $u$. Choice: $u = x$, $dv = \lambda e^{-\lambda x}\,dx$, $du = dx$, $v = -e^{-\lambda x}$. The result is $E(X) = \frac{1}{\lambda}$.
Exercise: $X \sim \mathrm{Exp}(\lambda)$, find $E(X^2)$.
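Here is a small sympy sketch (not from the lecture) checking the integration-by-parts result $E(X) = 1/\lambda$ and also computing $E(X^2)$, which gives the variance.

```python
# Sketch: symbolic check of E(X) and E(X^2) for the Exponential(lam) density.
import sympy as sp

x, lam = sp.symbols('x lam', positive=True)
f = lam * sp.exp(-lam * x)

EX  = sp.integrate(x * f, (x, 0, sp.oo))        # 1/lam
EX2 = sp.integrate(x**2 * f, (x, 0, sp.oo))     # 2/lam**2
print(EX, EX2, sp.simplify(EX2 - EX**2))        # variance = 1/lam**2
```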
Definition: the variance of $X$ with density $f$ is defined the same as before:
$$\mathrm{Var}(X) = E\left[(X - \mu)^2\right] = \int_{-\infty}^{+\infty} (x - \mu)^2 f(x)\,dx,$$
where $\mu = E(X)$.
Exercise: Verify that for $X \sim \mathrm{Exp}(\lambda)$, $E(X) = \frac{1}{\lambda}$ and $\mathrm{Var}(X) = \frac{1}{\lambda^2}$.
Similarly for $X \sim \mathrm{Uniform}(a, b)$: $E(X) = \frac{a+b}{2}$ and $\mathrm{Var}(X) = \frac{(b-a)^2}{12}$.
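A Monte Carlo sketch (not from the lecture) checking these expectation and variance formulas; the values of $\lambda$, $a$, $b$, and the sample size are arbitrary.

```python
# Sketch: Monte Carlo check of the mean/variance formulas for Exponential(lam)
# and Uniform(a, b).
import numpy as np

rng = np.random.default_rng(4)
N = 1_000_000

lam = 2.0
e = rng.exponential(1 / lam, N)
print(e.mean(), 1 / lam, e.var(), 1 / lam**2)

a, b = 2.0, 5.0
u = rng.uniform(a, b, N)
print(u.mean(), (a + b) / 2, u.var(), (b - a)**2 / 12)
```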