
LanternD's Castle


STT 861 Theory of Prob and STT I Lecture Note - 9

2017-11-01

Review of the important concepts of the previous section; moment generating function; Gamma distribution; chi-square distribution.

Portal to all the other notes

Lecture 09 - Nov 01 2017

Quick Review Session (For the mid-term exam)

Bayes’ theorem

Suppose we have data: an event $B$ that happened.

Possible outcomes: $A_1, A_2, \dots, A_n$.

Model for each $A_i$: $P(A_i)$ is given. This is the "prior" model.

Model for the relation between each $A_i$ and $B$: $P(B|A_i)$. This is the "likelihood" model.

Theorem:

$$P(A_i|B)=\frac{P(B|A_i)\,P(A_i)}{\sum_{j=1}^{n} P(B|A_j)\,P(A_j)}$$
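As a quick sanity check, here is a minimal numeric sketch of the theorem in Python. The priors and likelihoods are made-up numbers for illustration, not from the lecture:

```python
# Hypothetical setup: three sources A_1, A_2, A_3; B = "a defective item is observed".
priors = [0.5, 0.3, 0.2]          # prior model P(A_i)  (made-up numbers)
likelihoods = [0.01, 0.02, 0.05]  # likelihood model P(B | A_i)

# Total probability: P(B) = sum_j P(B | A_j) P(A_j)  -- the denominator above
p_b = sum(l * p for l, p in zip(likelihoods, priors))

# Bayes' theorem: P(A_i | B) = P(B | A_i) P(A_i) / P(B)
posteriors = [l * p / p_b for l, p in zip(likelihoods, priors)]
```

Note how the rare but error-prone third source gets the largest posterior weight once $B$ is observed.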

Quick Example (The Chevalier de Méré example in Note 3).

P(at least one six in 4 rolls of a die) = 1 - P(no six in 4 rolls of a die) = $1-\left(\frac{5}{6}\right)^4 \approx 0.5177$

P(at least one double-six in 24 rolls of 2 dice) = $1-\left(\frac{35}{36}\right)^{24} \approx 0.4914$
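Both bets can be checked in one line each; a quick Python sketch:

```python
# Chevalier de Mere's two bets, computed via the complement rule
p_one_six = 1 - (5 / 6) ** 4         # at least one six in 4 rolls of a die
p_double_six = 1 - (35 / 36) ** 24   # at least one double-six in 24 rolls of 2 dice
```

The first bet is (barely) favorable, the second is not, which is exactly what puzzled de Méré.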

Discrete and continuous variables

Discrete r.v.'s: $P(X=x_k)=p_k$. $E(X)=\sum_k x_k p_k$.

Continuous case: $P(a \le X \le b)=\int_a^b f(x)\,dx$. $E(X)=\int_{-\infty}^{\infty} x f(x)\,dx$.

Linearity

Chebyshev and Weak law of large numbers

X is a r.v. Var(X) exists. Then

$$P\left(|X-E(X)|>\varepsilon\right) \le \frac{Var(X)}{\varepsilon^2}$$

This is true no matter how small ε>0 is.

Apply this to $\bar{X}=\frac{1}{n}\sum_{i=1}^{n} X_i$, where the $X_i$'s are i.i.d. and $Var(X_i)=\sigma^2<\infty$.

Note $E(\bar{X})=\mu=E(X_i)$ and $Var(\bar{X})=\frac{\sigma^2}{n}$.

By Chebyshev:

$$P\left(|\bar{X}-\mu|>\varepsilon\right) \le \frac{\sigma^2}{n\varepsilon^2}$$

As $n \to \infty$, this probability $\to 0$.
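A small simulation illustrating the weak law; the coin-flip model, sample sizes, and trial counts below are my own choices:

```python
import random

random.seed(0)

def tail_prob(n, eps=0.1, trials=2000):
    """Estimate P(|Xbar - mu| > eps) for the mean of n fair coin flips
    (mu = 0.5, sigma^2 = 1/4)."""
    bad = 0
    for _ in range(trials):
        xbar = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(xbar - 0.5) > eps:
            bad += 1
    return bad / trials

p_small_n = tail_prob(25)    # Chebyshev bound: (1/4) / (25  * 0.01) = 1 (vacuous)
p_large_n = tail_prob(400)   # Chebyshev bound: (1/4) / (400 * 0.01) = 0.0625
```

The estimated tail probability shrinks with $n$ and stays below the Chebyshev bound, as the theorem promises.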

Special distributions

Let $X \sim Exp(\lambda)$; the density is $f(x)=\lambda e^{-\lambda x}$ for $x \ge 0$.

Let the $X_i$ be i.i.d. $Exp(\lambda)$. Let $N(t)$ be the # of arrivals in the time interval $[0,t]$. Assume $N$ is a Poisson process. Then $X_i$ is a model for the amount of time between the $(i-1)$-th and the $i$-th arrivals.

[Use step functions to illustrate.]

Theorem: if $N(t)$ is a Poisson($\lambda$) process, and the $T_i$'s are its jump times (arrival times) and $X_i=T_i-T_{i-1}$, then $X_i \sim Exp(\lambda)$ (i.i.d.).

What about the distribution of $T_i$? $T_i \sim \Gamma\left(i, \theta=\frac{1}{\lambda}\right)$.

Here recall λ is a rate parameter, so θ is a scale parameter.

The density of Tn is

$$f(x)=\frac{\lambda}{\Gamma(n)}(\lambda x)^{n-1}e^{-\lambda x}$$

where $x \ge 0$.
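A quick simulation sketch of the theorem: summing $n$ i.i.d. $Exp(\lambda)$ gaps should give an arrival time $T_n$ with the $\Gamma(n, \theta=1/\lambda)$ mean $n\theta = n/\lambda$. The rate and sample counts below are arbitrary:

```python
import random

random.seed(1)
lam = 2.0       # rate parameter lambda (arbitrary choice)
n = 5           # look at the 5th arrival time T_5
trials = 20000

# T_n simulated as the sum of n i.i.d. Exp(lambda) interarrival gaps X_i
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

mean_Tn = sum(samples) / trials   # should be near the Gamma mean n / lam = 2.5
```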

Moment generating function

Method for solving Problem 2.2.5.

Let $X$ have the binomial distribution with parameters $n$ and $p$. Conditionally on $X=k$, let $Y$ have the binomial distribution with parameters $k$ and $r$. What is the marginal distribution of $Y$?

There are lots of ways to solve this problem; here we use moment generating functions (mgf).

Definition: Let X be a r.v. Let

$$M_X(t)=E(e^{tX})$$

where $t$ is fixed. $M_X(t)$ is the moment generating function of $X$.

It turns out that this function usually characterizes the distribution of $X$.

Example: let $X \sim Bin(n,p)$. We know $X=X_1+X_2+\cdots+X_n$ (i.i.d. Bernoulli($p$)).

Now,

$$M_X(t)=E(e^{tX})=E\left(e^{t(X_1+X_2+\cdots+X_n)}\right)=E(e^{tX_1})E(e^{tX_2})\cdots E(e^{tX_n})=\left(E(e^{tX_1})\right)^n$$

and

$$E(e^{tX_1})=p\,e^t+(1-p)\times 1=1+p(e^t-1)$$

Therefore,

$$M_X(t)=\left(1+p(e^t-1)\right)^n$$
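The closed form can be verified against the definition $E(e^{tX})$ computed directly from the binomial pmf; a small Python check (the function names are mine):

```python
from math import comb, exp

def mgf_binomial_direct(n, p, t):
    """E(e^{tX}) computed term by term from the Bin(n, p) pmf."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * exp(t * k)
               for k in range(n + 1))

def mgf_binomial_formula(n, p, t):
    """The closed form (1 + p(e^t - 1))^n derived above."""
    return (1 + p * (exp(t) - 1)) ** n
```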

Now look at Problem 2.2.5.

$$Y \sim Bin(Bin(n,p),\,r)$$

Therefore

$$Y=Y_1+Y_2+\cdots+Y_X$$

where Yi are i.i.d Bernoulli(r).

Hunch: $Y$ is Binomial($a,b$) for some $a$ and $b$. To prove it: compute $M_Y(t)$ and check that it has the form $(1+b(e^t-1))^a$.

$$\begin{aligned}
M_Y(t) &= E(e^{tY}) = E\left(e^{t(Y_1+Y_2+\cdots+Y_X)}\right) \\
&= \sum_{k=0}^{n} E\left(e^{tY} \mid X=k\right) P(X=k) = \sum_{k=0}^{n} E\left(e^{t\,Bin(k,r)}\right) P(X=k) \\
&= \sum_{k=0}^{n} \left(1+r(e^t-1)\right)^k P(X=k) \\
&= \sum_{k=0}^{n} e^{k\ln\left(1+r(e^t-1)\right)}\,p_k = \sum_{k=0}^{n} e^{ku}\,p_k
\end{aligned}$$

where $u=\ln\left(1+r(e^t-1)\right)$.

This is the definition of $E(e^{uX})=M_X(u)$.

$$\begin{aligned}
M_X(u) &= \left(1+p(e^u-1)\right)^n = \left(1+p\left(e^{\ln(1+r(e^t-1))}-1\right)\right)^n \\
&= \left(1+p\left(1+r(e^t-1)-1\right)\right)^n = \left(1+pr(e^t-1)\right)^n
\end{aligned}$$

Therefore we recognize that $Y \sim Binom(n, pr)$.
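A simulation sketch of the two-stage experiment, checking that the mean of $Y$ matches $E[Bin(n,pr)]=npr$ (the parameter values are arbitrary):

```python
import random

random.seed(2)
n, p, r = 20, 0.5, 0.4   # arbitrary parameter choices
trials = 50000

def sample_Y():
    # Stage 1: X ~ Bin(n, p); stage 2: Y | X = k  ~  Bin(k, r)
    x = sum(random.random() < p for _ in range(n))
    return sum(random.random() < r for _ in range(x))

mean_Y = sum(sample_Y() for _ in range(trials)) / trials   # should be near n*p*r = 4
```

Intuitively this is "thinning": each of the $n$ trials survives both stages with probability $pr$.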

Gamma Distribution

Go back to Gamma distribution.

Example

Let $Z \sim N(0,1)$ (standard normal). Then $f_Z(z)=\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right)$. Find the density of $Y=Z^2$.

$$F_Y(y)=P(Y \le y)=P(Z^2 \le y)=P(-\sqrt{y} \le Z \le \sqrt{y})=F_Z(\sqrt{y})-F_Z(-\sqrt{y})$$

Use the chain rule to compute $f_Y(y)$.

$$f_Y(y)=\frac{dF_Y}{dy}=\frac{d}{dy}F_Z(\sqrt{y})-\frac{d}{dy}F_Z(-\sqrt{y})=f_Z(\sqrt{y})\frac{1}{2\sqrt{y}}+f_Z(-\sqrt{y})\frac{1}{2\sqrt{y}}=\frac{1}{\sqrt{2\pi}}\,y^{-1/2}e^{-y/2}$$

We recognize this as the density of $\Gamma\left(\alpha=\frac{1}{2}, \theta=2\right)$.
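A quick Monte Carlo sketch: $Z^2$ should have the $\Gamma(\frac{1}{2}, 2)$ mean $\alpha\theta=1$ and variance $\alpha\theta^2=2$ (sample size is my choice):

```python
import random

random.seed(3)
trials = 100_000

# Y = Z^2 with Z ~ N(0, 1)
samples = [random.gauss(0, 1) ** 2 for _ in range(trials)]

mean_Y = sum(samples) / trials                              # Gamma(1/2, 2) mean = 1
var_Y = sum((y - mean_Y) ** 2 for y in samples) / trials    # Gamma(1/2, 2) variance = 2
```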

Chi-square distribution and degree of freedom

This Gamma, and every Gamma with $\alpha=\frac{n}{2}$ and $\theta=2$ where $n$ is an integer, is called $\chi^2(n)$ ("chi-squared" with $n$ degrees of freedom).

We see $\chi^2(n) \sim Z_1^2+Z_2^2+\cdots+Z_n^2$, where the $Z_i$ are i.i.d. $N(0,1)$.

$$\chi^2(n) \sim \Gamma\left(\frac{n}{2}, 2\right)$$

Q: What about χ2(2)?

A: $\Gamma(1,2)$, which is the exponential distribution with parameter $\lambda=\frac{1}{2}$.

Q: How to create $X \sim Exp(\lambda=1)$ using only i.i.d. normals $N(0,1)$?

Try this: $Z_1^2+Z_2^2 \sim Exp\left(\frac{1}{2}\right)$.

$$X=\frac{1}{2}\left(Z_1^2+Z_2^2\right) \sim Exp(1)$$

When we need to multiply a scale parameter θ by a constant c, we multiply the random variable by c.

Equivalently, when we need to multiply a rate parameter λ by c, just divide the random variable by c.
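A simulation sketch of the recipe above: halving $Z_1^2+Z_2^2$ should give mean 1 and the $Exp(1)$ tail $P(X>1)=e^{-1}\approx 0.368$ (sample size is my choice):

```python
import random

random.seed(4)
trials = 100_000

# X = (Z1^2 + Z2^2) / 2 should be Exp(1)
samples = [(random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2) / 2
           for _ in range(trials)]

mean_X = sum(samples) / trials                     # Exp(1) mean = 1
tail_frac = sum(x > 1 for x in samples) / trials   # should be near e^{-1} ~ 0.368
```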
