
LanternD's Castle

An electronics enthusiast - survive technically

STT 861 Theory of Probability and Statistics I Lecture Note - 7

2017-10-18

Survival function and its examples; transformation of random variables; scale and rate parameters and their examples; joint probability density and its example; independent continuous random variables.

Portal to all the other notes

Lecture 07 - Oct 18 2017

Survival function

Definition: The survival function $S$ of a r.v. $X$ is $1 - \text{CDF}$:

$$S(x) = 1 - F(x) = P(X > x)$$

where $F$ is the CDF of $X$.

This is especially meaningful if $X$ is a waiting time: $S(x)$ is the chance of surviving to "age" $x$.

Example 1

Memoryless property of the exponential distribution. Let $X \sim \mathrm{Exp}(\lambda)$. We know the CDF is $F(x) = 1 - e^{-\lambda x}$. Then

$$S(x) = 1 - F(x) = e^{-\lambda x}$$

Important: this characterizes the exponential distribution.

Q: What is $P(X > x + h \mid X > x)$? (Given that $X$ survives to age $x$, what is the chance of surviving $h$ units of time longer?)

A:

$$P = \frac{P(\{X > x\} \cap \{X > x + h\})}{P(X > x)} = \frac{P(X > x + h)}{P(X > x)} = \frac{S(x + h)}{S(x)} = e^{-\lambda h}$$

This proves that for $X \sim \mathrm{Exp}(\lambda)$, the information of how long we have survived tells us nothing about the likelihood of surviving $h$ units of time longer (it has nothing to do with $x$, the current "age").
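The memoryless property is easy to check by simulation. A minimal sketch (the rate $\lambda = 2$ and the ages $x$, $h$ are arbitrary choices for illustration): the conditional survival frequency among samples that already passed age $x$ should match $e^{-\lambda h}$, with no dependence on $x$.

```python
import math
import random

random.seed(0)
lam = 2.0        # rate (arbitrary choice for illustration)
x, h = 0.5, 0.3  # current age x and extra time h
n = 200_000

samples = [random.expovariate(lam) for _ in range(n)]

# P(X > x + h | X > x): among samples that survived past x,
# the fraction that also survived past x + h
alive_at_x = [s for s in samples if s > x]
cond = sum(s > x + h for s in alive_at_x) / len(alive_at_x)

# memoryless claim: this equals P(X > h) = e^{-lambda * h}, whatever x is
expected = math.exp(-lam * h)
print(cond, expected)
```

Changing $x$ leaves the conditional frequency (up to sampling noise) unchanged.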

FACT: in a Poisson($\lambda$) process, the waiting times between two jumps (arrivals) are i.i.d. $\mathrm{Exp}(\lambda)$ r.v.'s.

Sketch of proof: Let's prove first that $X_1 \sim \mathrm{Exp}(\lambda)$.

This $X_1$ is the first time that $N_t$ reaches 1.

The event $\{X_1 < t\}$ is the same as $\{N_t \ge 1\}$.

Therefore,

$$P(X_1 < t) = P(N_t \ge 1) = 1 - P(N_t = 0) = 1 - e^{-\lambda t}$$

We also see this is the CDF of $X_1$: we recognize the formula $F_{X_1}(t) = 1 - e^{-\lambda t}$ as the CDF of $\mathrm{Exp}(\lambda)$.

Now, let's convince ourselves that $X_2 \sim \mathrm{Exp}(\lambda)$ and that $X_1$ and $X_2$ are independent. This is true because $N$ has independent increments. This is related to the memoryless property of the $X_i$'s.
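One way to see the FACT in simulation uses another standard property of the Poisson process: conditionally on $N_T = n$, the $n$ arrival times on $[0, T]$ are sorted i.i.d. $\mathrm{Uniform}(0, T)$ points. Building the process that way, the first jump time $X_1$ should behave like $\mathrm{Exp}(\lambda)$. A sketch (the parameters $\lambda = 1.5$, $T = 20$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, T, trials = 1.5, 20.0, 20_000

first_times = []
for _ in range(trials):
    n = rng.poisson(lam * T)  # N_T ~ Poisson(lam * T)
    # given N_T = n, the arrival times are n sorted i.i.d. Uniform(0, T)
    arrivals = np.sort(rng.uniform(0.0, T, size=n))
    if n >= 1:
        first_times.append(arrivals[0])  # X_1 = first jump time

first_times = np.array(first_times)
surv1 = (first_times > 1.0).mean()  # estimate of S(1) = e^{-lam}

print(first_times.mean())  # ≈ 1 / lam
print(surv1)               # ≈ exp(-lam)
```

Both the mean $1/\lambda$ and the survival probability $e^{-\lambda t}$ come out as the exponential distribution predicts.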

Example of transformation of r.v.’s

Let $X \sim \mathrm{Exp}(1)$ and let $\lambda > 0$ be fixed (not random). Let $Y = \frac{1}{\lambda} X$. Let's find the distribution of $Y$ via its CDF.

$$S_Y(y) = P(Y > y) = P\left(\frac{X}{\lambda} > y\right) = P(X > \lambda y) = \begin{cases} e^{-\lambda y} & y \ge 0 \\ 1 & y < 0 \end{cases}$$

We recognize, via this survival function, that $Y$ has the same CDF as $\mathrm{Exp}(\lambda)$, i.e. $Y \sim \mathrm{Exp}(\lambda)$.

Example 2

Let $U \sim \mathrm{Uniform}(0,1)$ and let $Y = -\ln U$. Find $Y$'s law (distribution) via the survival function. (Note: since $\ln U < 0$, we have $Y > 0$.)

A:

$$P(Y > y) = P(-\ln U > y) = P(\ln U < -y) = P(U < e^{-y}) = e^{-y}, \quad y \ge 0$$

(using $F_U(u) = u$, and the fact that the exp function is strictly increasing)

We recognize that $Y \sim \mathrm{Exp}(1)$.

Note: if we have $U \sim \mathrm{Uniform}(0,1)$ and we want to create a r.v. $Z \sim \mathrm{Exp}(\lambda)$, we can take $Z = -\frac{1}{\lambda}\ln(U)$.
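This note is exactly the inverse-transform method for sampling an exponential from a uniform generator. A minimal sketch ($\lambda = 3$ is an arbitrary choice):

```python
import math
import random

random.seed(2)
lam, n = 3.0, 100_000

# inverse transform: if U ~ Uniform(0,1), then -(1/lam) * ln(U) ~ Exp(lam)
# (random.random() is in [0, 1); using 1 - U, also Uniform(0,1), avoids log(0))
zs = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

mean = sum(zs) / n
t = 0.5
surv = sum(z > t for z in zs) / n
exact_surv = math.exp(-lam * t)   # S(t) = e^{-lam * t}

print(mean)             # ≈ 1 / lam
print(surv, exact_surv) # empirical vs exact survival at t
```

The empirical mean and survival function agree with $\mathrm{Exp}(\lambda)$.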

Example of scale and rate parameters

We saw that if we multiply an Exp r.v. with parameter $\lambda$ by a constant $c$, we get an Exp r.v. with parameter $\lambda / c$.

For instance,

$$Y = \frac{1}{\lambda} X \sim \mathrm{Exp}(\lambda), \qquad Z = cY = \frac{c}{\lambda} X \sim \mathrm{Exp}\left(\frac{\lambda}{c}\right)$$

where $X \sim \mathrm{Exp}(1)$.

This shows that $\frac{1}{\lambda}$ is a scale parameter.

We also say $\lambda$ is a rate parameter: when $X$ is multiplied by $c$, the rate is divided by $c$.

For instance, $\lambda$ is the rate of arrivals of the Poisson r.v. (or process) in a time interval of length 1, while $\frac{1}{\lambda}$ is the average waiting time for the next arrival. It is the scale of the waiting time.

Generally speaking, a r.v. $X$ (or its law) has a scale parameter $\alpha$ if the CDF of $X$ has the form

$$F_X(x) = \bar{F}\left(\frac{x}{\alpha}\right)$$

We also say $X$ has a rate parameter $\lambda$ if the CDF of $X$ has the form

$$F_X(x) = \tilde{F}(\lambda x)$$

Example 3

We just saw, for $X \sim \mathrm{Exp}(\lambda)$,

$$F_X(x) = 1 - e^{-\lambda x}$$

so

$$\tilde{F}(y) = 1 - e^{-y}$$

and $\lambda$ is the rate parameter. Also

$$F_X(x) = 1 - e^{-x/(1/\lambda)}, \qquad \bar{F}(y) = 1 - e^{-y}$$

and $\frac{1}{\lambda}$ is the scale parameter $\alpha$.

Theorems: if $\alpha$ is a scale parameter for $X$, then

$$f_X(x) = \frac{1}{\alpha} f\left(\frac{x}{\alpha}\right)$$

(where $f$ is the density in the special case $\alpha = 1$)

If $\lambda$ is a rate parameter for $X$, then

$$f_X(x) = \lambda f(\lambda x)$$

(where $f$ is the density in the special case $\lambda = 1$)

Then, let $X_\alpha$ have scale parameter $\alpha$; then

$$E(X_\alpha) = \alpha E(X_1), \qquad \mathrm{Var}(X_\alpha) = \alpha^2 \mathrm{Var}(X_1)$$

where $X_1$ is $X_\alpha$ with $\alpha = 1$.

Let $X_\lambda$ have rate parameter $\lambda$; then

$$E(X_\lambda) = \frac{1}{\lambda} E(X_1), \qquad \mathrm{Var}(X_\lambda) = \frac{1}{\lambda^2} \mathrm{Var}(X_1)$$
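These scaling rules can be sanity-checked by simulation on the exponential family, where scale $\alpha$ corresponds to rate $1/\alpha$. Since $E(X_1) = \mathrm{Var}(X_1) = 1$ for $X_1 \sim \mathrm{Exp}(1)$, the theorem predicts $E(X_\alpha) \approx \alpha$ and $\mathrm{Var}(X_\alpha) \approx \alpha^2$. A sketch ($\alpha = 2.5$ is an arbitrary choice):

```python
import random

random.seed(3)
alpha, n = 2.5, 200_000

# X_alpha has scale alpha, i.e. rate 1/alpha in expovariate's convention
x_alpha = [random.expovariate(1.0 / alpha) for _ in range(n)]

mean = sum(x_alpha) / n
var = sum((v - mean) ** 2 for v in x_alpha) / n

# For Exp(1), E(X_1) = Var(X_1) = 1, so the theorem predicts:
print(mean)  # ≈ alpha * 1
print(var)   # ≈ alpha**2 * 1
```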

Theorem: Arbitrary function of random variables.

Let $X$ have density $f_X$. Let $\Psi$ be a strictly monotone function (increasing or decreasing), so that the derivative $\Psi'$ exists (almost everywhere) and the inverse function $\Psi^{-1}$ exists. Let $Y = \Psi(X)$. Then $Y$ has the density

$$f_Y(y) = f_X(\Psi^{-1}(y)) \cdot \frac{1}{|\Psi'(\Psi^{-1}(y))|}$$

Idea of proof:

Start from the CDF of $Y$ and use the chain rule.

Exercise: use this theorem to check that $-\ln U \sim \mathrm{Exp}(1)$ when $U \sim \mathrm{Uniform}(0,1)$.
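Without spoiling the exercise, the theorem itself can be checked numerically on a different transformation, say $\Psi(x) = x^2$ applied to $X \sim \mathrm{Exp}(1)$ (an arbitrary choice, strictly increasing on $[0, \infty)$). Here $\Psi^{-1}(y) = \sqrt{y}$ and $\Psi'(x) = 2x$, so the theorem predicts $f_Y(y) = e^{-\sqrt{y}} / (2\sqrt{y})$:

```python
import math
import random

random.seed(4)
n = 400_000
ys = [random.expovariate(1.0) ** 2 for _ in range(n)]  # Y = Psi(X) = X^2

# theorem: f_Y(y) = f_X(Psi^{-1}(y)) / |Psi'(Psi^{-1}(y))|
#                 = e^{-sqrt(y)} / (2 * sqrt(y))
def f_Y(y):
    return math.exp(-math.sqrt(y)) / (2 * math.sqrt(y))

# compare the predicted density with a histogram estimate on a small bin
a, b = 0.5, 0.7
empirical = sum(a < y < b for y in ys) / (n * (b - a))
predicted = f_Y(0.6)   # density at the bin midpoint
print(empirical, predicted)
```

The histogram estimate matches the density the theorem predicts.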

Definition (joint probability density): a pair of r.v.'s $(X, Y)$ has a joint probability density $f_{X,Y}(x,y)$ (a function on $\mathbb{R}^2$) if

$$P(X \in [a,b] \text{ and } Y \in [c,d]) = \int_c^d \left( \int_a^b f_{X,Y}(x,y)\, dx \right) dy$$

Definition (more like a theorem): if $f_{X,Y}(x,y)$ as above really factors as $f_{X,Y}(x,y) = f_X(x) f_Y(y)$, then $X$ and $Y$ are independent.

Proof:

$$P(X \in dx,\, Y \in dy) = f_{X,Y}(x,y)\,dx\,dy = f_X(x) f_Y(y)\,dx\,dy = f_X(x)\,dx \cdot f_Y(y)\,dy = P(X \in dx)\,P(Y \in dy)$$

It also proves that $f_X$ is the density of $X$ and $f_Y$ is the density of $Y$.

Example 4

Let $f_{X,Y}(x,y) = 15 e^{-5x - 3y}$ if $x \ge 0, y \ge 0$, and $0$ otherwise. We see

$$f_{X,Y}(x,y) = 5e^{-5x} \times 3e^{-3y} = f_X(x) \times f_Y(y)$$

(the places where $f_X$, $f_Y$ and $f_{X,Y}$ are $0$ do match up). So $X$ and $Y$ are independent, with $X \sim \mathrm{Exp}(5)$ and $Y \sim \mathrm{Exp}(3)$.

Example 5

Let $f_{X,Y}(x,y) = 1$ when $x \in [0,1]$ and $y \in [0,1]$, and $= 0$ otherwise. Then $f_{X,Y}(x,y) = f_X(x) \times f_Y(y)$, so $X$ and $Y$ are independent, each $\sim \mathrm{Uniform}(0,1)$.

Example 6

Let $f_{X,Y}(x,y) = \text{const}$ if $0 \le x \le y \le 1$, and $= 0$ otherwise. The region is a triangle in the plane with area $\frac{1}{2}$, so the constant is $2$. We say that $(X,Y)$ is uniform in that triangle. However, $X$ and $Y$ are not independent, because we have

$$X \le Y$$

This is a non-random relation, so $X$ and $Y$ are dependent.
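The dependence is easy to exhibit numerically: it suffices to find events with $P(A)P(B) > 0$ but $P(A \cap B) = 0$. A sketch (sampling the triangle by sorting two uniforms, a standard trick):

```python
import random

random.seed(5)
n = 100_000

# sample (X, Y) uniform on the triangle 0 <= x <= y <= 1:
# a uniform point on the square, with coordinates sorted
pts = []
for _ in range(n):
    u, v = random.random(), random.random()
    pts.append((min(u, v), max(u, v)))

# If X and Y were independent, P(X > 1/2, Y < 1/2) would equal
# P(X > 1/2) * P(Y < 1/2) > 0, but the relation X <= Y forces it to be 0.
p_x = sum(x > 0.5 for x, y in pts) / n                   # P(X > 1/2) = 1/4
p_y = sum(y < 0.5 for x, y in pts) / n                   # P(Y < 1/2) = 1/4
p_joint = sum(x > 0.5 and y < 0.5 for x, y in pts) / n   # exactly 0
print(p_x * p_y, p_joint)
```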

General theorem: for most shapes in $\mathbb{R}^2$ with finite area, we can define the uniform density on that shape, with density $\frac{1}{\text{area}}$. When the shape is a rectangle $[a,b] \times [c,d]$ (sides parallel to the axes), $X$ and $Y$ are independent, with $X \sim \mathrm{Uniform}[a,b]$ and $Y \sim \mathrm{Uniform}[c,d]$.

For a circular shape (the disk), $X$ and $Y$ are not independent, but the polar coordinates $\rho$ and $\theta$ are independent:

$$X = \rho \cos\theta, \qquad Y = \rho \sin\theta$$

with $(\rho, \theta)$ independent and

$$\theta \sim \mathrm{Uniform}[0, 2\pi]$$

Exercise: find the density of $\rho$.
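As a hint for the exercise, for the uniform distribution on the unit disk one can compare the empirical CDF of $\rho$ with the ratio of areas, $P(\rho \le r) = \pi r^2 / \pi = r^2$; differentiating this CDF gives the density. A sketch using rejection sampling (the unit disk and the test point $r = 0.6$ are arbitrary choices):

```python
import math
import random

random.seed(6)
n = 200_000
rhos = []
for _ in range(n):
    # rejection sampling: keep uniform points of the square inside the disk
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    if x * x + y * y <= 1:
        rhos.append(math.hypot(x, y))

# area ratio: P(rho <= r) = r^2 on [0, 1]
r = 0.6
cdf_emp = sum(p <= r for p in rhos) / len(rhos)
print(cdf_emp, r * r)
```

The empirical CDF matches $r^2$, which pins down the density of $\rho$.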


