
STT 861 Theory of Probability and Statistics I Lecture Note - 7


Survival function and its example; transformation of random variables; scale and rate parameters and their examples; joint probability density and its example; independent continuous random variables.

Portal to all the other notes

Lecture 07 - Oct 18 2017

Survival function

Definition: The survival function $S$ of a r.v. $X$ is

$$S(x) = 1 - F(x),$$

where $F$ is the CDF of $X$.

This is especially meaningful if $X$ is a waiting time. Indeed, $S(x)$ is the chance of surviving to “age” $x$.

Example 1

Memoryless property of the exponential distribution. Let $X\sim Exp(\lambda)$. We know the CDF is $F(x)=1-e^{-\lambda x}$. Then

$$S(x) = 1 - F(x) = e^{-\lambda x}.$$

Important: this characterizes the exponential distribution.

Q: What is $P(X>x+h \mid X>x)$ (given that $X$ survived to age $x$, what is the chance of surviving $h$ units of time longer)?

$$P(X>x+h \mid X>x) = \frac{P(X>x+h)}{P(X>x)} = \frac{e^{-\lambda(x+h)}}{e^{-\lambda x}} = e^{-\lambda h} = P(X>h).$$

This proves that for $X\sim Exp(\lambda)$, the information of how long we have survived tells us nothing about the likelihood of surviving $h$ units of time longer (it has nothing to do with $x$, the current “age”).
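To convince ourselves numerically, here is a minimal Monte Carlo sketch (not in the original lecture; it assumes NumPy): the empirical conditional survival probability should match the unconditional one.

```python
# Memoryless property check: for X ~ Exp(lam),
# P(X > x + h | X > x) should equal P(X > h) = exp(-lam * h).
import numpy as np

rng = np.random.default_rng(0)
lam, x, h = 2.0, 1.5, 0.7
samples = rng.exponential(scale=1 / lam, size=1_000_000)

survived_x = samples[samples > x]
cond = np.mean(survived_x > x + h)     # empirical P(X > x + h | X > x)
uncond = np.mean(samples > h)          # empirical P(X > h)

print(cond, uncond, np.exp(-lam * h))  # all three should be close
```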

FACT: in a Poisson($\lambda$) process, the waiting times between two consecutive jumps (arrivals) are i.i.d. $Exp(\lambda)$ r.v.’s.

Sketch of proof: Let’s prove first that $X_1\sim Exp(\lambda)$.

This $X_1$ is the first time that $N$ reaches 1.

The event $\{X_1\leq t\}$ is the same as $\{N_t\geq 1\}$. Hence

$$P(X_1\leq t) = P(N_t\geq 1) = 1 - P(N_t=0) = 1 - e^{-\lambda t}.$$

Viewed as a function of $t$, this is the CDF of $X_1$, and we recognize the formula $F_{X_1}(t)=1-e^{-\lambda t}$ as the CDF of $Exp(\lambda)$.

Now, let’s convince ourselves that $X_2\sim Exp(\lambda)$ and that $X_1$ and $X_2$ are independent. This is true because $N$ has independent increments. This is related to the memoryless property of the $X_i$’s.
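Here is a small simulation sketch of this fact (again assuming NumPy): build the process from i.i.d. $Exp(\lambda)$ gaps and check that the number of arrivals in $[0,1]$ behaves like a Poisson($\lambda$) count, i.e. has mean and variance both close to $\lambda$.

```python
# Build a Poisson(lam) process from i.i.d. Exp(lam) inter-arrival gaps
# and check that N_1, the count of arrivals in [0, 1], looks Poisson(lam).
import numpy as np

rng = np.random.default_rng(1)
lam, n_paths, max_jumps = 4.0, 100_000, 50  # 50 gaps is plenty for t <= 1

gaps = rng.exponential(scale=1 / lam, size=(n_paths, max_jumps))
arrival_times = np.cumsum(gaps, axis=1)      # T_1 < T_2 < ... per path
counts = (arrival_times <= 1.0).sum(axis=1)  # N_1 for each path

print(counts.mean(), counts.var())           # both should be near lam
```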

Example of transformation of r.v.’s

Let $X\sim Exp(1)$ and let $\lambda>0$ be fixed (not random). Let $Y=\frac{1}{\lambda}X$. Let’s find the distribution of $Y$ via its CDF:

$$S_Y(y) = P(Y>y) = P\left(\tfrac{1}{\lambda}X>y\right) = P(X>\lambda y) = e^{-\lambda y}.$$

We recognize, via this survival function, that $Y$ has the same CDF as $Exp(\lambda)$.

Example 2

Let $U\sim Uniform(0,1)$. Let $Y=-\ln U$. Find $Y$’s law (distribution) via the survival function. (Note: since $\ln U<0$, we have $Y>0$.)

$$S_Y(y) = P(Y>y) = P(-\ln U>y) = P(\ln U<-y) = P\left(U<e^{-y}\right) = e^{-y}, \quad y>0$$

(because the exp function is strictly increasing).

We recognize that $Y\sim Exp(1)$.

Note: if we have $U\sim Uniform(0,1)$, and we want to create a r.v. $Z\sim Exp(\lambda)$, we can take $Z=-\frac{1}{\lambda}\ln(U)$.
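This is the inverse-transform method for the exponential law. A minimal sketch in Python (assuming NumPy):

```python
# Inverse-transform sampling: turn Uniform(0,1) draws into Exp(lam) draws
# via Z = -(1/lam) * ln(U).
import numpy as np

rng = np.random.default_rng(2)
lam = 3.0
u = rng.uniform(size=1_000_000)
z = -np.log(u) / lam

print(z.mean(), 1 / lam)     # sample mean should be near 1/lam
print(z.var(), 1 / lam**2)   # sample variance should be near 1/lam^2
```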

Example of scale and rate parameters

We saw that if we multiply an $Exp$ r.v. with parameter $\lambda$ by a constant $c$, we get an $Exp$ r.v. with parameter $\frac{\lambda}{c}$. For instance,

$$X\sim Exp(\lambda) \Longrightarrow cX\sim Exp\left(\frac{\lambda}{c}\right).$$

This shows that $\frac{1}{\lambda}$ is a scale parameter: multiplying $X$ by $c$ multiplies the scale $\frac{1}{\lambda}$ by $c$. We also say $\lambda$ is a rate parameter: when $X$ is multiplied by $c$, the rate is divided by $c$.

For instance, $\lambda$ is the rate of arrivals of the Poisson r.v. (or process) in a time interval of length 1. But $\frac{1}{\lambda}$ is the average waiting time for the next arrival. It is the scale of the waiting time.
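A quick simulation sketch of this scale behavior (assuming NumPy): multiplying $Exp(\lambda)$ samples by $c$ should produce samples distributed as $Exp(\frac{\lambda}{c})$.

```python
# Scale/rate check: if X ~ Exp(lam), then c * X should be Exp(lam / c),
# whose mean is c / lam. Compare quantiles against a reference sample.
import numpy as np

rng = np.random.default_rng(3)
lam, c = 2.0, 5.0
x = rng.exponential(scale=1 / lam, size=1_000_000)
y = c * x                                   # candidate Exp(lam / c)

print(y.mean(), c / lam)                    # mean should be near c / lam
ref = rng.exponential(scale=c / lam, size=1_000_000)
print(np.quantile(y, [0.25, 0.5, 0.75]))    # quantiles should match...
print(np.quantile(ref, [0.25, 0.5, 0.75]))  # ...those of Exp(lam / c)
```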

Generally speaking, a r.v. $X$ (or its law) has a scale parameter $\alpha$ if the CDF of $X$ has the form

$$F_X(x) = G\left(\frac{x}{\alpha}\right)$$

for some fixed CDF $G$ not depending on $\alpha$. We also say $X$ has a rate parameter $\lambda$ if the CDF of $X$ has the form

$$F_X(x) = G(\lambda x).$$

Example 3

We just saw, for $X\sim Exp(\lambda)$,

$$F_X(x) = 1-e^{-\lambda x} = G(\lambda x), \quad \text{where } G(u) = 1-e^{-u},$$

so $\lambda$ is the rate parameter. Equivalently,

$$F_X(x) = G\left(\frac{x}{1/\lambda}\right),$$

and $\frac{1}{\lambda}$ is the scale parameter $\alpha$.

Theorems: if $\alpha$ is a scale parameter for $X$, then

$$\frac{X}{\alpha} \text{ has scale parameter } 1$$

($\alpha=1$ is the special case).

If $\lambda$ is a rate parameter for $X$, then

$$\lambda X \text{ has rate parameter } 1$$

($\lambda=1$ is the special case).

Then, let $X_\alpha$ have scale parameter $\alpha$; then

$$X_\alpha \stackrel{d}{=} \alpha X_1$$

(equality in distribution), where $X_1$ is $X_\alpha$ with $\alpha=1$.

Let $X_\lambda$ have rate parameter $\lambda$; then

$$X_\lambda \stackrel{d}{=} \frac{1}{\lambda}X_1.$$

Theorem: Arbitrary function of random variables.

Let $X$ have density $f_X$. Let $\Psi$ be a strictly monotone function (increasing or decreasing), so that the inverse function $\Psi^{-1}$ exists and the derivative $\Psi'$ exists (essentially everywhere). Let $Y=\Psi(X)$. Then $Y$ has the density

$$f_Y(y) = f_X\left(\Psi^{-1}(y)\right)\left|\left(\Psi^{-1}\right)'(y)\right| = \frac{f_X\left(\Psi^{-1}(y)\right)}{\left|\Psi'\left(\Psi^{-1}(y)\right)\right|}.$$

Idea of proof: start from the CDF of $Y$ and use the chain rule. For $\Psi$ increasing,

$$F_Y(y) = P(\Psi(X)\leq y) = P\left(X\leq \Psi^{-1}(y)\right) = F_X\left(\Psi^{-1}(y)\right),$$

and differentiating in $y$ gives the density above.

Exercise: Use this theorem to check that $-\ln U \sim Exp(1)$ when $U\sim Uniform(0,1)$.
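For reference, one way to carry out this check: take $\Psi(u)=-\ln u$, which is strictly decreasing on $(0,1)$, with inverse $\Psi^{-1}(y)=e^{-y}$ for $y>0$. Since $f_U=1$ on $(0,1)$,

$$f_Y(y) = f_U\left(e^{-y}\right)\left|\left(e^{-y}\right)'\right| = 1\cdot e^{-y} = e^{-y}, \qquad y>0,$$

which is exactly the density of $Exp(1)$.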

Definition (joint probability density): A pair of r.v.’s $(X,Y)$ has a joint probability density $f_{X,Y}(x,y)$ (a function on $\mathbb{R}^2$) if

$$P\left((X,Y)\in A\right) = \iint_A f_{X,Y}(x,y)\,dx\,dy$$

for every (reasonable) set $A\subset\mathbb{R}^2$.

Definition (more like a theorem): if $f_{X,Y}(x,y)$ as above factors as $f_{X,Y}(x,y)=f_X(x)f_Y(y)$, then $X$ and $Y$ are independent. Indeed, for sets $A,B\subset\mathbb{R}$,

$$P(X\in A,\ Y\in B) = \int_A f_X(x)\,dx \int_B f_Y(y)\,dy = P(X\in A)\,P(Y\in B).$$

This also proves that $f_X$ is the density of $X$ and $f_Y$ is the density of $Y$.

Example 4

Let $f_{X,Y}(x,y)=15e^{-5x-3y}$ if $x\geq0,y\geq0$, and $0$ otherwise. We see

$$f_{X,Y}(x,y) = \left(5e^{-5x}\right)\left(3e^{-3y}\right) = f_X(x)\,f_Y(y)$$

(the places where $f_X$, $f_Y$ and $f_{X,Y}$ are $0$ do match up). So $X$ and $Y$ are independent, with $X\sim Exp(5)$ and $Y\sim Exp(3)$.
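A quick numerical sanity check of this factorization (assuming NumPy and SciPy): the joint density should integrate to 1 over the first quadrant. The sketch truncates the domain at 30, where the exponential tail is negligible.

```python
# Check that the joint density of Example 4 integrates to 1.
# dblquad integrates f(y, x) over y first, then x.
import numpy as np
from scipy.integrate import dblquad

f = lambda y, x: 15 * np.exp(-5 * x - 3 * y)
total, err = dblquad(f, 0, 30, lambda x: 0, lambda x: 30)
print(total)  # should be very close to 1
```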

Example 5

Let $f_{X,Y}(x,y)=1$ when $x\in[0,1]$ and $y\in[0,1]$, and $=0$ otherwise. Then $f_{X,Y}(x,y)=f_X(x)\times f_Y(y)$, so $X$ and $Y$ are i.i.d. $Uniform(0,1)$.

Example 6

Let $f_{X,Y}(x,y)=\text{const}$ if $0\leq x\leq y\leq 1$, and $=0$ otherwise. The support is a triangle in the plane, of area $\frac{1}{2}$, so the constant is $2$. We say that $(X,Y)$ is uniform on that triangle. However, $X$ and $Y$ are not independent, because we always have

$$X\leq Y.$$

This is a non-random relation, so $X$ and $Y$ are dependent.

General theorem: for most shapes in $\mathbb{R}^2$ with finite area, we can define the uniform density on that shape, equal to the constant $\frac{1}{\text{area}}$ on the shape. When the shape is a rectangle $[a,b]\times [c,d]$, then $X$ and $Y$ are independent, with $X\sim Uniform[a,b]$ and $Y\sim Uniform[c,d]$.

For a disk (centered at the origin), $X$ and $Y$ are not independent, but the polar coordinates $\rho=\sqrt{X^2+Y^2}$ and $\theta$ are independent.

Exercise: find the density of $\rho$.
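A simulation sketch (assuming NumPy) that can be used to check your answer numerically: sample uniformly on the unit disk by rejection from the square, then histogram $\rho$ and compare the shape with the analytic density you derive.

```python
# Sample (X, Y) uniformly on the unit disk by rejection sampling,
# then estimate the density of rho = sqrt(X^2 + Y^2) with a histogram.
import numpy as np

rng = np.random.default_rng(4)
pts = rng.uniform(-1, 1, size=(2_000_000, 2))
inside = pts[(pts ** 2).sum(axis=1) <= 1.0]   # keep points in the disk
rho = np.sqrt((inside ** 2).sum(axis=1))

hist, edges = np.histogram(rho, bins=20, range=(0, 1), density=True)
print(np.round(hist, 2))  # compare with your analytic density of rho
```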
