Continuous Distributions in Depth
This lesson explores the continuous uniform and exponential distributions in greater depth, covering their derivations, properties, and relationships to other distributions. These two distributions form the foundation for much of the work in Further Statistics 2.
The Continuous Uniform Distribution U(a,b)
PDF and CDF
If $X \sim \mathrm{U}(a,b)$:

$$f(x) = \frac{1}{b-a} \quad \text{for } a \le x \le b$$

$$F(x) = \begin{cases} 0 & x < a \\ \dfrac{x-a}{b-a} & a \le x \le b \\ 1 & x > b \end{cases}$$
Moments
| Property | Formula |
|---|---|
| $E(X)$ | $\dfrac{a+b}{2}$ |
| $E(X^2)$ | $\dfrac{a^2+ab+b^2}{3}$ |
| $\mathrm{Var}(X)$ | $\dfrac{(b-a)^2}{12}$ |
Derivation of the variance:
$$E(X^2) = \int_a^b \frac{x^2}{b-a}\,dx = \frac{1}{b-a} \cdot \frac{b^3 - a^3}{3} = \frac{a^2 + ab + b^2}{3}$$

$$\mathrm{Var}(X) = \frac{a^2 + ab + b^2}{3} - \left(\frac{a+b}{2}\right)^2 = \frac{4(a^2 + ab + b^2) - 3(a+b)^2}{12} = \frac{(b-a)^2}{12}$$
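These formulas can be sanity-checked by Monte Carlo simulation. A minimal Python sketch (the values $a = 2$, $b = 10$ and the sample size are arbitrary illustrative choices, not from the lesson):

```python
import random

random.seed(42)

a, b = 2.0, 10.0
n = 200_000

# Draw n samples from U(a, b)
samples = [random.uniform(a, b) for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

# Theory: E(X) = (a + b)/2 = 6, Var(X) = (b - a)^2/12 ≈ 5.333
```

With this many samples, the empirical mean and variance land very close to the theoretical values.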
Worked Example
A bus arrives at a stop at a time uniformly distributed between 8:00 and 8:20. If you arrive at 8:05, find the probability of waiting more than 10 minutes.
Let $X \sim \mathrm{U}(0, 20)$ be the bus's arrival time in minutes after 8:00. You need the bus to arrive after 8:15, and since the bus arrives after you, we condition on $X \ge 5$.
$$P(X > 15 \mid X \ge 5) = \frac{P(X > 15)}{P(X \ge 5)} = \frac{5/20}{15/20} = \frac{1}{3}$$
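The conditional probability can be checked by simulating many bus arrivals. A hypothetical Python sketch (the sample size is an arbitrary choice):

```python
import random

random.seed(0)

n = 500_000
waits_long = after_you = 0
for _ in range(n):
    x = random.uniform(0, 20)   # bus arrival time, minutes after 8:00
    if x >= 5:                  # condition: the bus arrives after you
        after_you += 1
        if x > 15:              # you wait more than 10 minutes
            waits_long += 1

p = waits_long / after_you      # theory: 1/3
```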
Exam Tip: The uniform distribution is not memoryless; its defining property is that probabilities over sub-intervals of $[a, b]$ are proportional to their lengths.
The Exponential Distribution Exp(λ)
PDF, CDF, and Survival Function
$$f(x) = \lambda e^{-\lambda x}, \quad x \ge 0$$

$$F(x) = 1 - e^{-\lambda x}, \quad x \ge 0$$

$$P(X > x) = e^{-\lambda x} \quad \text{(survival function)}$$
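Because the CDF has a closed-form inverse, $\mathrm{Exp}(\lambda)$ can be sampled by inverse-transform sampling: if $U \sim \mathrm{U}(0,1)$, then $-\ln(1-U)/\lambda \sim \mathrm{Exp}(\lambda)$. A minimal Python sketch verifying the survival function ($\lambda = 2$ and $x = 0.7$ are arbitrary illustrative values):

```python
import math
import random

random.seed(1)

lam = 2.0
n = 300_000

# Inverse transform: F^{-1}(u) = -ln(1 - u) / lam
samples = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

x = 0.7
empirical = sum(1 for t in samples if t > x) / n
theoretical = math.exp(-lam * x)    # survival function P(X > x) = e^{-lam x}
```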
Moments
| Property | Value |
|---|---|
| $E(X)$ | $1/\lambda$ |
| $\mathrm{Var}(X)$ | $1/\lambda^2$ |
| Median | $\dfrac{\ln 2}{\lambda}$ |
| Mode | $0$ |
Derivation of E(X) by integration by parts:
$$E(X) = \int_0^\infty x \lambda e^{-\lambda x}\,dx$$

Let $u = x$, $dv = \lambda e^{-\lambda x}\,dx$. Then $du = dx$, $v = -e^{-\lambda x}$.

$$E(X) = \left[-x e^{-\lambda x}\right]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}$$
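All three entries of the moments table (mean, variance, and median) can be checked by simulation. A minimal Python sketch, assuming an arbitrary $\lambda = 0.5$:

```python
import random

random.seed(7)

lam = 0.5
n = 400_000
samples = sorted(random.expovariate(lam) for _ in range(n))

mean = sum(samples) / n                          # theory: 1/lam = 2
var = sum((t - mean) ** 2 for t in samples) / n  # theory: 1/lam^2 = 4
median = samples[n // 2]                         # theory: ln 2 / lam ≈ 1.386
```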
The Memoryless Property
The exponential distribution is the only continuous distribution with the memoryless property:
$$P(X > s + t \mid X > s) = P(X > t)$$
Proof:
$$P(X > s + t \mid X > s) = \frac{P(X > s + t)}{P(X > s)} = \frac{e^{-\lambda(s+t)}}{e^{-\lambda s}} = e^{-\lambda t} = P(X > t)$$
Interpretation: If you have already waited s minutes for a bus, the probability of waiting at least t more minutes is the same as the probability of waiting t minutes from the start.
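The memoryless property can also be observed empirically. A hypothetical Python sketch comparing both sides of the identity (the values $\lambda = 1$, $s = 1$, $t = 0.5$ are arbitrary):

```python
import random

random.seed(3)

lam = 1.0
n = 400_000
samples = [random.expovariate(lam) for _ in range(n)]

s, t = 1.0, 0.5
past_s = [x for x in samples if x > s]
lhs = sum(1 for x in past_s if x > s + t) / len(past_s)  # P(X > s+t | X > s)
rhs = sum(1 for x in samples if x > t) / n               # P(X > t)
# Both should be close to e^{-lam*t} = e^{-0.5} ≈ 0.6065
```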
Connection: Poisson Process and Exponential Waiting Times
If events follow a Poisson process with rate λ, then:
- The number of events in time t follows Po(λt)
- The waiting time between consecutive events follows Exp(λ)
- The waiting time until the n-th event follows a Gamma(n,λ) distribution
This connection is crucial for understanding why the exponential distribution arises in practice.
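The first two bullet points can be demonstrated together: generating events with $\mathrm{Exp}(\lambda)$ gaps and counting how many fall in $[0, t]$ should reproduce the $\mathrm{Po}(\lambda t)$ mean. A minimal Python sketch with arbitrary parameters $\lambda = 3$, $t = 2$:

```python
import random

random.seed(5)

lam, t = 3.0, 2.0
runs = 100_000

total = 0
for _ in range(runs):
    clock, count = 0.0, 0
    while True:
        clock += random.expovariate(lam)  # Exp(lam) gap to the next event
        if clock > t:
            break
        count += 1
    total += count

mean_count = total / runs   # theory: mean of Po(lam * t) = 6
```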
Worked Example
Customers arrive at a rate of 3 per hour.
(a) Find the probability that the next customer arrives within 10 minutes.
Let $T$ be the waiting time in hours, so $T \sim \mathrm{Exp}(3)$, and 10 minutes $= 1/6$ hour.
$$P(T \le 1/6) = 1 - e^{-3/6} = 1 - e^{-0.5} \approx 1 - 0.6065 = 0.3935$$
(b) Given that no customer has arrived in 20 minutes, find the probability of waiting a further 10 minutes.
By the memoryless property: $P(T > 30\text{ min} \mid T > 20\text{ min}) = P(T > 10\text{ min}) = e^{-0.5} \approx 0.6065$.
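Both answers reduce to one exponential evaluation. A short Python check of the arithmetic:

```python
import math

lam = 3.0                      # arrival rate, customers per hour

# (a) P(T <= 1/6 hour) = 1 - e^{-lam/6}
p_a = 1 - math.exp(-lam / 6)   # ≈ 0.3935

# (b) Memoryless: P(T > 30 min | T > 20 min) = P(T > 10 min) = e^{-lam/6}
p_b = math.exp(-lam / 6)       # ≈ 0.6065
```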
Comparing Uniform and Exponential
| Feature | Uniform $\mathrm{U}(a,b)$ | Exponential $\mathrm{Exp}(\lambda)$ |
|---|---|---|
| Support | $[a,b]$ (bounded) | $[0,\infty)$ (unbounded) |
| Shape | Flat (rectangular) | Monotonically decreasing |
| Mean | $(a+b)/2$ | $1/\lambda$ |
| Variance | $(b-a)^2/12$ | $1/\lambda^2$ |
| Memoryless | No | Yes |
| Typical use | Random point in an interval | Waiting time between events |
The Gamma Distribution (Awareness)
The sum of n independent Exp(λ) random variables follows a Gamma(n,λ) distribution:
$$f(x) = \frac{\lambda^n x^{n-1} e^{-\lambda x}}{(n-1)!} \quad \text{for } x \ge 0$$
| Property | Value |
|---|---|
| $E(X)$ | $n/\lambda$ |
| $\mathrm{Var}(X)$ | $n/\lambda^2$ |
This generalises the exponential (which is Gamma(1,λ)) and models the waiting time until the n-th event in a Poisson process.
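The sum characterisation gives a direct way to verify the Gamma moments: add up $n$ independent exponential gaps and compare against $n/\lambda$ and $n/\lambda^2$. A minimal Python sketch with arbitrary $n = 4$, $\lambda = 2$:

```python
import random

random.seed(11)

n_events, lam = 4, 2.0
runs = 200_000

# Waiting time to the 4th event = sum of 4 independent Exp(lam) gaps
totals = [sum(random.expovariate(lam) for _ in range(n_events))
          for _ in range(runs)]

mean = sum(totals) / runs                          # theory: n/lam = 2
var = sum((x - mean) ** 2 for x in totals) / runs  # theory: n/lam^2 = 1
```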
Summary
- U(a,b): flat density, mean (a+b)/2, variance (b−a)2/12.
- Exp(λ): decreasing density, mean 1/λ, variance 1/λ2, memoryless.
- The exponential distribution is the continuous analogue of the geometric distribution.
- Poisson process: events ∼Po(λt), waiting times ∼Exp(λ).
- The Gamma distribution generalises the exponential to sums of independent exponentials.
Exam Tip: When a question involves waiting times and a Poisson process, the exponential distribution is your go-to model. Always state the parameter λ clearly and convert time units if needed.