Moment generating functions
The moment generating function $M_X(t)$ of a random variable $X$ is defined to be $M_X(t) = E[e^{tX}]$. The domain of $M_X$ is the set of all real numbers $t$ such that $e^{tX}$ has finite expectation. When $X$ is a discrete random variable with mass function $p(x)$, we have
$$M_X(t) = \sum_x e^{tx} p(x).$$
When $X$ is an absolutely continuous random variable with density $f(x)$, we have
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx.$$
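As a quick numerical illustration of the discrete formula, the following sketch computes $M_X(t)$ for a fair six-sided die (a hypothetical example, not taken from these notes):

    import math

    def mgf_die(t):
        # M_X(t) = sum_x e^{tx} p(x) with p(x) = 1/6 for x = 1, ..., 6
        return sum(math.exp(t * x) / 6 for x in range(1, 7))

    print(mgf_die(0.0))   # always 1, since M_X(0) = E[1] = 1
    print(mgf_die(0.1))   # ≈ 1.44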
In order to see why $M_X(t)$ is called the moment generating function, we write
$$M_X(t) = E[e^{tX}] = E\left[\sum_{n=0}^{\infty} \frac{t^n X^n}{n!}\right].$$
Suppose that $M_X(t)$ is finite on $-t_0 < t \le t_0$ for some $t_0 > 0$. In this case one can show that in the last expression for $M_X(t)$ it is legal to exchange the order of expectation and summation, that is,
$$M_X(t) = \sum_{n=0}^{\infty} \frac{E[X^n]}{n!}\, t^n, \qquad t \in (-t_0, t_0).$$
The Taylor series for $M_X(t)$ is
$$M_X(t) = \sum_{n=0}^{\infty} \frac{t^n}{n!} \left.\frac{d^n}{dt^n} M_X(t)\right|_{t=0}.$$
By comparing coefficients of $t^n$ in the last two displays above, we get
$$E[X^n] = \left.\frac{d^n}{dt^n} M_X(t)\right|_{t=0}.$$
In particular, we have
$$E[X] = M_X'(0), \qquad E[X^2] = M_X''(0), \qquad \mathrm{Var}(X) = M_X''(0) - (M_X'(0))^2.$$
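A minimal symbolic check of these identities, assuming SymPy is installed and using the exponential MGF $\lambda/(\lambda - t)$ derived in Example 6 below with the illustrative value $\lambda = 3$:

    import sympy as sp

    t = sp.symbols('t')
    lam = sp.Rational(3)                  # illustrative rate, lambda = 3
    M = lam / (lam - t)                   # exponential(lambda) MGF; see Example 6

    EX  = sp.diff(M, t, 1).subs(t, 0)     # E[X]   = M'(0)
    EX2 = sp.diff(M, t, 2).subs(t, 0)     # E[X^2] = M''(0)
    print(EX, EX2 - EX**2)                # 1/3 and 1/9, i.e. 1/lambda and 1/lambda^2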
An important property of the moment generating function $M_X(t)$ is that it determines the distribution function of $X$. More precisely, we have the following result.
Theorem 1 If $X$ and $Y$ are two random variables such that $M_X(t) = M_Y(t)$ for all $t$, then $X$ and $Y$ have the same distribution.

Now let's calculate the moment generating functions of the important random variables we have learnt.

Example 2 Suppose that $X$ is a binomial random variable with parameters $(n, p)$. Then
$$M_X(t) = (pe^t + 1 - p)^n, \qquad t \in \mathbb{R}.$$
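One can verify this formula symbolically for a small illustrative $n$ (here $n = 5$, an assumed value), again assuming SymPy:

    import sympy as sp

    t, p = sp.symbols('t p')
    n = 5                                  # illustrative value of n
    x = sp.symbols('x', integer=True)
    # E[e^{tX}] computed directly from the binomial mass function
    M_direct = sp.Sum(sp.exp(t*x) * sp.binomial(n, x) * p**x * (1 - p)**(n - x),
                      (x, 0, n)).doit()
    M_formula = (p*sp.exp(t) + 1 - p)**n
    print(sp.simplify(M_direct - M_formula))   # 0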
Example 3 Suppose that $X$ is a Poisson random variable with parameter $\lambda$. Then
$$M_X(t) = \exp(\lambda(e^t - 1)), \qquad t \in \mathbb{R}.$$
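A symbolic sketch of the underlying computation, summing the series $\sum_{k \ge 0} e^{tk} e^{-\lambda} \lambda^k / k!$ (assuming SymPy):

    import sympy as sp

    t = sp.symbols('t')
    lam = sp.symbols('lambda', positive=True)
    k = sp.symbols('k', integer=True, nonnegative=True)
    z = sp.symbols('z', positive=True)
    # sum_k z^k / k! = e^z, evaluated at z = lambda * e^t
    series = sp.summation(z**k / sp.factorial(k), (k, 0, sp.oo))
    M = sp.exp(-lam) * series.subs(z, lam * sp.exp(t))
    print(sp.simplify(M - sp.exp(lam * (sp.exp(t) - 1))))   # 0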
Example 4 Suppose that $X$ is a geometric random variable with parameter $p$. Then
$$M_X(t) = \frac{pe^t}{1 - (1-p)e^t}, \qquad t < \ln\frac{1}{1-p}.$$
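Because the series only converges for $t < \ln\frac{1}{1-p}$, a quick numerical check is convenient; the values $p = 0.3$ and $t = -0.5$ below are assumed for illustration:

    import math

    p, t = 0.3, -0.5          # assumed values; note t < ln(1/(1-p)) ≈ 0.357
    # Truncated series sum_{k>=1} e^{tk} (1-p)^{k-1} p versus the closed form
    series = sum(math.exp(t*k) * (1 - p)**(k - 1) * p for k in range(1, 2000))
    closed = p * math.exp(t) / (1 - (1 - p) * math.exp(t))
    print(series, closed)     # both ≈ 0.3162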
Example 5 Let $X$ be uniformly distributed in the interval $(a, b)$. Then
$$M_X(t) = \frac{e^{tb} - e^{ta}}{t(b-a)}, \qquad t \in \mathbb{R}$$
(with the value at $t = 0$ understood as $M_X(0) = 1$).
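A midpoint-rule check of the defining integral, with assumed values $a = 0$, $b = 2$, $t = 0.7$:

    import math

    a, b, t = 0.0, 2.0, 0.7          # assumed values for illustration
    n = 100_000
    h = (b - a) / n
    # Midpoint rule for the integral of e^{tx} / (b - a) over (a, b)
    integral = sum(math.exp(t * (a + (i + 0.5) * h)) for i in range(n)) * h / (b - a)
    closed = (math.exp(t*b) - math.exp(t*a)) / (t * (b - a))
    print(integral, closed)          # both ≈ 2.1823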
Example 6 Suppose that $X$ is a gamma random variable with parameters $(\alpha, \lambda)$. Then
$$M_X(t) = \int_0^\infty e^{tx} \frac{\lambda^\alpha}{\Gamma(\alpha)} x^{\alpha-1} e^{-\lambda x}\,dx = \frac{\lambda^\alpha}{\Gamma(\alpha)} \int_0^\infty x^{\alpha-1} e^{-(\lambda - t)x}\,dx = \frac{\lambda^\alpha}{\Gamma(\alpha)} \cdot \frac{\Gamma(\alpha)}{(\lambda - t)^\alpha} = \left(\frac{\lambda}{\lambda - t}\right)^\alpha$$
for all $t < \lambda$. In particular, if $X$ is an exponential random variable with parameter $\lambda$, then
$$M_X(t) = \frac{\lambda}{\lambda - t}, \qquad t < \lambda.$$
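For the exponential special case, a truncated midpoint-rule integration with assumed values $\lambda = 2$ and $t = 0.5$ confirms the formula:

    import math

    lam, t = 2.0, 0.5                # assumed values with t < lambda
    n, upper = 200_000, 40.0         # truncate the integral where the tail is negligible
    h = upper / n
    # e^{tx} * lam * e^{-lam x} = lam * e^{(t - lam) x}, integrated by the midpoint rule
    integral = lam * h * sum(math.exp((t - lam) * (i + 0.5) * h) for i in range(n))
    print(integral, lam / (lam - t))   # both ≈ 1.3333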
Example 7 Suppose that $X$ is a normal random variable with parameters $(\mu, \sigma^2)$. Then, substituting $y = x - \mu$,
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx = \int_{-\infty}^{\infty} e^{t(y+\mu)} \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{y^2}{2\sigma^2}}\,dy = e^{\mu t} \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}} e^{ty - \frac{y^2}{2\sigma^2}}\,dy.$$
Now
$$ty - \frac{y^2}{2\sigma^2} = -\frac{(y - \sigma^2 t)^2}{2\sigma^2} + \frac{\sigma^2 t^2}{2}.$$
Consequently
$$M_X(t) = e^{\mu t} e^{\frac{\sigma^2 t^2}{2}} \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(y - \sigma^2 t)^2}{2\sigma^2}}\,dy = \exp\left(\mu t + \frac{\sigma^2 t^2}{2}\right)$$
for all $t \in \mathbb{R}$, since the last integrand is the density of a normal random variable with parameters $(\sigma^2 t, \sigma^2)$ and therefore integrates to $1$.

Theorem 8 If $X$ and $Y$ are independent random variables, then
$$M_{X+Y}(t) = M_X(t) M_Y(t)$$
for every $t$ at which both $M_X(t)$ and $M_Y(t)$ are finite.
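A Monte Carlo sanity check of Theorem 8 for two independent normals, with assumed parameters $(\mu_1, \sigma_1^2) = (0.5, 1)$, $(\mu_2, \sigma_2^2) = (-1, 0.25)$ and $t = 0.3$:

    import math, random

    random.seed(0)
    mu1, s1, mu2, s2, t = 0.5, 1.0, -1.0, 0.5, 0.3   # assumed parameters
    N = 200_000
    # Empirical E[e^{t(X+Y)}] for independent X ~ N(mu1, s1^2), Y ~ N(mu2, s2^2)
    est = sum(math.exp(t * (random.gauss(mu1, s1) + random.gauss(mu2, s2)))
              for _ in range(N)) / N
    exact = math.exp(mu1*t + s1**2 * t**2 / 2) * math.exp(mu2*t + s2**2 * t**2 / 2)
    print(est, exact)   # estimate ≈ exact ≈ 0.9105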
Combining the last two theorems with the examples above, we can easily reprove the following result, which we proved before; part (2) is verified symbolically in the sketch after the theorem.

Theorem 9 Suppose that $X$ and $Y$ are independent random variables.
(1) If $X$ and $Y$ are binomial random variables with parameters $(m, p)$ and $(n, p)$ respectively, then $X + Y$ is a binomial random variable with parameters $(m + n, p)$.
(2) If $X$ and $Y$ are Poisson random variables with parameters $\lambda_1$ and $\lambda_2$ respectively, then $X + Y$ is a Poisson random variable with parameter $\lambda_1 + \lambda_2$.
(3) If $X$ and $Y$ are gamma random variables with parameters $(\alpha_1, \lambda)$ and $(\alpha_2, \lambda)$ respectively, then $X + Y$ is a gamma random variable with parameters $(\alpha_1 + \alpha_2, \lambda)$.
(4) If $X$ and $Y$ are normal random variables with parameters $(\mu_1, \sigma_1^2)$ and $(\mu_2, \sigma_2^2)$ respectively, then $X + Y$ is a normal random variable with parameters $(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$.
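For instance, part (2) follows by multiplying the Poisson MGFs from Example 3 and invoking Theorem 1; a symbolic sketch, assuming SymPy:

    import sympy as sp

    t = sp.symbols('t')
    l1, l2 = sp.symbols('lambda_1 lambda_2', positive=True)
    MX = sp.exp(l1 * (sp.exp(t) - 1))             # Poisson(lambda_1) MGF, Example 3
    MY = sp.exp(l2 * (sp.exp(t) - 1))             # Poisson(lambda_2) MGF
    M_sum = sp.exp((l1 + l2) * (sp.exp(t) - 1))   # Poisson(lambda_1 + lambda_2) MGF
    # M_X * M_Y equals the Poisson(lambda_1 + lambda_2) MGF, so Theorem 1
    # identifies the distribution of X + Y
    print(sp.simplify(MX * MY - M_sum))           # 0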
Remark on Notations