All About Moment Generating Function With Examples

Probability / Thursday, October 4th, 2018
(Last Updated On: November 16, 2018)

Moment generating function

In this article we will first learn what a moment generating function (mgf) is, and then we will learn how to use moment generating functions. So far we have considered in detail only the two most important characteristics of a random variable, namely, the mean and the variance.

We have seen that the mean and variance of a random variable contain important information about the random variable or, more precisely, about its distribution function. The moment generating function goes further: when it exists, it determines the distribution completely, and it is a very powerful computational tool that makes certain computations much shorter.


Let X be a discrete random variable and let f(x) be its probability mass function (pmf). Then the moment generating function of X, denoted MX(t), is the expected value of the function e^(Xt), where t is just a parameter:
\[{{M}_{X}}(t)=E\left( {{e}^{Xt}} \right)=\sum\limits_{x}{{{e}^{xt}}f(x)}\]
For a continuous random variable with probability density function f(x),
\[{{M}_{X}}(t)=E\left( {{e}^{Xt}} \right)=\int{{{e}^{xt}}f(x)dx}\]
provided the summation or integral converges absolutely for every t in some interval around zero, that is, for some positive number h such that –h < t < h.

 Example 01

The pmf f(x) for the score X of a randomly selected student is given below (the missing last entry must be 0.10, since the probabilities sum to 1):

X    : 1    2    3    4    5
f(x) : 0.15 0.20 0.40 0.15 0.10


Then what is the moment generating function of the random variable X?


By definition, we have
\[{{M}_{X}}(t)=E\left( {{e}^{Xt}} \right)=\sum\limits_{x}{{{e}^{xt}}f(x)}\]
\[\therefore {{M}_{X}}(t)=0.15{{e}^{t}}+0.20{{e}^{2t}}+0.40{{e}^{3t}}+0.15{{e}^{4t}}+0.10{{e}^{5t}}\]
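The mgf from this table is easy to check numerically. The Python sketch below assumes the missing table entry f(5) = 0.10 (forced by the requirement that the probabilities sum to 1) and confirms the general fact that MX(0) = 1 for any mgf.

```python
import math

# pmf of the score X from the table above; f(5) = 0.10 is an assumption
# filled in so that the probabilities sum to 1.
pmf = {1: 0.15, 2: 0.20, 3: 0.40, 4: 0.15, 5: 0.10}

def mgf(t):
    # M_X(t) = sum over x of e^{xt} f(x)
    return sum(math.exp(x * t) * fx for x, fx in pmf.items())

print(abs(mgf(0.0) - 1.0) < 1e-9)  # True: every mgf satisfies M_X(0) = 1
```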

 Example 02

Let N denote the number of flips of a coin until a tail appears, so that f(n) = (1/2)^n, n = 1, 2, 3, … Find its moment generating function.


We have,
\[{{M}_{N}}(t)=E\left( {{e}^{Nt}} \right)=\sum\limits_{n}{{{e}^{nt}}f(n)}\]
\[={{e}^{t}}{{\left( \frac{1}{2} \right)}^{1}}+{{e}^{2t}}{{\left( \frac{1}{2} \right)}^{2}}+{{e}^{3t}}{{\left( \frac{1}{2} \right)}^{3}}+{{e}^{4t}}{{\left( \frac{1}{2} \right)}^{4}}+{{e}^{5t}}{{\left( \frac{1}{2} \right)}^{5}}+…\]
\[={{\left( \frac{{{e}^{t}}}{2} \right)}^{1}}+{{\left( \frac{{{e}^{t}}}{2} \right)}^{2}}+{{\left( \frac{{{e}^{t}}}{2} \right)}^{3}}+{{\left( \frac{{{e}^{t}}}{2} \right)}^{4}}+{{\left( \frac{{{e}^{t}}}{2} \right)}^{5}}+…\]
Since the above series is a geometric series with first term a = e^t/2 and common ratio r = e^t/2, its sum (for |r| < 1, that is, for t < ln 2) is
\[{{M}_{N}}(t)=\frac{a}{1-r}=\frac{{{e}^{t}}/2}{1-{{e}^{t}}/2}=\frac{{{e}^{t}}}{2-{{e}^{t}}}\]
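The closed form of this geometric series can be sanity-checked numerically; the Python sketch below compares a long partial sum of the series with e^t/(2 − e^t) at an arbitrarily chosen point t = 0.1.

```python
import math

# Partial sum of M_N(t) = sum_{n>=1} e^{nt} (1/2)^n versus the closed form
# e^t / (2 - e^t); t = 0.1 (< ln 2) is an arbitrary test point.
t = 0.1
partial = sum(math.exp(n * t) * 0.5 ** n for n in range(1, 200))
closed = math.exp(t) / (2 - math.exp(t))
print(abs(partial - closed) < 1e-12)  # True
```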

 Example 03

Find the moment generating function of a random variable X whose probability density function is
\[f(x)=a{{e}^{-ax}},\quad x\ge 0,\ a>0.\]

By definition of Moment generating function, we have
\[{{M}_{X}}(t)=E\left( {{e}^{Xt}} \right)=\int\limits_{0}^{\infty }{{{e}^{xt}}f(x)dx}\]
\[\Rightarrow {{M}_{X}}(t)=\int\limits_{0}^{\infty }{{{e}^{xt}}}a{{e}^{-ax}}dx=\int\limits_{0}^{\infty }{a{{e}^{-x\left( a-t \right)}}dx}\]
\[\therefore {{M}_{X}}(t)=\left[ \frac{-a{{e}^{-x\left( a-t \right)}}}{a-t} \right]_{0}^{\infty }=\frac{a}{a-t},\quad t<a\]
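As a rough numerical check of this result, the midpoint rule below approximates the defining integral for arbitrarily chosen values a = 2, t = 0.5 and compares it with a/(a − t).

```python
import math

# Midpoint-rule approximation of integral_0^inf e^{xt} * a e^{-ax} dx,
# truncated at x = 60 (the tail there is negligible for a - t = 1.5).
a, t = 2.0, 0.5
dx = 1e-3
approx = sum(a * math.exp((t - a) * (k + 0.5) * dx) * dx
             for k in range(int(60 / dx)))
closed = a / (a - t)
print(abs(approx - closed) < 1e-4)  # True
```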

Moments from Moment generating function

We have by definition of Moment generating function
\[{{M}_{X}}(t)=E\left( {{e}^{Xt}} \right)=E\left( 1+tX+\frac{{{t}^{2}}{{X}^{2}}}{2!}+\frac{{{t}^{3}}{{X}^{3}}}{3!}+…+\frac{{{t}^{n}}{{X}^{n}}}{n!}+… \right)\]
\[=1+tE\left( X \right)+\frac{{{t}^{2}}}{2!}E\left( {{X}^{2}} \right)+\frac{{{t}^{3}}}{3!}E\left( {{X}^{3}} \right)+…+\frac{{{t}^{n}}}{n!}E\left( {{X}^{n}} \right)+…\]
\[=1+t\mu _{1}^{\prime }+\frac{{{t}^{2}}}{2!}\mu _{2}^{\prime }+\frac{{{t}^{3}}}{3!}\mu _{3}^{\prime }+…+\frac{{{t}^{n}}}{n!}\mu _{n}^{\prime }+…\]
\[{{M}_{X}}(t)=\sum\limits_{n=0}^{\infty }{\frac{{{t}^{n}}}{n!}\mu _{n}^{\prime }}\]
\[\text{where }\mu _{n}^{\prime }=E\left( {{X}^{n}} \right)=\int{{{x}^{n}}f(x)dx}\ \text{ if }X\text{ is continuous}\]
\[\mu _{n}^{\prime }=E\left( {{X}^{n}} \right)=\sum\limits_{x}{{{x}^{n}}f(x)}\ \text{ if }X\text{ is discrete}\]
Thus the rth moment about the origin, µ′r, is the coefficient of t^r/r! in MX(t).

Moment about any point X = a

The moment generating function about X = a is defined as
\[{{M}_{X}}\left( t \right)\left( \text{about X }=\text{a} \right)=E\left( {{e}^{\left( X-a \right)t}} \right)\]
\[=E\left( 1+t\left( X-a \right)+\frac{{{t}^{2}}{{\left( X-a \right)}^{2}}}{2!}+\frac{{{t}^{3}}{{\left( X-a \right)}^{3}}}{3!}+…+\frac{{{t}^{n}}{{\left( X-a \right)}^{n}}}{n!}+… \right)\]
\[=1+tE\left( X-a \right)+\frac{{{t}^{2}}E{{\left( X-a \right)}^{2}}}{2!}+\frac{{{t}^{3}}E{{\left( X-a \right)}^{3}}}{3!}+…+\frac{{{t}^{n}}E{{\left( X-a \right)}^{n}}}{n!}+…\]
\[=1+t{{\mu }_{1}}+\frac{{{t}^{2}}}{2!}{{\mu }_{2}}+\frac{{{t}^{3}}}{3!}{{\mu }_{3}}+…+\frac{{{t}^{n}}}{n!}{{\mu }_{n}}+…\]
Where µr = E{(X – a)^r} is the rth moment about the point X = a (written without a prime to distinguish it from the moments about the origin).

Note: We can also calculate the rth moment about the origin by differentiating MX(t) r times and setting t = 0.
\[{{\left[ \frac{{{d}^{r}}}{d{{t}^{r}}}\left\{ {{M}_{X}}\left( t \right) \right\} \right]}_{t=0}}={{\left[ \mu _{r}^{\prime }+\mu _{r+1}^{\prime }t+\mu _{r+2}^{\prime }\frac{{{t}^{2}}}{2!}+… \right]}_{t=0}}=\mu _{r}^{\prime }\]

 Example 04

Calculate the first three moments about the origin, with the help of the mgf, for the probability density function
\[f(x)={{e}^{-x}},0\le x<\infty \]


By definition of mgf, we have
\[{{M}_{X}}(t)=E\left( {{e}^{Xt}} \right)=\int\limits_{0}^{\infty }{{{e}^{xt}}f(x)dx}\]
\[\Rightarrow {{M}_{X}}(t)=\int\limits_{0}^{\infty }{{{e}^{xt}}.{{e}^{-x}}dx=\int\limits_{0}^{\infty }{{{e}^{-x(1-t)}}dx}}\]
\[\Rightarrow {{M}_{X}}(t)=\left[ -\frac{{{e}^{-x(1-t)}}}{1-t} \right]_{0}^{\infty }=\frac{1}{1-t}\]
\[\therefore {{M}_{X}}(t)=1+t+{{t}^{2}}+{{t}^{3}}+{{t}^{4}}+…,\quad \left| t \right|<1\]
Now, to find the first, second and third moments about the origin, we differentiate MX(t) once, twice and thrice with respect to t and evaluate at t = 0.
\[\mu _{1}^{\prime }={{\left[ \frac{d}{dt}\left\{ {{M}_{X}}(t) \right\} \right]}_{t=0}}={{\left[ 1+2t+3{{t}^{2}}+4{{t}^{3}}+… \right]}_{t=0}}=1\]
\[\mu _{2}^{\prime }={{\left[ \frac{{{d}^{2}}}{d{{t}^{2}}}\left\{ {{M}_{X}}(t) \right\} \right]}_{t=0}}={{\left[ 2+6t+12{{t}^{2}}+… \right]}_{t=0}}=2\]
\[\mu _{3}^{\prime }={{\left[ \frac{{{d}^{3}}}{d{{t}^{3}}}\left\{ {{M}_{X}}(t) \right\} \right]}_{t=0}}={{\left[ 6+24t+… \right]}_{t=0}}=6\]
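The same three moments can be reproduced symbolically. The sketch below (using the sympy library, an assumption about available tooling) differentiates MX(t) = 1/(1 − t) and also cross-checks against direct integration of x^r e^(−x).

```python
import sympy as sp

t, x = sp.symbols('t x', real=True)

# MGF of f(x) = e^{-x}, x >= 0, as derived above (valid for t < 1)
M = 1 / (1 - t)

# r-th raw moment = r-th derivative of M_X(t) evaluated at t = 0
mgf_moments = [sp.diff(M, t, r).subs(t, 0) for r in (1, 2, 3)]
print(mgf_moments)  # [1, 2, 6]

# Cross-check by direct integration: E(X^r) = integral_0^inf x^r e^{-x} dx
direct = [sp.integrate(x ** r * sp.exp(-x), (x, 0, sp.oo)) for r in (1, 2, 3)]
print(direct)  # [1, 2, 6]
```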

 Example 05

Let X be a random variable whose probability mass function is P(X = r) = q^(r–1)p, r = 1, 2, 3, …, where p + q = 1. Find the mgf of X and hence its mean and variance.


By definition,
\[{{M}_{X}}(t)=E\left( {{e}^{Xt}} \right)=\sum\limits_{r=1}^{\infty }{{{e}^{tr}}{{q}^{r-1}}p}\]
\[\Rightarrow {{M}_{X}}(t)=\frac{p}{q}\sum\limits_{r=1}^{\infty }{{{\left( q{{e}^{t}} \right)}^{r}}}=\frac{p}{q}\cdot q{{e}^{t}}\sum\limits_{r=1}^{\infty }{{{\left( q{{e}^{t}} \right)}^{r-1}}}\]
\[\therefore {{M}_{X}}(t)=p{{e}^{t}}\left\{ 1+q{{e}^{t}}+{{\left( q{{e}^{t}} \right)}^{2}}+… \right\}=\frac{p{{e}^{t}}}{1-q{{e}^{t}}},\quad q{{e}^{t}}<1\]
\[\mu _{1}^{\prime }={{\left[ \frac{d}{dt}\left\{ {{M}_{X}}(t) \right\} \right]}_{t=0}}={{\left[ \frac{d}{dt}\left( \frac{p{{e}^{t}}}{1-q{{e}^{t}}} \right) \right]}_{t=0}}\]
\[={{\left[ \frac{p{{e}^{t}}}{{{\left( 1-q{{e}^{t}} \right)}^{2}}} \right]}_{t=0}}=\frac{p}{{{\left( 1-q \right)}^{2}}}=\frac{p}{{{p}^{2}}}=\frac{1}{p}\quad \left[ \text{since }p+q=1 \right]\]
\[\mu _{2}^{\prime }={{\left[ \frac{{{d}^{2}}}{d{{t}^{2}}}\left\{ {{M}_{X}}(t) \right\} \right]}_{t=0}}={{\left[ \frac{{{d}^{2}}}{d{{t}^{2}}}\left( \frac{p{{e}^{t}}}{1-q{{e}^{t}}} \right) \right]}_{t=0}}\]
\[=\frac{p\left( 1+q \right)}{{{\left( 1-q \right)}^{3}}}=\frac{1+q}{{{p}^{2}}}\]
\[\therefore \text{Mean}=\mu _{1}^{\prime }=\frac{1}{p}\]
\[\therefore \text{Variance}=\mu _{2}^{\prime }-{{\left( \mu _{1}^{\prime } \right)}^{2}}=\frac{1+q}{{{p}^{2}}}-\frac{1}{{{p}^{2}}}=\frac{q}{{{p}^{2}}}\]
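The mean and variance obtained above can be verified symbolically; this sympy sketch differentiates the mgf and simplifies under the constraint q = 1 − p.

```python
import sympy as sp

t = sp.symbols('t', real=True)
p = sp.symbols('p', positive=True)
q = 1 - p  # p + q = 1

# MGF of P(X = r) = q^(r-1) p, as derived above (valid while q e^t < 1)
M = p * sp.exp(t) / (1 - q * sp.exp(t))

mu1 = sp.diff(M, t, 1).subs(t, 0)  # first raw moment E(X)
mu2 = sp.diff(M, t, 2).subs(t, 0)  # second raw moment E(X^2)

mean = sp.simplify(mu1)
variance = sp.simplify(mu2 - mu1 ** 2)
print(sp.simplify(mean - 1 / p) == 0)           # True: mean is 1/p
print(sp.simplify(variance - q / p ** 2) == 0)  # True: variance is q/p^2
```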

Properties of Moment generating function

Let X be a random variable then
\[(i)\ {{M}_{cX}}(t)={{M}_{X}}(ct),\ \text{where }c\text{ is a constant}\]



