Lecture 24: Probability Generating Functions
Definition 24.1 Let X be an integer-valued random variable. The probability generating function (PGF) of X is defined as
$$ G_X(z) \triangleq E[z^X] = \sum_i z^i \, P(X = i). $$
24.1.1 Convergence
For a non-negative integer-valued random variable, there exists R, possibly $+\infty$, such that the PGF converges for $|z| < R$ and diverges for $|z| > R$, where $z \in \mathbb{C}$. $G_X(z)$ certainly converges for $|z| < 1$ and possibly in a larger region as well. Note that
$$ |G_X(z)| = \left| \sum_i z^i P(X = i) \right| \le \sum_i |z|^i P(X = i) \le \sum_i |z|^i, $$
and the geometric series on the right converges for $|z| < 1$. This implies that $G_X(z)$ converges absolutely in the region $|z| < 1$. Generating functions can also be defined for random variables taking negative as well as positive integer values. Such generating functions generally converge for values of z satisfying $\alpha < |z| < \beta$ for some $\alpha, \beta$ such that $\alpha \le 1 \le \beta$.
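As a small illustration of the two-sided case (an example added here, not in the original notes), suppose X takes the values $-1$ and $+1$ with probability $\tfrac{1}{2}$ each. Then
$$ G_X(z) = \tfrac{1}{2} z^{-1} + \tfrac{1}{2} z, $$
which converges in the annulus $0 < |z| < \infty$, i.e. with $\alpha = 0$ and $\beta = \infty$.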
Example 1: Consider the Poisson random variable X with probability mass function
$$ P(X = i) = \frac{e^{-\lambda} \lambda^i}{i!}, \qquad i \ge 0. $$
Find the PGF of X.
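A short worked solution (the derivation is not spelled out in the notes, but follows directly from the definition):
$$ G_X(z) = \sum_{i \ge 0} z^i \, \frac{e^{-\lambda} \lambda^i}{i!} = e^{-\lambda} \sum_{i \ge 0} \frac{(\lambda z)^i}{i!} = e^{-\lambda} e^{\lambda z} = e^{\lambda (z-1)}, $$
which converges for every $z \in \mathbb{C}$.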
24.1.2 Properties
1. $G_X(1) = 1$, since $\sum_i P(X = i) = 1$.
2. $\left.\dfrac{dG_X(z)}{dz}\right|_{z=1} = E[X]$.

Now,
$$ \frac{dG_X(z)}{dz} = \frac{d}{dz} \sum_i z^i P(X = i) \overset{(a)}{=} \sum_i \frac{d}{dz}\, z^i P(X = i) = \sum_i i z^{i-1} P(X = i), $$
where (a) follows by interchanging differentiation and summation, which is valid inside the region of convergence. Evaluating at $z = 1$,
$$ \left.\frac{dG_X(z)}{dz}\right|_{z=1} = \sum_i i \, P(X = i) = E[X]. $$
3. $\left.\dfrac{d^k G_X(z)}{dz^k}\right|_{z=1} = E\left[ X (X-1)(X-2) \cdots (X-k+1) \right]$.
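As a quick illustration (not part of the original notes), apply this to the Poisson PGF $G_X(z) = e^{\lambda(z-1)}$ obtained in Example 1:
$$ \frac{d^k G_X(z)}{dz^k} = \lambda^k e^{\lambda(z-1)}, \qquad \left.\frac{d^k G_X(z)}{dz^k}\right|_{z=1} = \lambda^k, $$
so every k-th factorial moment of a Poisson($\lambda$) random variable equals $\lambda^k$; in particular $E[X] = \lambda$, $E[X(X-1)] = \lambda^2$, and hence $\mathrm{Var}(X) = \lambda^2 + \lambda - \lambda^2 = \lambda$.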
4. If X and Y are independent and $Z = X + Y$, then $G_Z(z) = G_X(z)\, G_Y(z)$. The region of convergence (ROC) of the PGF of Z is the intersection of the ROCs of the PGFs of X and Y.
Proof: Since X and Y are independent, $z^X$ and $z^Y$ are also independent (and hence uncorrelated). This implies that
$$ G_Z(z) = E[z^{X+Y}] = E[z^X z^Y] = E[z^X]\, E[z^Y] = G_X(z)\, G_Y(z). $$
Hence proved.
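For instance (an illustrative example added here, not in the original notes), if $X \sim \text{Poisson}(\lambda_1)$ and $Y \sim \text{Poisson}(\lambda_2)$ are independent, then using the PGF from Example 1,
$$ G_{X+Y}(z) = e^{\lambda_1 (z-1)}\, e^{\lambda_2 (z-1)} = e^{(\lambda_1 + \lambda_2)(z-1)}, $$
which is the PGF of a Poisson($\lambda_1 + \lambda_2$) random variable; since the PGF determines the PMF, $X + Y \sim \text{Poisson}(\lambda_1 + \lambda_2)$.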
5. Random sum of discrete RVs: Let $Y = \sum_{i=1}^{N} X_i$, where the $X_i$'s are i.i.d. discrete positive integer-valued random variables and N is independent of the $X_i$'s. The PGF of Y is $G_Y(z) = G_N(G_X(z))$.
Proof:
$$ G_Y(z) = E[z^Y] = E\big[ E[z^Y \mid N] \big] \qquad \text{(by the law of iterated expectation).} $$
Now,
$$ E[z^Y \mid N = n] = E\Big[ z^{\sum_{i=1}^{n} X_i} \Big] = \prod_{i=1}^{n} E[z^{X_i}] = \big(G_X(z)\big)^n, $$
where the product form uses the independence of the $X_i$'s and their independence of N. Hence $E[z^Y \mid N] = G_X(z)^N$, and therefore
$$ G_Y(z) = E\big[ G_X(z)^N \big] = G_N\big(G_X(z)\big). $$
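As a concrete application (an example added here, not part of the original notes), let $N \sim \text{Poisson}(\lambda)$, so $G_N(s) = e^{\lambda(s-1)}$, and let the $X_i$ be Bernoulli($p$) with $G_X(z) = 1 - p + pz$; the argument above only needs the $X_i$ to be non-negative integer valued, so it applies here. Then
$$ G_Y(z) = G_N\big(G_X(z)\big) = e^{\lambda\left((1 - p + pz) - 1\right)} = e^{\lambda p (z-1)}, $$
so the random sum Y is Poisson($\lambda p$). This is the thinning property of the Poisson distribution.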
24.2 Exercises
1. Find the PMF of a random variable X whose probability generating function is given by
$$ G_X(z) = \frac{\left( \frac{1}{3} z + \frac{2}{3} \right)^4}{z}. $$
2. Suppose there are $X_0$ individuals in the initial generation of a population. In the $n$-th generation, the $X_n$ individuals independently give rise to numbers of offspring $Y_1^{(n)}, Y_2^{(n)}, \ldots, Y_{X_n}^{(n)}$, where $Y_1^{(n)}, Y_2^{(n)}, \ldots, Y_{X_n}^{(n)}$ are i.i.d. random variables. The total number of individuals produced in the $(n+1)$-st generation will then be $X_{n+1} = Y_1^{(n)} + Y_2^{(n)} + \cdots + Y_{X_n}^{(n)}$. Then $\{X_n\}$ is called a branching process. Let $X_n$ be the size of the $n$-th generation of a branching process with family-size probability generating function $G(z)$, and let $X_0 = 1$. Show that the probability generating function $G_n(z)$ of $X_n$ satisfies $G_{n+1}(z) = G(G_n(z))$ for $n \ge 0$. Also, prove that $E[X_n] = E[X_{n-1}]\, G'(1)$.