Chapter 3: Probability Distributions
Random Variables
Example: Tossing a coin, we could get Heads or Tails.
Let's give them the values Heads = 0 and Tails = 1, and we have a random variable X.
In short: X = {0, 1}
Not Like an Algebra Variable
In Algebra a variable, like x, is an unknown value:
Example: x + 2 = 6
In this case we can find that x=4
But a Random Variable is different ...
A discrete random variable takes on only countable values, such as 0, 1, 2, … Values such as "1.5" or "2.5923" don't make sense for this type of problem.
Examples:
- the number of children in a family
- the number of defective light bulbs in a box
- the number of arrivals per hour
Unlike a discrete probability distribution, a continuous probability distribution cannot be expressed in tabular form.
The probability distribution of a discrete random variable X satisfies:
1. 0 ≤ P(X = x) ≤ 1 for every value x
2. ∑_x P(X = x) = 1

Example:

x        0     1     2
P(X=x)   1/4   1/2   1/4
Example: A fair coin is tossed three times; let X be the number of heads, so X = 0, 1, 2, 3.
Probability distribution:

x        0     1     2     3
P(X=x)   1/8   3/8   3/8   1/8
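One way to verify this table is to enumerate the eight equally likely outcomes directly; a short Python sketch along these lines:

from itertools import product
from fractions import Fraction
from collections import Counter

# Enumerate the 8 equally likely outcomes of three coin tosses
# and count the number of heads (the value of X) in each outcome.
outcomes = list(product("HT", repeat=3))
counts = Counter(outcome.count("H") for outcome in outcomes)

# P(X = x) as an exact fraction out of 8.
pmf = {x: Fraction(n, len(outcomes)) for x, n in sorted(counts.items())}
for x, prob in pmf.items():
    print(x, prob)   # 0 1/8, 1 3/8, 2 3/8, 3 1/8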
P(X = x) = 1/6, if x = 1, 2, 3, 4, 5, 6; 0, otherwise.
This is a uniform distribution.
Example: Two dice are tossed; let X be the sum of the two numbers shown.
The sample space consists of the 36 equally likely ordered pairs (first die, second die).
Outcome (first die, second die)         Sum   Probability
(1,1)                                    2    1/36
(1,2), (2,1)                             3    2/36
(1,3), (2,2), (3,1)                      4    3/36
(1,4), (2,3), (3,2), (4,1)               5    4/36
(1,5), (2,4), (3,3), (4,2), (5,1)        6    5/36
...                                     ...   ...
(6,6)                                   12    1/36
The probability distribution of X is:

x        2     3     4     5     6     7     8     9     10    11    12
P(X=x)   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
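The same kind of brute-force check reproduces this table, for example:

from itertools import product
from fractions import Fraction
from collections import Counter

# All 36 equally likely ordered outcomes (first die, second die),
# tallied by their sum X.
sums = Counter(a + b for a, b in product(range(1, 7), repeat=2))
for s, n in sorted(sums.items()):
    print(s, Fraction(n, 36))   # 2 1/36, 3 1/18 (= 2/36), ..., 7 1/6 (= 6/36), ..., 12 1/36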
The probability mass function of X is displayed in the following graph.
[Figure: probability histogram of X]
Remember: Geometric Series
∑_{n=0}^{∞} xⁿ = 1/(1 − x),   |x| < 1

∑_{n=1}^{∞} xⁿ = x/(1 − x),   |x| < 1
Example: A fair coin is tossed until the first head appears; let X be the number of tosses required.
The possible values of X are 1, 2, 3, …
a. Find the p.m.f. of X.
P(X = 1) = 1/2
P(X = 2) = (1/2)(1/2) = 1/4
⋮
P(X = n) = (1/2)ⁿ,   n = 1, 2, 3, …
Need to check:
∑_{n=1}^{∞} (1/2)ⁿ = 1 ???
Notice
∑_{n=1}^{∞} (1/2)ⁿ = (1/2)/(1 − 1/2) = 1   (geometric series)
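This infinite sum can also be checked symbolically, for instance with sympy:

import sympy as sp

n = sp.symbols("n", integer=True, positive=True)
# Sum of P(X = n) = (1/2)**n over n = 1, 2, 3, ...; the geometric series gives exactly 1.
total = sp.summation(sp.Rational(1, 2) ** n, (n, 1, sp.oo))
print(total)   # 1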
Properties of a p.d.f.
A function f(x) is a legitimate probability density function if the following two conditions are satisfied:
1. f(x) ≥ 0 for all x
2. Total area: ∫_{−∞}^{∞} f(x) dx = 1
Note that for any single value c, P(X = c) = ∫_c^c f(x) dx = 0.
Example: Let f(x) = c·x² for 0 ≤ x ≤ 1, and f(x) = 0 otherwise.
a. Find c.
Since ∫_{−∞}^{∞} f(x) dx = 1, then
∫_{−∞}^{0} f(x) dx + ∫_0^1 f(x) dx + ∫_1^{∞} f(x) dx = 1
0 + ∫_0^1 c·x² dx + 0 = c·[x³/3]_0^1 = c/3 = 1,
then c = 3.
b. Find P(0 < X < ½).
P(0 < X < ½) = ∫_0^{1/2} 3x² dx = [x³]_0^{1/2} = 1/8
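Both steps, finding c and computing P(0 < X < ½), can be reproduced symbolically, for instance:

import sympy as sp

x, c = sp.symbols("x c", positive=True)

# Solve  integral_0^1 c*x**2 dx = 1  for the constant c.
c_val = sp.solve(sp.Eq(sp.integrate(c * x**2, (x, 0, 1)), 1), c)[0]
print(c_val)   # 3

# With f(x) = 3*x**2 on [0, 1], P(0 < X < 1/2) is the integral over (0, 1/2).
prob = sp.integrate(c_val * x**2, (x, 0, sp.Rational(1, 2)))
print(prob)    # 1/8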
To compute the probability that X takes a value in an interval, integrate the probability density function over that interval.
For example, for the density f(x) = 1/x² for x ≥ 1 (and 0 otherwise):
P(1 ≤ X ≤ 10) = ∫_1^{10} (1/x²) dx = [−1/x]_1^{10} = 1 − 1/10 = 0.9
P(X > 10) = lim_{b→∞} ∫_{10}^{b} (1/x²) dx = lim_{b→∞} [−1/x]_{10}^{b} = lim_{b→∞} (1/10 − 1/b) = 0.1
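A quick symbolic check of these two integrals, assuming (as in the example below) the density f(x) = 1/x² for x ≥ 1:

import sympy as sp

x = sp.symbols("x", positive=True)
f = 1 / x**2   # assumed density: f(x) = 1/x**2 for x >= 1

# P(1 <= X <= 10): integrate the density over [1, 10].
print(sp.integrate(f, (x, 1, 10)))       # 9/10
# P(X > 10): an improper integral, evaluated out to infinity.
print(sp.integrate(f, (x, 10, sp.oo)))   # 1/10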
Cumulative Distribution Function (C.D.F)
Definition: The cumulative distribution function of the random variable X is given by
F(x) = P(X ≤ x).
For a continuous random variable with density f, F(x) = ∫_{−∞}^{x} f(t) dt.
For each x, F(x) is the area under the density curve to
the left of x.
Proposition:
If X is a continuous random variable with p.d.f. f(x) and c.d.f. F(x), then at every x at which the derivative F′(x) exists, F′(x) = f(x).
F(x) is right continuous.
The Cumulative Distribution Function F(x)
[Figure: graph of the step function F(x); vertical axis marked at 1/4 and 3/4, horizontal axis at x = 0, 1, 2]
Example:
f(x) = (3/8)x² for 0 ≤ x ≤ 2, and f(x) = 0 otherwise.
(i) Sketch the probability density function f(x).
(ii) Find F(1).
F(1) = P(X ≤ 1) = ∫_0^1 (3/8)x² dx = (3/8)·[x³/3]_0^1 = 1/8
In general,
F(x) = 0 for x < 0,
F(x) = x³/8 for 0 ≤ x ≤ 2,
F(x) = 1 for x > 2.
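The cdf, the value F(1), and the relationship F′(x) = f(x) can all be checked symbolically, for example:

import sympy as sp

x, t = sp.symbols("x t", nonnegative=True)
f = sp.Rational(3, 8) * t**2          # density on 0 <= t <= 2

# F(x) = integral_0^x f(t) dt for 0 <= x <= 2.
F = sp.integrate(f, (t, 0, x))
print(F)              # x**3/8
print(F.subs(x, 1))   # 1/8, matching F(1) above
print(sp.diff(F, x))  # 3*x**2/8, i.e. differentiating the cdf recovers the density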
Example: The life expectancy (in days) of an electronic component has density function
f(x) = 1/x² for x ≥ 1, and f(x) = 0 for x < 1.
(a) Find the cdf for the life expectancy of the electronic component.
The cdf is F(x) = P(X ≤ x), for x ∈ ℝ.
If x ≥ 1:
F(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t) dt = ∫_1^x (1/t²) dt = [−1/t]_1^x = 1 − 1/x
Therefore
F(x) = 0 for x < 1, and F(x) = 1 − 1/x for x ≥ 1.
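A similar symbolic check for this cdf:

import sympy as sp

t, x = sp.symbols("t x", positive=True)

# F(x) = integral_1^x (1/t**2) dt for x >= 1.
F = sp.integrate(1 / t**2, (t, 1, x))
print(F)              # 1 - 1/x
print(F.subs(x, 2))   # 1/2, i.e. half of the components fail within 2 days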
The Joint Distribution of Two Discrete Random Variables
Example: An experiment consists of three tosses of a fair
coin. The sample space
S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}
Let X be the number of heads and let Y be the number of tails before the first head appears. Then
Y(HHH) = 0, Y(HHT) = 0, Y(HTH) = 0, Y(HTT) = 0,
Y(THH) = 1, Y(THT) = 1, Y(TTH) = 2, Y(TTT) = 3,
so the possible values of Y are 0, 1, 2, and 3.
The probability distribution of Y is given by
Table 2

y        0     1     2     3
P(Y=y)   4/8   2/8   1/8   1/8
Tables 1 and 2 are fine for computing probabilities for X
alone and Y alone, but from these tables, there is no way
to know that P(X=1 and Y = 1)=1/8.
What do we need?
It seems clear that, in this example, we can construct a table of joint probabilities that gives us enough information.
For our example, recall that S = {HHH, HHT, HTH, HTT,
THH, THT, TTH, TTT}
Table 3

p(x, y)    y = 0   y = 1   y = 2   y = 3
x = 0      0       0       0       1/8
x = 1      1/8     1/8     1/8     0
x = 2      2/8     1/8     0       0
x = 3      1/8     0       0       0
Table 4

p(x, y)    y = 0   y = 1   y = 2   y = 3   P(X=x)
x = 0      0       0       0       1/8     1/8
x = 1      1/8     1/8     1/8     0       3/8
x = 2      2/8     1/8     0       0       3/8
x = 3      1/8     0       0       0       1/8
P(Y=y)     4/8     2/8     1/8     1/8     1

The column totals give the marginal distribution of Y:
P(Y = y) = ∑_x P(X = x, Y = y)
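The marginal sums can be checked directly on the Table 4 values with a small sketch, for instance:

from fractions import Fraction

e = Fraction(1, 8)   # one eighth
# Joint p.m.f. p(x, y) from Table 4 (rows x = 0..3, columns y = 0..3).
p = {
    (0, 0): 0*e, (0, 1): 0*e, (0, 2): 0*e, (0, 3): 1*e,
    (1, 0): 1*e, (1, 1): 1*e, (1, 2): 1*e, (1, 3): 0*e,
    (2, 0): 2*e, (2, 1): 1*e, (2, 2): 0*e, (2, 3): 0*e,
    (3, 0): 1*e, (3, 1): 0*e, (3, 2): 0*e, (3, 3): 0*e,
}

# Marginals: P(X = x) sums over y, P(Y = y) sums over x.
pX = {x: sum(p[(x, y)] for y in range(4)) for x in range(4)}
pY = {y: sum(p[(x, y)] for x in range(4)) for y in range(4)}
print([str(pX[x]) for x in range(4)])   # ['1/8', '3/8', '3/8', '1/8']
print([str(pY[y]) for y in range(4)])   # ['1/2', '1/4', '1/8', '1/8']  (i.e. 4/8, 2/8, 1/8, 1/8)

# Independence would require p(x, y) == P(X = x) * P(Y = y) in every cell.
print(all(p[(x, y)] == pX[x] * pY[y] for x in range(4) for y in range(4)))   # False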
a. Find P(X + Y = 2).
P(X + Y = 2) = p(0,2) + p(1,1) + p(2,0) = 0 + 2/8 + 0 = 2/8
b. Find P(X > Y).
P(X > Y) = p(1,0) + p(2,0) + p(2,1) = 0
c. Find the marginal p.m.f. of X.
P(X = 0) = ∑_{y=0}^{3} P(X = 0, Y = y) = p(0,0) + p(0,1) + p(0,2) + p(0,3) = 1/8 + 1/8 + 0 + 0 = 2/8
P(X = 1) = ∑_{y=0}^{3} P(X = 1, Y = y) = p(1,0) + p(1,1) + p(1,2) + p(1,3) = 0 + 2/8 + 2/8 + 0 = 4/8
P(X = 2) = ∑_{y=0}^{3} P(X = 2, Y = y) = p(2,0) + p(2,1) + p(2,2) + p(2,3) = 0 + 0 + 1/8 + 1/8 = 2/8
d. Find the conditional p.m.f. of X, given Y = 1.
p_X(x | Y = 1) = p(x, Y = 1) / p_Y(1)
But p_Y(1) = 3/8. Therefore
p_X(x | Y = 1) = (8/3)·p(x, Y = 1),   x = 0, 1, 2.
Thus
p_X(0 | 1) = (8/3)·p(0, Y = 1) = (8/3)(1/8) = 1/3
p_X(1 | 1) = (8/3)·p(1, Y = 1) = (8/3)(2/8) = 2/3
p_X(2 | 1) = (8/3)·p(2, Y = 1) = (8/3)(0) = 0
These probabilities can be represented in the following table:

x            0     1     2
P(x | Y=1)   1/3   2/3   0
e. Find P(0 ≤ X ≤ 1 | Y = 1).
P(0 ≤ X ≤ 1 | Y = 1) = ∑_{x=0,1} p_X(x | 1) = p_X(0 | 1) + p_X(1 | 1) = 1/3 + 2/3 = 1
f. Are X and Y independent?
X and Y are independent if
p(x, y) = p_X(x) · p_Y(y) for all x and y.
Notice: p(0, 0) = 1/8, while p_X(0) = 2/8 and p_Y(0) = 1/8, so p_X(0)·p_Y(0) = 1/32.
Hence p(0, 0) ≠ p_X(0)·p_Y(0). So, X and Y are not independent.
The Joint Distribution of Two Continuous Random Variables
[Michael J. Evans and Jeffrey S. Rosenthal]
Example: Let X and Y have the joint density function
f(x, y) = 4x²y + 2y⁵ for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and f(x, y) = 0 otherwise.
Clearly f(x, y) ≥ 0, and
∫_0^1 ∫_0^1 (4x²y + 2y⁵) dx dy = ∫_0^1 [(4/3)x³y + 2xy⁵]_{x=0}^{x=1} dy = ∫_0^1 ((4/3)y + 2y⁵) dy
= [(4/6)y² + (2/6)y⁶]_0^1 = 4/6 + 2/6 = 1.
Hence, f is a joint density function.
EXAMPLE: (As in the previous example.) Let X and Y again have the joint density function
f(x, y) = 4x²y + 2y⁵ for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and f(x, y) = 0 otherwise.
Find the marginal densities of X and Y.
For 0 ≤ x ≤ 1,
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^1 (4x²y + 2y⁵) dy = 4x²·[y²/2]_0^1 + [(2/6)y⁶]_0^1 = 2x² + 1/3,
while for x < 0 or x > 1, f_X(x) = 0.
Similarly, for 0 ≤ y ≤ 1,
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^1 (4x²y + 2y⁵) dx = [(4/3)x³]_0^1·y + [2x]_0^1·y⁵ = (4/3)y + 2y⁵.
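Both marginal computations, and the fact that f integrates to 1, can be verified symbolically, for instance:

import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = 4 * x**2 * y + 2 * y**5   # joint density on the unit square

# Total probability: double integral over 0 <= x <= 1, 0 <= y <= 1.
print(sp.integrate(f, (x, 0, 1), (y, 0, 1)))   # 1

# Marginal densities, valid for 0 <= x <= 1 and 0 <= y <= 1 respectively.
print(sp.integrate(f, (y, 0, 1)))   # 2*x**2 + 1/3
print(sp.integrate(f, (x, 0, 1)))   # 4*y/3 + 2*y**5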
Example: Let X and Y have the joint density function
f(x, y) = 120x³y for x ≥ 0, y ≥ 0, x + y ≤ 1, and f(x, y) = 0 otherwise.
First check that f integrates to 1:
∫_0^1 ∫_0^{1−x} 120x³y dy dx = ∫_0^1 60x³(1 − x)² dx = 60 ∫_0^1 (x³ − 2x⁴ + x⁵) dx = 60·[1/4 − 2/5 + 1/6] = 1.
The marginal densities are
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_0^{1−x} 120x³y dy = 120x³·[y²/2]_0^{1−x} = 60x³(1 − x)² = 60(x³ − 2x⁴ + x⁵) for 0 ≤ x ≤ 1,
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^{1−y} 120x³y dx = 120·[x⁴/4]_0^{1−y}·y = 30y(1 − y)⁴ for 0 ≤ y ≤ 1.
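A similar symbolic check for this example, integrating over the triangular region:

import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = 120 * x**3 * y   # joint density on the triangle x >= 0, y >= 0, x + y <= 1

# Total probability: the inner integral in y runs from 0 to 1 - x.
print(sp.integrate(f, (y, 0, 1 - x), (x, 0, 1)))   # 1

# Marginal of X (integrate out y) and marginal of Y (integrate out x).
print(sp.factor(sp.integrate(f, (y, 0, 1 - x))))   # 60*x**3*(x - 1)**2
print(sp.factor(sp.integrate(f, (x, 0, 1 - y))))   # 30*y*(y - 1)**4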