Advanced Digital Communication
Probability
Random Experiment
A random experiment is one whose outcome cannot be predicted with certainty.
Examples: throwing a die, flipping a coin, and drawing a card from a deck.
Procedure (e.g., flipping a coin) → Outcome (e.g., the value observed, head or tail, after flipping the coin) → Sample Space (the set of all possible outcomes)
Sample Space
Sample space: the set of all possible outcomes, denoted by S. Outcomes are
denoted by ω's, and each ω lies in S, i.e., ω ∈ S.
A sample space can be discrete or continuous.
Events are subsets of the sample space for
which measures of their occurrences, called
probabilities, can be defined or determined.
Example
For the roll of a die, various events can be defined: the outcome is an even number of dots, the outcome is fewer than 4 dots, the outcome is more than 3 dots, etc.
Probability
Consider rolling a die with six possible outcomes.
The sample space S consists of all these possible outcomes, i.e., S = {1, 2, 3, 4, 5, 6}.
Consider an event A which is a subset of S, A = {2, 4}. Aᶜ is the complement of A, which consists of all points of S not in A, i.e., Aᶜ = {1, 3, 5, 6}.
Two events are mutually exclusive if they have no points in common; A and Aᶜ are mutually exclusive events.
The union of two events is the event which contains all sample points in either event, e.g., A ∪ Aᶜ = S.
If B = {1, 3, 6} and C = {1, 2, 3} are events of S, their intersection is the event consisting of the points common to both, i.e., E = B ∩ C = {1, 3}.
For mutually exclusive events the intersection is the null event, i.e., A ∩ Aᶜ = ∅.
P(A) is the probability of event A in S.
The probability of event A satisfies the condition P(A) ≥ 0.
The probability of the sample space is P(S) = 1.
Mutually exclusive events cannot both occur: their intersection is the null event and the probability of their union is the sum of the individual probabilities, i.e., P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B).
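For equally likely outcomes these axioms can be verified numerically; a minimal sketch using exact fractions, assuming the fair-die example:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # fair die: all outcomes equally likely

def P(event):
    # P(E) = |E| / |S| for equally likely outcomes
    return Fraction(len(event), len(S))

A, B = {2, 4}, {1, 3}    # mutually exclusive events
print(P(S))              # P(S) = 1
print(P(A & B))          # intersection is null -> probability 0
print(P(A | B))          # equals P(A) + P(B) = 1/3 + 1/3 = 2/3
```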
Joint event and probabilities
Perform two experiments together and consider their outcomes, e.g., a single toss of two dice.
The sample space consists of 36 two-tuples (i, j), where i, j = 1, 2, …, 6.
If one experiment has outcomes Aᵢ, i = 1, 2, …, n, and the second experiment has outcomes Bⱼ, j = 1, 2, …, m, then the combined experiment has joint outcomes (Aᵢ, Bⱼ), i = 1, 2, …, n, j = 1, 2, …, m.
The joint probability satisfies the condition 0 ≤ P(Aᵢ, Bⱼ) ≤ 1.
The outcomes Bⱼ, j = 1, 2, …, m are mutually exclusive, so ∑_{j=1}^{m} P(Aᵢ, Bⱼ) = P(Aᵢ).
Similarly, we have ∑_{i=1}^{n} P(Aᵢ, Bⱼ) = P(Bⱼ).
If the outcomes of the two experiments are mutually exclusive, ∑_{i=1}^{n} ∑_{j=1}^{m} P(Aᵢ, Bⱼ) = 1.
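These marginalization identities can be sketched for the fair two-dice example:

```python
from fractions import Fraction
from itertools import product

# Joint pmf of two fair dice: P(A_i, B_j) = 1/36 for each two-tuple (i, j).
joint = {(i, j): Fraction(1, 36) for i, j in product(range(1, 7), repeat=2)}

# Summing over j recovers the marginal P(A_i) of the first die.
P_first = {i: sum(joint[(i, j)] for j in range(1, 7)) for i in range(1, 7)}
print(P_first[3])              # 1/6
print(sum(joint.values()))     # all joint probabilities sum to 1
```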
Important properties of probability measures
P(Aᶜ) = 1 − P(A), where Aᶜ denotes the complement of A. This property implies that P(Aᶜ) + P(A) = 1, i.e., something has to happen.
P(∅) = 0 (the impossible event never occurs).
P(A ∪ B) = P(A) + P(B) − P(A ∩ B). Note that if two events A and B are mutually exclusive, then P(A ∪ B) = P(A) + P(B); otherwise the nonzero common probability P(A ∩ B) needs to be subtracted off.
If A ⊂ B, then P(A) ≤ P(B). This says that if event A is contained in B, then the occurrence of A means B has occurred, but the converse is not true.
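Each of these properties can be verified on the die sample space; a minimal sketch with exact fractions:

```python
from fractions import Fraction

S = set(range(1, 7))

def P(event):
    # equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(S))

A, B = {1, 2, 3}, {2, 3, 4, 5}
print(P(S - A) == 1 - P(A))                  # complement rule
print(P(A | B) == P(A) + P(B) - P(A & B))    # inclusion-exclusion
print(P(set()) == 0)                         # null event has probability 0
print(P({2}) <= P({2, 4}))                   # A subset of B -> P(A) <= P(B)
```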
Conditional Probability
Suppose event B has occurred and we wish to determine the probability of occurrence of event A.
The conditional probability of event A given the occurrence of event B is given as:
P(A|B) = P(A, B)/P(B), provided P(B) > 0.
In a similar way, B conditioned on the occurrence of A is given by P(B|A) = P(A, B)/P(A), provided P(A) > 0.
P(A, B) is the probability of the simultaneous occurrence of A and B, i.e., of A ∩ B. For mutually exclusive events, P(A|B) = 0. If A is a subset of B, then A ∩ B = A, so P(A|B) = P(A)/P(B).
If B is a subset of A, then A ∩ B = B, so P(A|B) = P(B)/P(B) = 1.
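The definition translates directly; a minimal sketch on the die sample space:

```python
from fractions import Fraction

S = set(range(1, 7))

def P(event):
    # equally likely outcomes: P(E) = |E| / |S|
    return Fraction(len(event), len(S))

A = {2, 4, 6}            # outcome is an even number of dots
B = {4, 5, 6}            # outcome is more than 3 dots
# P(A|B) = P(A, B) / P(B)
print(P(A & B) / P(B))   # 2/3: of the outcomes above 3, two are even
# If A is a subset of B, P(A|B) reduces to P(A) / P(B).
print(P({4} & B) / P(B) == P({4}) / P(B))
```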
Bayes Rule
Since P(A, B) = P(A|B)P(B) = P(B|A)P(A), we obtain
P(A|B) = P(B|A)P(A)/P(B),
where P(A), the prior, is the initial degree of belief in A; P(A|B), the posterior, is the degree of belief having accounted for B; and the quotient P(B|A)/P(B) represents the support B provides for A.
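Bayes rule can be sketched on a hypothetical binary channel (the events and all numbers below are chosen purely for illustration):

```python
# Hypothetical binary channel.
# Event A: a 1 was sent.  Event B: a 1 was received.
P_A = 0.5                 # prior: 1s and 0s assumed equally likely
P_B_given_A = 0.9         # a transmitted 1 is received correctly
P_B_given_Ac = 0.1        # a transmitted 0 is flipped into a 1

# Total probability: P(B) = P(B|A)P(A) + P(B|Ac)P(Ac)
P_B = P_B_given_A * P_A + P_B_given_Ac * (1 - P_A)

# Bayes rule: posterior P(A|B) = P(B|A)P(A) / P(B)
P_A_given_B = P_B_given_A * P_A / P_B
print(P_A_given_B)        # ≈ 0.9: receiving a 1 strongly supports A
```

The quotient P(B|A)/P(B) = 0.9/0.5 = 1.8 > 1 here, so observing B raises the belief in A.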
Statistically Independent
Consider two or more experiments, or repeated trials of the same experiment.
Consider the conditional probability P(A|B), and suppose A does not depend on B, so that P(A|B) = P(A).
Then P(A, B) = P(A|B)P(B) = P(A)P(B) is the joint probability of statistically independent events A and B.
This can be extended to three or more events, e.g., P(A, B, C) = P(A)P(B)P(C).
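Independence can be checked on the two-dice sample space; a minimal sketch:

```python
from fractions import Fraction
from itertools import product

# Two fair dice: the outcome of one roll does not depend on the other.
S2 = set(product(range(1, 7), repeat=2))

def P(event):
    # equally likely two-tuples: P(E) = |E| / 36
    return Fraction(len(event), len(S2))

A = {(i, j) for (i, j) in S2 if i == 6}       # first die shows 6
B = {(i, j) for (i, j) in S2 if j % 2 == 0}   # second die is even
# Statistical independence: P(A, B) = P(A) P(B)
print(P(A & B) == P(A) * P(B))                # True: 1/12 on both sides
```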
Random Variables
All useful message signals appear random; that is, the
receiver does not know, a priori, which of the possible
waveforms has been sent.
Let a random variable X(A) represent the functional
relationship between a random event A and a real number.
Notation - Capital letters, usually X or Y, are used to
denote random variables. Corresponding lower case
letters, x or y, are used to denote particular values of the
random variables X or Y.
Example: P(X ≤ 3) means the probability that the random variable X will take a value less than or equal to 3.
Types of Random Variables
Discrete Random Variable
Continuous Random Variable
Mixed Random Variable
Discrete Random Variable
Discrete random variables have a countable (finite or countably infinite) image, e.g.:
Sx = {0, 1}
Sx = {…, −3, −2, −1, 0, 1, 2, 3, …}
Probability Mass Function (pmf)
The probability mass function is the discrete probability density function: it provides the probability of a particular point in the sample space of a discrete random variable.
The probability mass function p(x) specifies the probability of each outcome x and has the properties:
p(x) ≥ 0
∑_x p(x) = 1
P(X = x) = p(x)
Cumulative Distribution Function (cdf)
The cdf specifies the probability that the random variable will assume a value less than or equal to a certain value x.
The cumulative distribution function, F(x), of a discrete random variable X with probability mass function p(x) is given by:
F(x) = P(X ≤ x) = ∑_{t ≤ x} p(t)
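Both definitions can be sketched in Python for a small pmf (the probabilities below match the example that follows):

```python
# pmf of a discrete random variable X taking values 0..3.
pmf = {0: 0.001, 1: 0.027, 2: 0.243, 3: 0.729}

# pmf properties: p(x) >= 0 and the probabilities sum to 1.
assert all(p >= 0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-12

def F(x):
    # cdf: F(x) = P(X <= x) = sum of p(t) over all t <= x
    return sum(p for t, p in pmf.items() if t <= x)

print(F(1))   # ≈ 0.028
print(F(3))   # ≈ 1.0
```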
Examples of cdf of discrete random variables
The cdfs of the discrete random variables generated by flipping a fair coin and by tossing a fair die are staircase functions, with a jump at each possible outcome.
Mean, Standard Deviation and Variance of a Discrete Random Variable X
Mean or Expected Value
μ = ∑_{all x} x p(x)
Standard Deviation
σ² = ∑_{all x} (x − μ)² p(x) = ∑_{all x} x² p(x) − μ²
σ = √σ²
Variance
var{X} = σ² = ∑_{all x} (x − μ)² p(x)
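These definitions translate directly into code; a quick sketch, checked against a fair die:

```python
def mean(pmf):
    # mu = sum over all x of x p(x)
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    # var{X} = sum over all x of (x - mu)^2 p(x)
    mu = mean(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}   # fair die
print(mean(die))                        # ≈ 3.5
print(variance(die))                    # 35/12 ≈ 2.917
```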
Example
The probability mass function of X is:

x      0      1      2      3
p(x)   0.001  0.027  0.243  0.729

The cumulative distribution function of X is:

x      0      1      2      3
F(x)   0.001  0.028  0.271  1.000
Example Contd.
[Figures: the probability mass function p(x) and the cumulative distribution function F(x) for this example.]
Example Contd.
μ = E(X) = ∑_{x=0}^{3} x p(x)
= 0 + (1)(0.027) + (2)(0.243) + (3)(0.729)
= 2.7
σ² = Var(X) = ∑_{x=0}^{3} (x − μ)² p(x)
= 0.00729 + 0.07803 + 0.11907 + 0.06561
= 0.27,
and σ = √0.27 = 0.5196
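The arithmetic above is easy to check numerically:

```python
# pmf from the example above.
pmf = {0: 0.001, 1: 0.027, 2: 0.243, 3: 0.729}

mu = sum(x * p for x, p in pmf.items())              # mean
var = sum((x - mu) ** 2 * p for x, p in pmf.items()) # variance
sigma = var ** 0.5                                   # standard deviation

print(mu)      # ≈ 2.7
print(var)     # ≈ 0.27
print(sigma)   # ≈ 0.5196
```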
Questions
What is the average value of a random variable with equally likely outcomes SX = {1, 6, 7, 9, 13}?
Ans: 7.2
What is the expected value of the random variable in the following figure?
[Figure: a pmf pX(x), with probability values 0.4, 0.3, 0.2, 0.1 over the range 1 to 13.]
Questions
Which of the following has higher variance
and why?
[Figure: two pmfs, pX(x) with E{X} = 5.2 and qX(x) with E{X} = 3.87, with probability values 0.5, 0.4, 0.4, 0.2, 0.1, 0.033 over the range 1 to 13.]
Calculate the variance for both cases and justify your answer.
Continuous Random Variable
There are physical systems that generate
continuous outcomes
Continuous random variables have an
uncountable image
Sx = (0, 1)
Sx = R
E.g., the noise voltage generated by an electronic amplifier.
In such cases the random variable is said to be a continuous random variable.
Example of cdf of continuous random variables
The cdf of a continuous random variable is a smooth, non-decreasing function of x.
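As a concrete sketch, assuming a uniform random variable on (0, 1) (a standard continuous example, not one given in the notes):

```python
def F(x):
    # cdf of a uniform(0, 1) random variable:
    # 0 for x < 0, x on [0, 1], 1 for x > 1 -- smooth and non-decreasing
    return min(max(x, 0.0), 1.0)

print(F(-1.0), F(0.25), F(2.0))   # 0.0 0.25 1.0
```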
Mixed Random Variables
Mixed random variables have an image which contains both continuous and discrete parts, e.g.,
Sx = {0} ∪ (0, 1)
Example of cdf of mixed random variables
The cdf of such a mixed random variable is a smooth, non-decreasing function over certain parts of the real line and contains jumps at a number of discrete values of x.
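A hypothetical mixed random variable with image {0} ∪ (0, 1) (my own illustrative choice, not from the notes): X = 0 with probability 1/2, otherwise uniform on (0, 1). Its cdf jumps by 1/2 at x = 0 and then grows smoothly:

```python
def F(x):
    # cdf of the mixed random variable described above
    if x < 0:
        return 0.0
    if x >= 1:
        return 1.0
    return 0.5 + 0.5 * x   # jump of 1/2 at x = 0, then a smooth ramp

print(F(-0.1), F(0.0), F(0.5), F(1.0))   # 0.0 0.5 0.75 1.0
```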