EC721
Lecture 2
That is difficult! How can we deal with unpredictable behavior?
• Though random signals evolve in time in an unpredictable manner, they can be described with statistical averages instead of deterministic equations.
1. We will learn how to statistically describe and model a signal
2. We will apply the theory we study to communication systems
Can you think of an example
for a random experiment?
• MATH 6….
– A Random Experiment is an experiment whose
outcome cannot be predicted with certainty
1. Tossing a coin
2. Rolling a die
What is a Random Variable?
• Random Variable (R.V.): is a variable whose possible values are
numerical outcomes of a random phenomenon. There are two
types of random variables, discrete and continuous.
– Discrete Random Variable: A discrete random variable is one which may take on only a countable number of distinct values such as 0, 1, 2, 3, 4, … (Examples: the Friday night attendance at a cinema, rolling a die)
– Continuous Random Variable: A continuous random variable can take on any value along a continuous range (Examples: height, weight, the amount of sugar in an orange)
• Notation:
– Random Variables are denoted by upper-case letters (X)
– Individual outcomes of a RV are denoted by lower-case letters (x)
What is a Probability Distribution?
A probability distribution is a mapping of all the possible values of a random variable to their corresponding probabilities for a given sample space. Equivalently, it is a table, equation, or graph that links each outcome of a statistical experiment with its probability of occurrence.
– The probability distribution is denoted as P(X = x), the probability that the random variable X equals a particular value x, which can be written in short form as P(x)
– Continuous Probability Distribution: assigns a density at individual points; the probability of a range is obtained by integrating the density function
– The distribution function satisfies F_X(∞) = 1 (the sure event) and F_X(−∞) = 0 (the probability of the impossible event)
Probability Density Function
• Another way to describe probabilities is through the derivative of the distribution function, called the Probability Density Function (PDF):
f_X(x) = dF_X(x)/dx
• Conversely, probabilities of ranges follow by integration:
F_X(x₂) − F_X(x₁) = ∫_{x₁}^{x₂} f_X(x) dx
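The PDF–CDF relation above can be checked numerically. The sketch below uses the standard normal distribution as an illustrative choice (it is not one of the slide examples): the difference of the CDF at two points is compared with a Riemann-sum integral of the density.

```python
# Numerical check of F_X(x2) - F_X(x1) = ∫ f_X(x) dx over [x1, x2],
# using the standard normal density as an illustrative example.
import math

def normal_pdf(x):
    """Standard normal density f_X(x)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def normal_cdf(x):
    """Standard normal distribution function F_X(x), via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

x1, x2 = -1.0, 1.0
# midpoint-rule approximation of the integral of the density over [x1, x2]
n = 10000
dx = (x2 - x1) / n
integral = sum(normal_pdf(x1 + (k + 0.5) * dx) for k in range(n)) * dx

print(normal_cdf(x2) - normal_cdf(x1))  # ≈ 0.6827
print(integral)                         # agrees with the CDF difference
```

Both numbers come out as the familiar "68%" probability of falling within one standard deviation of the mean.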
Examples of Continuous CDF and PDF
What is the Relative Frequency Approach?
What are the Axioms of Probability?
• S denotes the SAMPLE SPACE, which corresponds to the set of all sample points.
• Sure event: an event that contains all the sample points of S, so it always occurs.
• Impossible event: an event that contains no sample points, so it never occurs.
• A single sample point is called an ELEMENTARY EVENT.
What is the Definition of Probability?
P(S) = 1
0 ≤ P(A) ≤ 1
If A + B is the union of two MUTUALLY EXCLUSIVE events, their intersection is empty, and
P(A + B) = P(A) + P(B)
Notation for the joint event: P(A, B) = P(AB) = P(A ∩ B)
What are the Properties of Probability?
• Property 1:
P(Ā) = 1 − P(A)
where Ā denotes the complement of A.
What are the Properties of Probability?
• Property 2:
– If M mutually exclusive events A₁, A₂, A₃, …, A_M have the exhaustive property
A₁ + A₂ + A₃ + … + A_M = S
then
P(A₁) + P(A₂) + P(A₃) + … + P(A_M) = 1
and, when the M events are equally likely,
P(Aᵢ) = 1/M,  i = 1, 2, 3, …, M
What are the Properties of Probability?
• Property 3:
– When the events A and B are not mutually exclusive, the probability of the union event "A or B" equals
P(A ∪ B) = P(A) + P(B) − P(AB)
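Property 3 can be verified exactly by enumerating a small sample space. The sketch below uses one roll of a fair die, with two illustrative events chosen so that they overlap.

```python
# Exact check of P(A ∪ B) = P(A) + P(B) - P(AB) by enumerating the
# sample space of one die roll (events A, B chosen for illustration).
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}          # sample space of a fair die
A = {2, 4, 6}                   # event: outcome is even
B = {4, 5, 6}                   # event: outcome is greater than 3

def prob(event):
    """Probability as favorable / total sample points (equally likely)."""
    return Fraction(len(event), len(S))

lhs = prob(A | B)                        # P(A ∪ B) directly
rhs = prob(A) + prob(B) - prob(A & B)    # via Property 3
print(lhs, rhs)  # 2/3 2/3
```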
What is the Joint Probability?
• P(AB) is referred to as the joint probability, and it has the following relative frequency interpretation:
P(AB) = lim_{n→∞} N_n(AB)/n
• where N_n(AB) denotes the number of times the events A and B occur simultaneously in n trials of the experiment.
What is the Conditional Probability and Bayes' Theorem?
• P(A/B) and P(B/A) are referred to as conditional probabilities.
• Suppose we perform an experiment that involves a pair of events A and B; P(B/A) denotes the probability of event B, given that event A has occurred.
– Provided that A has non-zero probability, the conditional probability P(B/A) is defined by
P(B/A) = P(AB)/P(A)
– Bayes' Theorem then relates the two conditional probabilities:
P(A/B) = P(B/A) P(A) / P(B)
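The two definitions can be checked on the same kind of small die-roll sample space (the events are illustrative, not from the slides): compute P(B/A) from the definition, then recover P(A/B) through Bayes' theorem and compare with a direct computation.

```python
# Check of P(B/A) = P(AB)/P(A) and of Bayes' theorem
# P(A/B) = P(B/A) P(A) / P(B), on an illustrative die-roll sample space.
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # event: outcome is even
B = {4, 5, 6}        # event: outcome is greater than 3

def prob(event):
    return Fraction(len(event), len(S))

p_b_given_a = prob(A & B) / prob(A)            # definition: (1/3)/(1/2) = 2/3
p_a_given_b = p_b_given_a * prob(A) / prob(B)  # Bayes' theorem
print(p_b_given_a, p_a_given_b)
print(prob(A & B) / prob(B))  # direct P(A/B), agrees with Bayes' route
```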
Set Theory and Probabilities
Example of Intersected Events
What is Statistical Average and How to Compute it?
• The statistical average determines the average behavior of the outcomes arising in a random experiment. It is also known as the Expected Value, Mean, or First Moment.
• Recall the time average of a signal x(t):
⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt
What is Statistical Average and How to Compute it?
• The statistical average for a random variable X is defined by
m_X = E[X] = ∫_{−∞}^{∞} x f_X(x) dx
• which plays the role for random variables that the time average ⟨x(t)⟩ plays for signals.
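The defining integral can be evaluated numerically. The sketch below takes a uniform density on (0, 2) as an illustrative choice, for which the mean is known to be 1, and approximates ∫ x f_X(x) dx by a midpoint sum.

```python
# Numerical evaluation of m_X = E[X] = ∫ x f_X(x) dx for a uniform
# density on (0, 2), where the mean is known to be 1 (illustrative choice).

a, b = 0.0, 2.0
f = lambda x: 1.0 / (b - a)      # uniform density on (a, b)

n = 20000
dx = (b - a) / n
# midpoint-rule sum of x * f(x) over the support
mean = sum((a + (k + 0.5) * dx) * f(a + (k + 0.5) * dx) for k in range(n)) * dx
print(mean)  # ≈ 1.0
```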
Function of a Random Variable
• Let X denote a random variable, and g(X) denote a real-valued function of that random variable. Then
Y = g(X)
is itself a random variable!!
How about the Central Moment?
• Simply defined as the moments of the difference between the random variable X and its mean:
E[(X − μ_X)ⁿ] = ∫_{−∞}^{∞} (x − μ_X)ⁿ f_X(x) dx
• For n = 1 the central moment is ?? For n = 2 the 2nd central moment is referred to as the VARIANCE!!
What is the Variance?
• The variance of a random variable X is, in some sense, a measure of the variable's randomness!!
• By specifying the variance, we essentially constrain the effective width of the probability density function of the random variable about the mean.
var[X] = σ_X² = E[(X − μ_X)²] = ∫_{−∞}^{∞} (x − μ_X)² f_X(x) dx
What is the Relation between the Variance and the Mean-Square Value?
σ_X² = E[(X − μ_X)²]
= E[X² − 2 μ_X X + μ_X²]
= E[X²] − 2 μ_X E[X] + μ_X²
= E[X²] − μ_X²
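The identity var[X] = E[X²] − μ_X² can be confirmed on a small discrete distribution (the values and probabilities below are purely illustrative): compute the variance directly from the central moment and via the mean-square value, and compare.

```python
# Checking var[X] = E[X²] - μ_X² on a small discrete distribution
# (values and probabilities chosen for illustration).

xs = [1.0, 2.0, 3.0, 4.0]
ps = [0.1, 0.2, 0.3, 0.4]

mean = sum(x * p for x, p in zip(xs, ps))                      # E[X] = 3.0
var_direct = sum((x - mean) ** 2 * p for x, p in zip(xs, ps))  # E[(X-μ)²]
ex2 = sum(x * x * p for x, p in zip(xs, ps))                   # E[X²] = 10.0
print(var_direct, ex2 - mean ** 2)  # both 1.0
```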
What is the Characteristic Function?
• It is the inverse Fourier transform of the probability density function (with the correspondence v ↔ 2πf and x ↔ t), given by
φ_X(v) = E[exp(jvX)] = ∫_{−∞}^{∞} f_X(x) exp(jvx) dx
• Joint PDF:
f_{X,Y}(x, y) = ∂²F_{XY}(x, y) / ∂x ∂y
Properties of Joint Distributions?
• Extending the previous properties to the joint case, it follows that
F_{XY}(−∞, −∞) = 0
F_{XY}(x, −∞) = 0
F_{XY}(−∞, y) = 0
F_{XY}(x, ∞) = F_X(x)   (marginal distributions)
F_{XY}(∞, y) = F_Y(y)
F_{XY}(∞, ∞) = 1
∫∫ f_{X,Y}(x, y) dx dy = 1
• Also, the marginal densities are
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx
How about the Conditional Densities?
• Bayes' rule for densities:
f_{X/Y}(x/y) = f_{X,Y}(x, y) / f_Y(y)
• When X and Y are independent, the conditional density is identical to the unconditional density:
f_{X/Y}(x/y) = f_X(x)
What are Joint Moments?
• Consider a pair of random variables X and Y; the statistical averages of importance in this case are the joint moments, given by
E[XⁱYᵏ] = ∫∫ xⁱ yᵏ f_{X,Y}(x, y) dx dy
Uniform Distribution
• A random variable X is said to be uniformly distributed over the interval (a, b) if its probability density function is
f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise
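As a quick sanity check on the uniform density (with illustrative endpoints a = 2, b = 5), the sketch below verifies numerically that the density integrates to 1 and that the mean comes out as the standard result (a + b)/2.

```python
# Sketch of the uniform pdf on (a, b): check it integrates to 1 and that
# the mean is (a + b)/2 (standard results; the endpoints are illustrative).

a, b = 2.0, 5.0

def f(x):
    """Uniform density on (a, b)."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

n = 30000
dx = (b - a) / n
mids = [a + (k + 0.5) * dx for k in range(n)]
total = sum(f(x) for x in mids) * dx        # should be 1
mean = sum(x * f(x) for x in mids) * dx     # should be (a + b) / 2 = 3.5
print(total, mean)  # ≈ 1.0, 3.5
```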
Normal Distribution Main Characteristics
• Two normal curves with different means (but the same standard deviation)
– The curves are shifted left and right
• Two normal curves with different standard deviations (but the same mean)
– One curve is taller and narrower, the other shorter and wider
How to get the probability for a Normal
Distribution?
Exponential Distribution
• The probability density function is given by
f_X(x) = (1/λ) exp(−x/λ),  x ≥ 0 and λ > 0
Rayleigh Distribution
• Used widely in wireless communication to model the fading a signal undergoes.
• The density function is defined as follows:
f_X(x) = (x/σ²) exp(−x²/(2σ²)),  x ≥ 0, σ > 0
E[X] = σ√(π/2);  Var[X] = (2 − π/2) σ²
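The stated Rayleigh moments can be checked by Monte-Carlo simulation. The sketch below draws samples with the standard inverse-CDF method X = σ√(−2 ln U), U uniform on (0, 1]; the value σ = 2 and the sample size are arbitrary illustrative choices.

```python
# Monte-Carlo check of the Rayleigh moments E[X] = σ√(π/2) and
# Var[X] = (2 - π/2)σ², sampling X = σ√(-2 ln U) with U uniform on (0, 1]
# (the standard inverse-CDF sampler; σ = 2 is an arbitrary choice).
import math
import random

random.seed(0)
sigma = 2.0
n = 200000
# 1 - random() lies in (0, 1], avoiding log(0)
samples = [sigma * math.sqrt(-2.0 * math.log(1.0 - random.random()))
           for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
print(mean, sigma * math.sqrt(math.pi / 2))   # sample vs theory, ≈ 2.507
print(var, (2 - math.pi / 2) * sigma ** 2)    # sample vs theory, ≈ 1.717
```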
Why Transformation of Random Variables and How?
• A problem that often arises in the statistical characterization of communication systems is that of determining the pdf of a random variable y related to another random variable x by the transformation
y = g(x)
• There are two cases to be considered: a one-to-one (monotone) transformation or a one-to-many (non-monotone) transformation.
Why Transformation of Random Variables and How?
• In the case of a one-to-many transformation, to obtain the pdf of y from x given that y = g(x), we do the following:
– Find the roots x_k of y = g(x)
– Find dy/dx (the Jacobian of the transformation)
– Evaluate the following equation at every root:
f_Y(y) = Σ_k f_X(x_k) / |dy/dx|_{x = x_k},  x_k = g⁻¹(y), with the sum running over the number of roots
How about the one-to-one case? How can we change the above equation??
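The three steps above can be sketched on an illustrative (non-slide) example: Y = X² with X uniform on (−1, 1). The roots of y = x² are x_k = ±√y, |dy/dx| = 2|x| = 2√y, and the root-sum formula gives f_Y(y) = [f_X(√y) + f_X(−√y)]/(2√y) = 1/(2√y) on 0 < y < 1. The code checks this result by comparing a probability computed from f_Y against the same probability computed directly from X.

```python
# Transformation of a random variable: Y = X² with X uniform on (-1, 1).
# Roots ±√y, |dy/dx| = 2√y, so the root-sum formula gives f_Y(y) = 1/(2√y).
import math

f_X = lambda x: 0.5 if -1 <= x <= 1 else 0.0   # uniform density on (-1, 1)

def f_Y(y):
    """pdf of Y = X² obtained from the root-sum formula (0 < y < 1)."""
    r = math.sqrt(y)
    return (f_X(r) + f_X(-r)) / (2 * r)

# sanity check: P(Y <= 0.25) from f_Y should equal P(-0.5 <= X <= 0.5) = 0.5
n = 200000
dy = 0.25 / n
p = sum(f_Y((k + 0.5) * dy) for k in range(n)) * dy
print(p)  # ≈ 0.5
```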
What is a Random Process?
• It is an extension of the concepts associated
with random variables when the time
parameter is brought into account.
• Random processes have 2 properties:
– They are functions of time
– Before conducting an experiment, it is not possible
to exactly define the waveform that will be
observed in the future.
Terminologies Used
in Random Processes
• For a fixed sample point sᵢ, the graph of the function X(t, sᵢ) versus time t is called a Realization or Sample Function of the random process, for simplicity denoted by
xᵢ(t) = X(t, sᵢ)
Then what is the Definition
of a Random Process?
Difference Between Random Variable and
Random Process
How to Compute the Mean of
a Random Process?
• The mean of a random process X(t) is the expectation of the random variable obtained by observing the process at time t: μ_X(t) = E[X(t)].
• For a stationary process the mean is constant, and the autocorrelation depends only on the time difference:
R_X(t₁, t₂) = R_X(t₂ − t₁)
Properties of Autocorrelation Function
• The autocorrelation function of a stationary process can be redefined as follows:
R_X(τ) = E[X(t + τ) X(t)]
• Properties:
– By setting τ = 0 the mean-square value of the process is obtained:
R_X(0) = E[X²(t)]
– The autocorrelation function is an even function:
R_X(τ) = R_X(−τ)
– The autocorrelation function has its maximum magnitude at τ = 0:
|R_X(τ)| ≤ R_X(0)
– The more rapidly the random process changes with time, the more rapidly the autocorrelation decreases from its maximum value.
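These properties can be checked numerically on a simple process. The sketch below uses X(t) = cos(t + Θ) with Θ uniform on (0, 2π) (an illustrative choice), for which the ensemble average works out to R_X(τ) = cos(τ)/2, independent of t.

```python
# Numerical check of the autocorrelation properties for X(t) = cos(t + Θ),
# Θ uniform on (0, 2π): R_X(τ) = E[X(t+τ)X(t)] = cos(τ)/2, independent of t.
import math

def R(tau, t=0.3, n=20000):
    """Ensemble average over the uniform phase Θ (midpoint rule)."""
    dth = 2 * math.pi / n
    return sum(
        math.cos(t + (k + 0.5) * dth) * math.cos(t + tau + (k + 0.5) * dth)
        for k in range(n)
    ) * dth / (2 * math.pi)

print(R(0.0))            # ≈ 0.5 = E[X²(t)], the mean-square value
print(R(1.0), R(-1.0))   # even function: both ≈ cos(1)/2 ≈ 0.2702
print(abs(R(1.0)) <= R(0.0))  # maximum magnitude at τ = 0 → True
```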
How to Compute the Autocovariance of a Random Process?
• The autocovariance of a random process X(t) is computed as follows:
C_X(t₁, t₂) = E[(X(t₁) − μ_X)(X(t₂) − μ_X)] = R_X(t₂ − t₁) − μ_X²
• It is obvious from the above equation that, like the autocorrelation function, the autocovariance of a strictly stationary random process depends only on the time difference!!
• The mean and the autocorrelation function are sufficient to describe the first two moments of the process.
How to Compute Cross-Correlation Functions?
• Consider the more general case of two random processes X(t) and Y(t) with autocorrelations R_X(t, u) and R_Y(t, u). The two cross-correlation functions are defined as:
R_XY(t, u) = E[X(t) Y(u)]
R_YX(t, u) = E[Y(t) X(u)]
What is a Stationary Process?
• To define a stationary process, consider a random process X(t) that is initiated at t = −∞, and let X(t₁), X(t₂), …, X(t_k) denote the random variables obtained by observing the random process at different times. The joint distribution of these random variables is f_{X(t₁),…,X(t_k)}(x₁, …, x_k).
• If all the observations are shifted in time to X(t₁+τ), X(t₂+τ), …, X(t_k+τ), the joint pdf becomes f_{X(t₁+τ),…,X(t_k+τ)}(x₁, …, x_k).
• The random process is said to be stationary in the strict sense, or Strictly Stationary (SSS), if:
f_{X(t₁+τ),…,X(t_k+τ)}(x₁, …, x_k) = f_{X(t₁),…,X(t_k)}(x₁, …, x_k)
– In other words, the random process is strictly stationary if the joint distribution of any set of random variables obtained by observing it is invariant with respect to the location of the origin t = 0.
– Equivalently, all statistical moments are independent of time, to order N → ∞.
• The random process is said to be Wide-Sense Stationary (WSS) if only the first 2 moments (N = 2) are independent of time:
E[X(t)] = μ_X = const
R_X(t₁, t₂) = R_X(τ)
What is an Ergodic Process?
• Ensemble average: taken across the random process (over all sample functions) at a fixed time.
• Time average: taken along the random process (over one sample function).
• For an ergodic process, A SINGLE SAMPLE FUNCTION REPRESENTS ALL SAMPLE FUNCTIONS.
What is an Ergodic Process?
• An ergodic process is a subset of a stationary process; in other words, for a random process to be ergodic it has to be stationary. However, the converse is not necessarily true!!
• Ergodicity can be defined in the most general sense by considering higher-order statistics.
• A stationary random process X(t) is said to be ergodic if the time averages of any sample function are equal to the corresponding ensemble averages (expectations). For the mean, this is known as Mean Ergodicity:
⟨x(t)⟩ = E[X(t)]
lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt = ∫_{−∞}^{∞} x f_X(x) dx
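Mean ergodicity can be illustrated numerically: fixing the phase of X(t) = cos(t + Θ) selects a single realization, and its time average over a long window matches the ensemble mean E[X(t)] = 0. The sinusoidal process and the value of θ₀ below are illustrative choices, not slide material.

```python
# Mean-ergodicity check on X(t) = cos(t + Θ): the time average of one
# realization (one fixed phase θ0) matches the ensemble mean E[X(t)] = 0.
import math

theta0 = 1.234                 # fixing the sample point selects one realization
T = 200 * math.pi              # long averaging window: 100 full periods
n = 200000
dt = T / n
time_avg = sum(
    math.cos(-T / 2 + (k + 0.5) * dt + theta0) for k in range(n)
) * dt / T
print(time_avg)  # ≈ 0, the ensemble mean
```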
Properties of PSD
• Properties are:
• Example: averaging the double-frequency term over a phase Θ uniform on (0, 2π) gives zero:
(1/2π) ∫₀^{2π} 50 cos(400t + 200τ + 2θ) dθ = 0