Spectral Estimation
Introduction
Signal spectral analysis: estimation of the power spectral density
The problem of spectral estimation is very broad and has applications that differ widely from one another.
Applications:
To study the vibrations of a system
To study the frequency stability of an oscillator
To estimate the position and number of signal sources in an acoustic field
To estimate the parameters of the vocal tract of a speaker
Medical diagnosis
Control system design
In general, to estimate and predict signals in time or in space
Study of the radio-frequency spectrum in a big city
Side by side there are the various radio and television channels, cell-phone signals, radar signals, etc.
The frequency ranges, if considered with sufficient bandwidth, are occupied by signals totally independent of each other, with different amplitudes and different statistical characteristics.
To analyze the spectrum, it seems logical to use a selective receiver that measures the energy content in each frequency interval.
Non-Parametric Spectral Analysis:
We will seek the most accurate possible estimate of these energies in the time available, without making any further assumptions and without looking for models of signal generation.
Study of the radio-frequency spectrum in a big city
Non-parametric spectral analysis is conceptually simple if you use the concept of ensemble average: if you have enough realizations of the signal, you just compute the discrete Fourier transform of each one and average the powers, component by component.
However, you rarely have numerous replicas of the signal; often a single replica is available, over an allotted time interval.
To determine the power spectrum, you therefore have to use additional assumptions such as stationarity and ergodicity.
Analysis of the speech signal
Consider the spectrum of an acoustic signal due to vibration, or of a voice signal. In this case the signal as a whole has a unique origin, and there will therefore be relationships between the contents of the various spectral bands.
Parametric Spectral Analysis:
One must first choose a model for the generation of the signal and then determine the parameters of the model itself.
For example, one will seek the parameters of a linear filter that, fed by a uniform-spectrum signal (white noise), produces a power spectrum similar to the spectrum under analysis.
Analysis of the speech signal
Obviously, the success of the technique depends on the quality and correctness of the parametric model chosen.
Valid models lead to a parsimonious signal description, that is, one characterized by the minimum number of parameters necessary. This leads to a better estimate of these parameters and therefore to optimal results.
Parametric spectral analysis leads to the identification of the model, and this opens a subsequent phase of study: to understand, and then possibly to control, the status and evolution of the system under observation.
Formal Problem Definition
Let y = {y(1), y(2), . . . , y(N)} be a second-order stationary random process.
GOAL: to determine an estimate φ̂(ω) of its power spectral density φ(ω) for ω ∈ [−π, +π]
Observation
We want |φ̂(ω) − φ(ω)| → 0 as N → ∞
The main limitation on the quality of most PSD
estimates is due to the quite small number of data
samples N usually available
Most commonly, N is limited by the fact that the signal
under study can be considered wide sense stationary
only over short observation intervals
Two possible ways
There are two main approaches
Non-Parametric Spectral Analysis: explicitly estimate the covariance or the spectrum of the process without assuming that the process has any particular structure.
Parametric Spectral Analysis: assume that the underlying stationary stochastic process has a certain structure which can be described using a small number of parameters. In these approaches, the task is to estimate the parameters of the model that describes the stochastic process.
Non-Parametric Estimation: Periodogram
The periodogram method was introduced by Schuster in 1898.
The periodogram method relies directly on the definition of the PSD:
φ̂p(ω) = (1/N) |Σ_{t=1..N} y(t) e^{−jωt}|²
It is a biased estimate: its expected value equals the true PSD convolved with WB(ω), the Fourier transform of the Bartlett (triangular) window.
Frequency Domain: the Bartlett window WB(ω)
Ideally, to have zero bias, we want WB(ω) = Dirac impulse δ(ω).
The main-lobe width decreases as 1/N.
For small values of N, WB(ω) may differ quite a bit from δ(ω).
If the unbiased ACS estimate is used instead, the corresponding window is rectangular, with transform WR(ω).
Summary Bias analysis
Note that, unlike WB(ω), WR(ω) can assume negative values for some values of ω, thus providing estimates of the PSD that can be negative at some frequencies.
The bias manifests itself in different ways:
Main-lobe width causes smearing (or smoothing): details in φ(ω) separated in f by less than 1/N are not resolvable.
periodogram resolution limit = 1/N
Sidelobe level causes leakage.
For small N, severe bias.
As N → ∞, WB(ω) → δ(ω), so φ̂(ω) is asymptotically unbiased.
Periodogram Variance
As N → ∞, the variance of the periodogram does not decrease (it tends to φ²(ω)): the periodogram is an inconsistent estimate and shows an erratic behavior as a function of ω.
The Blackman-Tukey method
Basic idea: weighted correlogram, with small weight
applied to the estimated covariances r(k) with large k
The weighting is applied through a lag window; in the frequency domain this corresponds to smoothing the periodogram by convolution with the transform of the lag window.
As J increases:
bias increases (more smoothing)
variance decreases (more averaging)
Non parametric estimation summary
Non-parametric spectral analysis is conceptually simple if you use the concept of ensemble average.
Goal is to estimate the covariance or the spectrum of
the process without assuming that the process has any
particular structure
Periodogram / Correlogram:
asymptotically unbiased, but inconsistent
None of the methods we have seen solves all the
problems of the periodogram
Parametric estimation…
Matlab Examples:
Periodogram
Exercise 1.a
Estimate the power spectral density of the signal
“flute2” by means of periodogram
Hint on periodogram:
the spectrum estimate given by the periodogram is
φ̂p(ω) = (1/N) |Σ_{t=1..N} y(t) e^{−jωt}|²
Matlab Examples:
Periodogram
Pseudocode:
phip = (1/N)*abs(fft(y,M)).^2;
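A minimal MATLAB sketch of this exercise; the file name flute2.wav, the 50 ms frame and the equation come from the exercise text, while the FFT size M and the plotting details are my own assumptions:
% Periodogram of a 50 ms frame of the signal (sketch)
[y, fs] = audioread('flute2.wav');         % load the signal
y = y(1:round(0.05*fs), 1);                % first 50 ms, first channel
N = length(y);
M = 4096;                                  % FFT size (zero padding), assumed
phip = (1/N)*abs(fft(y, M)).^2;            % periodogram estimate
f = (0:M-1)/M*fs;                          % frequency axis in Hz
plot(f(1:M/2), 10*log10(phip(1:M/2)));     % one-sided PSD in dB
xlabel('Frequency (Hz)'); ylabel('PSD (dB)');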
Matlab Examples:
Bias and variance of the periodogram
Exercise 1.b
Quantify the bias and variance of the periodogram
Hint on periodogram:
The periodogram is asymptotically unbiased but has a large variance, even for large N.
Matlab Examples:
Bias and variance of the periodogram
Goal: quantify the bias and variance of the periodogram
Pseudocode:
compute R realizations of N samples of white noise
e = randn(N,R);
for each realization:
filter the white noise with an LTI filter Y(z) = H(z)E(z) and compute the periodogram of the filtered realization
end
compute the ensemble mean of the R periodograms (phip is an R×M matrix, one periodogram per row)
phipmean = mean(phip);
Plot
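A possible implementation of this Monte Carlo check; the test filter H(z), the number of realizations R and the FFT size M are assumptions of this sketch, not values prescribed by the exercise:
% Bias/variance of the periodogram on filtered white noise (sketch)
N = 256; R = 100; M = 1024;                  % samples, realizations, FFT size (assumed)
b = 1; a = [1 -0.9];                         % assumed LTI filter H(z) = 1/(1 - 0.9 z^-1)
e = randn(N, R);                             % R realizations of white noise
phip = zeros(R, M);
for k = 1:R
    y = filter(b, a, e(:, k));               % colored realization
    phip(k, :) = ((1/N)*abs(fft(y, M)).^2).';% periodogram of this realization
end
phipmean = mean(phip);                       % ensemble mean: shows the bias
phipvar  = var(phip);                        % ensemble variance: stays large as N grows
w = 2*pi*(0:M-1)/M;
phitrue = abs(freqz(b, a, w)).^2;            % true PSD of unit-variance noise through H(z)
plot(w, 10*log10(phipmean), w, 10*log10(phitrue), '--');
legend('mean of the periodograms', 'true PSD');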
Matlab Examples:
Correlogram
Exercise 2
Estimate the power spectral density of the signal “flute2”
by means of correlogram.
Hint on correlogram:
the spectrum estimate given by the correlogram is
φ̂c(ω) = Σ_{k=−(N−1)..N−1} r̂(k) e^{−jωk}, where r̂(k) is the estimated ACS
Matlab Examples:
Correlogram
Goal: Estimate the power spectral density using the correlogram
Pseudocode:
load the file flute2.wav
consider 50ms of the input signal (y)
estimate ACS
[r lags] = xcorr(y, 'biased');
r = circshift(r,N);
phic = fft(r,M);
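A minimal MATLAB sketch of this exercise; the file name and frame length come from the exercise, while the choice M = 2N−1 and the plotting are my own assumptions:
% Correlogram of a 50 ms frame of the signal (sketch)
[y, fs] = audioread('flute2.wav');
y = y(1:round(0.05*fs), 1);                % first 50 ms, first channel
N = length(y);
[r, lags] = xcorr(y, 'biased');            % biased ACS estimate, lags -(N-1)..(N-1)
r = circshift(r, N);                       % move lag 0 to the first position
M = 2*N - 1;                               % FFT length equal to the ACS length (assumed)
phic = real(fft(r, M));                    % correlogram (imaginary part is round-off)
f = (0:M-1)/M*fs;
plot(f(1:floor(M/2)), 10*log10(abs(phic(1:floor(M/2)))));
xlabel('Frequency (Hz)'); ylabel('PSD (dB)');
With the biased ACS estimate, the correlogram coincides with the periodogram, so this plot should match the one of Exercise 1.a.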
Matlab Examples:
Correlogram
MATLAB Hint: Matlab provides the functions:
[r lag]=xcorr(x,'biased') that produces a
biased estimate of the autocorrelation (2N-1 samples)
of the stationary sequence “x”. “lag” is the vector of lag
indices [-N+1:1:N-1].
r = circshift(r,N) that circularly shifts the
values in the array r by N elements. If N is positive, the
values of r are shifted down (or to the right). If it is
negative, the values of r are shifted up (or to the left).
Matlab Examples:
Modified periodogram (Blackman-Tukey)
Exercise 3.a
Estimate the power spectral density of
the signal “flute2” by means of Blackman-Tukey
method.
Hints on B-T method:
The spectrum estimate given by the BT method is
φ̂BT(ω) = Σ_{k=−(N−1)..N−1} w(k) r̂(k) e^{−jωk}, where w(k) is a lag window applied to the estimated ACS
Matlab Examples:
Modified periodogram (Blackman-Tukey)
Goal: Estimate the power spectral density using the B-T method
Pseudocode:
load the file flute2.wav
consider 50ms of the input signal -->N = length(y);
estimate ACS
[r lags] = xcorr(y, 'biased');
window it with a Bartlett window of the same length
rw = r.*bartlett(2*N-1);
rw = circshift(rw,N);
phiBT = real(fft(rw,Nfft));
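A minimal MATLAB sketch of this exercise; the file name and windowing follow the pseudocode, while Nfft = 2N−1 and the plotting are my own assumptions:
% Blackman-Tukey estimate of a 50 ms frame of the signal (sketch)
[y, fs] = audioread('flute2.wav');
y = y(1:round(0.05*fs), 1);                % first 50 ms, first channel
N = length(y);
[r, lags] = xcorr(y, 'biased');            % biased ACS estimate
rw = r .* bartlett(2*N - 1);               % apply the Bartlett lag window
rw = circshift(rw, N);                     % lag 0 first
Nfft = 2*N - 1;                            % FFT length (assumed)
phiBT = real(fft(rw, Nfft));               % Blackman-Tukey PSD estimate
f = (0:Nfft-1)/Nfft*fs;
plot(f(1:floor(Nfft/2)), 10*log10(abs(phiBT(1:floor(Nfft/2)))));
xlabel('Frequency (Hz)'); ylabel('PSD (dB)');
In practice the lag window is often truncated to |k| much smaller than N, which gives more smoothing (lower variance, higher bias).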
Matlab Examples:
Modified periodogram (Blackman-Tukey)
Exercise 3.b
Goal: quantify the bias and variance of the BT method
Pseudocode:
compute R realizations of N samples of white noise
e = randn(N,R);
for each realization: filter the white noise with an LTI filter and compute its Blackman-Tukey estimate
end
compute the ensemble mean and variance of the R estimates
Plot
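A sketch along the same lines as Exercise 1.b, with the same assumed test filter, replacing the periodogram with the Blackman-Tukey estimate:
% Bias/variance of the Blackman-Tukey estimate (sketch)
N = 256; R = 100; Nfft = 2*N - 1;            % assumed sizes
b = 1; a = [1 -0.9];                         % assumed LTI filter, as in Exercise 1.b
e = randn(N, R);
w = bartlett(2*N - 1);                       % lag window
phiBT = zeros(R, Nfft);
for k = 1:R
    y  = filter(b, a, e(:, k));              % colored realization
    rw = circshift(xcorr(y, 'biased') .* w, N);
    phiBT(k, :) = real(fft(rw, Nfft)).';     % BT estimate for this realization
end
phimean = mean(phiBT);                       % ensemble mean: bias
phivar  = var(phiBT);                        % ensemble variance: lower than the periodogram's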
Matlab Examples:
Modified periodogram (Bartlett Method)
Exercise 4
Estimate the power spectral density of the signal
“flute2” by means of Bartlett method.
Hint on Bartlett method :
split up the available sample of N observations into L = N/M
subsamples of M observations each, then average the
periodograms obtained from the subsamples for each value of
ω.
Matlab Examples:
Modified periodogram (Bartlett Method)
Goal: Estimate the power spectral density using the Bartlett Method
Pseudocode:
load the file flute2.wav
consider 50ms of the input signal -->N = length(y);
define the number of subsequences L and the number of samples
for each of them M=ceil(N/L)
for each subsequence (l = 0, 1, ..., L-1):
consider the right samples: yl = y(1+l*M : M+l*M);
estimate its periodogram: (1/M)*abs(fft(yl)).^2
accumulate: phil = phil + (1/M)*abs(fft(yl)).^2;
end
average over the subsequences: phiB = phil/L;
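A minimal MATLAB sketch of this exercise; the number of subsequences L is an assumption, and M = floor(N/L) is used so that all segments fit inside the 50 ms frame:
% Bartlett's method: average of the periodograms of L subsequences (sketch)
[y, fs] = audioread('flute2.wav');
y = y(1:round(0.05*fs), 1);                % first 50 ms, first channel
N = length(y);
L = 8;                                     % number of subsequences (assumed)
M = floor(N/L);                            % samples per subsequence
phiB = zeros(M, 1);
for l = 0:L-1
    yl   = y(1 + l*M : M + l*M);           % l-th subsequence
    phiB = phiB + (1/M)*abs(fft(yl)).^2;   % accumulate its periodogram
end
phiB = phiB/L;                             % average over the L subsequences
f = (0:M-1)/M*fs;
plot(f(1:floor(M/2)), 10*log10(phiB(1:floor(M/2))));
xlabel('Frequency (Hz)'); ylabel('PSD (dB)');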
Matlab Examples:
Modified periodogram (Welch Method)
Exercise 5
Estimate the power spectral density of the signal
“flute2” by means of Welch method.
Hint on Welch method :
similar to Bartlett method but: allow overlap of subsequences
use data window for each periodogram
Matlab Examples:
Modified periodogram (Welch Method)
Goal: Estimate the power spectral density using the Welch Method
Pseudocode:
load the file flute2.wav
consider 50ms of the input signal -->N = length(y);
define:
the number of samples for each subsequence: M
the number of new samples for each subsequence: K=M/4
the number of subsequences: S= N/K - (M-K)/K;
the window: v = hamming(M) ;
P = (1/M)*sum(v.^2);
for each subsequence (s = 0, 1, ..., S-1):
consider the right samples: ys = y(1+s*K : M+s*K);
window the subsequence: v.*ys
estimate its periodogram: (1/(M*P))*abs(fft(v.*ys)).^2
accumulate: phis = phis + (1/(M*P))*abs(fft(v.*ys)).^2;
end
average over the subsequences: phiW = phis/S;
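A minimal MATLAB sketch of this exercise; the segment length M is an assumption, while the 75% overlap (K = M/4), the Hamming window and the normalization P follow the pseudocode above:
% Welch's method: averaged windowed periodograms of overlapping subsequences (sketch)
[y, fs] = audioread('flute2.wav');
y = y(1:round(0.05*fs), 1);                % first 50 ms, first channel
N = length(y);
M = 256;                                   % samples per subsequence (assumed)
K = M/4;                                   % hop size: 75% overlap
S = floor((N - M)/K) + 1;                  % number of subsequences
v = hamming(M);
P = (1/M)*sum(v.^2);                       % window power, for normalization
phiW = zeros(M, 1);
for s = 0:S-1
    ys   = y(1 + s*K : M + s*K);           % s-th (overlapping) subsequence
    phiW = phiW + (1/(M*P))*abs(fft(v.*ys)).^2;
end
phiW = phiW/S;                             % average of the windowed periodograms
f = (0:M-1)/M*fs;
plot(f(1:M/2), 10*log10(phiW(1:M/2)));
xlabel('Frequency (Hz)'); ylabel('PSD (dB)');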
Parametric Spectral Estimation
Consider a sequence of independent samples {xn} feeding a filter H(ω). If the transform of the sequence {xn} is denoted X(ω), the output is
Y(ω) = H(ω) X(ω)
so in the frequency domain the output power spectrum is φy(ω) = |H(ω)|² φx(ω).
When H is an IIR (all-pole) filter driven by white noise, the output is an autoregressive (AR) process, whose coefficients are linked to the autocorrelation by the Yule–Walker equations.
Yule–Walker equations
The Yule–Walker equations can be written in matrix form by collecting the autocorrelation samples and the model coefficients into vectors and matrices:
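The matrix form on the original slide is not available here; as a reference, the standard Yule–Walker system for an AR(p) model y(n) = −a(1) y(n−1) − ... − a(p) y(n−p) + e(n), with r(k) the autocorrelation and σ² the driving-noise variance, is:
\begin{bmatrix} r(0) & r(1) & \cdots & r(p-1) \\ r(1) & r(0) & \cdots & r(p-2) \\ \vdots & & \ddots & \vdots \\ r(p-1) & r(p-2) & \cdots & r(0) \end{bmatrix}
\begin{bmatrix} a(1) \\ a(2) \\ \vdots \\ a(p) \end{bmatrix}
= -\begin{bmatrix} r(1) \\ r(2) \\ \vdots \\ r(p) \end{bmatrix},
\qquad \sigma^2 = r(0) + \sum_{k=1}^{p} a(k)\, r(k)
This sign convention matches MATLAB's poly/filter and the AR example later in these notes (theta = −R⁻¹ r).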
Exercise 1:
Compute the first 3 samples [r0, r1, r2] of the autocorrelation of the process xn
Parametric spectral estimation of the MA process (order 1)
Parametric spectral estimation of the AR process (order 1)
Exercise 1: Solution
Autocorrelation: the signal xn is an MA process of order 2, so its autocorrelation is zero beyond lag 2; r0, r1, r2 follow from the filter coefficients.
MA(1) parametric spectral estimation: the generic FIR filter of an MA(1) estimate has transform B(ω) = b0 + b1 e^{−jω}. Matching its autocorrelation to r0 and r1 through the Yule–Walker relations gives two solutions, and we choose the minimum-phase one.
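As a reminder of why two solutions appear (a generic illustration, not the specific numbers of the original slide): for an MA(1) model x(n) = e(n) + b1 e(n−1) with E[e²] = σ², the Yule–Walker relations are
r(0) = \sigma^2 (1 + b_1^2), \qquad r(1) = \sigma^2 b_1 \;\Longrightarrow\; b_1^2 - \frac{r(0)}{r(1)}\, b_1 + 1 = 0
The two roots of this equation are reciprocals of one another; the minimum-phase choice is the root with |b1| ≤ 1, which places the zero of the FIR filter inside the unit circle.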
Exercises:
AR and MA spectral Estimation
Exercise 2:
Spectral estimation of complex sinusoidal waveform
Exercise 2: Solution
Compute the autocorrelation of the complex sinusoid, apply the Yule–Walker equations and derive the resulting power spectrum.
What happens when white noise is added to the signal?
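A small numerical illustration of this question (my own sketch, with assumed amplitude, frequency and noise power): an AR(1) Yule–Walker fit places the pole at r(1)/r(0); without noise the pole lies practically on the unit circle at angle ω0, while added white noise pulls it inside the circle and broadens the spectral peak.
% AR(1) Yule-Walker pole for a complex sinusoid, with and without noise (sketch)
N = 1024; w0 = 0.3*pi; sigma2 = 0.5;                   % assumed frequency and noise power
n = (0:N-1).';
x  = exp(1j*w0*n);                                     % noiseless complex sinusoid
xn = x + sqrt(sigma2/2)*(randn(N,1) + 1j*randn(N,1));  % same sinusoid in white noise
r  = xcorr(x,  1, 'biased');  p_clean = r(3)/r(2);     % pole = r(1)/r(0): |p| ~ 1, angle = w0
rn = xcorr(xn, 1, 'biased');  p_noisy = rn(3)/rn(2);   % noise shrinks |p| below 1
disp([abs(p_clean) angle(p_clean); abs(p_noisy) angle(p_noisy)])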
Autoregressive Model
Matlab Examples:
Autoregressive Model
Goal: Estimate the power spectral density of the signal y by means of
AR model
Consider the signal y defined by the difference equation:
y(n) = a1 y(n-1) + a2 y(n-2) + a3 y(n-3) + z(n)
Estimate {ap} and σz with an AR model (order p)
Plot estimated PSD and compare with the true PSD
MATLAB Hint: Matlab provides the functions:
[r lag]=xcorr(x,'biased') that produces a biased estimate of
the autocorrelation (2N-1 samples) of the stationary sequence “x”. “lag”
is the vector of lag indices [-N+1:1:N-1].
R=toeplitz(C,R) that produces a non-symmetric Toeplitz matrix
having C as its first column and R as its first row.
R=toeplitz(R) is a symmetric (or Hermitian) Toeplitz matrix.
Matlab Examples:
Autoregressive Model
Pseudocode:
Consider the signal y defined by the difference equation:
y(n) = a1 y(n-1) + a2 y(n-2) + a3 y(n-3) + z(n)
sigmae = 10;
a = poly([0.99 0.99*exp(j*pi/4) 0.99*exp(-j*pi/4)]);
b = 1 ;
z = sigmae*randn(N,1);
y = filter(b, a, z);
Estimate {ap} and σz with an AR model (order p)
n=3;
r = xcorr(y , 'biased');
Rx = toeplitz(r(N:N+n-1), r (N:-1:N-n+1));
rz = r(N+1:N+n ) ;
theta = -Rx^(-1)*rz;
varz = r(N) +sum(theta.*r(N-1:-1:N-n));
Plot estimated PSD and compare with the true PSD
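A possible way to complete this last step; the sketch assumes the variables from the pseudocode above (theta, varz, a, sigmae) are in the workspace, and the frequency-grid size M is an assumption:
% Compare the estimated AR PSD with the true PSD of the generating filter (sketch)
M = 1024;                                       % frequency grid (assumed)
w = 2*pi*(0:M-1)/M;
aest   = [1; theta(:)];                         % estimated A(z): [1 a1 a2 a3]
phiest = varz     * abs(freqz(1, aest, w)).^2;  % estimated PSD: sigma_z^2 / |A(w)|^2
phitru = sigmae^2 * abs(freqz(1, a,    w)).^2;  % true PSD of the generating AR filter
plot(w, 10*log10(phiest), w, 10*log10(phitru), '--');
legend('AR estimate', 'true PSD'); xlabel('\omega (rad/sample)'); ylabel('PSD (dB)');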
Spectral Estimation Application:
Linear Prediction Coding (Just a Hint)
The object of linear prediction is to estimate the output sequence from a linear combination of input samples, past output samples, or both:
ŷ(n) = Σ_{j=0..q} b(j) x(n−j) − Σ_{i=1..p} a(i) y(n−i)
which corresponds to the model (with a(0) = 1):
Σ_{i=0..p} a(i) y(n−i) = Σ_{j=0..q} b(j) x(n−j)
The error is: e(n) = y(n) − ŷ(n)
For the all-pole (AR) predictor, which uses only past output samples, this becomes
e(n) = Σ_{i=0..p} a(i) y(n−i)
Minimizing the total squared error Σ_n e²(n) with respect to the coefficients gives
Σ_n [ Σ_{i=0..p} a(i) y(n−i) ] y(n−j) = 0,   j = 1, ..., p
The required predictors are found by solving these equations.
Linear Prediction Coding
Orthogonality principle
In terms of the autocorrelation, the normal equations become
Σ_{i=0..p} a(i) r(i−j) = 0,   j = 1, 2, ..., p   (with a(0) = 1)
The orthogonality principle also states that the resulting minimum error is given by
E = Σ_{i=0..p} a(i) r(i)
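A minimal MATLAB sketch of an all-pole predictor obtained from these normal equations (my own illustration: the use of flute2.wav and the order p are assumptions, and the Signal Processing Toolbox functions lpc or levinson should give the same coefficients):
% All-pole linear prediction coefficients from the normal equations (sketch)
[y, fs] = audioread('flute2.wav');             % assumed test signal, as in the exercises
y = y(1:round(0.05*fs), 1);                    % first 50 ms, first channel
N = length(y);  p = 10;                        % prediction order (assumed)
r = xcorr(y, 'biased');                        % biased ACS, lag 0 at index N
R = toeplitz(r(N:N+p-1));                      % p x p autocorrelation matrix
aa = -R \ r(N+1:N+p);                          % a(1)..a(p) from sum_i a(i) r(i-j) = 0
E  = r(N) + sum(aa .* r(N+1:N+p));             % minimum prediction error
e  = filter([1; aa], 1, y);                    % prediction error (residual) sequence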