SSP Bayesian Estimation

The document discusses Bayesian estimation, highlighting its advantages in scenarios where the Minimum Variance Unbiased Estimator (MVUE) is not available. It covers concepts such as Minimum Mean Square Error (MMSE) estimators, Maximum A Posteriori (MAP) estimators, and Linear Minimum Mean Square Error (LMMSE) estimators, along with their derivations and applications in signal processing. Various cost functions and their corresponding Bayesian estimators are also presented, emphasizing the flexibility and computational challenges of these methods.

Statistical Signal Processing: Estimation

Bayesian Estimators

Tamanna Howlader
Institute of Statistical Research and Training
University of Dhaka
Why Bayesian Estimation?

- Useful in situations where MVUE cannot be found


- May provide more accurate estimators
Bayesian approach: assumes prior knowledge about the unknown parameter, i.e. the parameter is treated as a random variable with a prior pdf.

Example: x[n] = A + w[n], n = 0, 1, ..., N-1, where w[n] is white Gaussian noise and A is the unknown DC level.

How should one estimate A?


Bayesian MSE

Classical MSE: mse(Â) = E[(Â − A)²] = ∫ (Â − A)² p(x; A) dx

The classical MSE depends on A, so the estimator that minimizes the MSE also depends on A; we obtain a different MSE at each assumed value of A.

Bayesian MSE: Bmse(Â) = E[(A − Â)²] = ∫∫ (A − Â)² p(x, A) dx dA

Because A is averaged over its prior in the joint pdf p(x, A), the Bmse does not depend on A; there is a single value of Bmse for a given estimator.

Minimum MSE estimator

The MMSE estimator minimizes the Bayesian MSE.

Derivation: write the Bayesian MSE as

Bmse(Â) = ∫ [ ∫ (A − Â)² p(A|x) dA ] p(x) dx.

Since p(x) ≥ 0 for all x, Bmse(Â) is minimized by minimizing the inner integral for each x. Differentiating the inner integral with respect to Â and setting the result to zero yields

Â = E(A|x),

the mean of the posterior pdf.
Posterior vs Prior PDF

Recall: the MMSE estimator Â = E(A|x) requires the posterior pdf p(A|x), whereas the prior pdf p(A) summarizes our knowledge about A before any data are observed.

How does one obtain the posterior PDF? By Bayes' rule,

p(A|x) = p(x|A) p(A) / ∫ p(x|A) p(A) dA,

so the posterior combines the likelihood of the data with the prior.
Summary of the Bayesian approach

- Assign a prior pdf p(A) to the unknown parameter
- Form the posterior pdf p(A|x) from the observed data using Bayes' rule
- Compute Â = E(A|x) (MMSE estimator)
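To make the recipe concrete, here is a minimal numerical sketch (not part of the original slides; the uniform prior A ~ U[-A0, A0] and all parameter values are assumptions) that evaluates the posterior on a grid and computes its mean for the DC-level-in-WGN model:

import numpy as np

rng = np.random.default_rng(1)
N, sigma2, A0 = 10, 1.0, 3.0                        # assumed values
x = 1.5 + rng.normal(0.0, np.sqrt(sigma2), N)       # data with true A = 1.5

A_grid = np.linspace(-A0, A0, 2001)                 # support of the assumed uniform prior
loglik = -np.sum((x[:, None] - A_grid) ** 2, axis=0) / (2 * sigma2)
post = np.exp(loglik - loglik.max())                # uniform prior => posterior proportional to likelihood
post /= np.trapz(post, A_grid)                      # normalize numerically (Bayes' rule denominator)

A_mmse = np.trapz(A_grid * post, A_grid)            # posterior mean = MMSE estimate
print(f"sample mean = {x.mean():.3f}, MMSE estimate = {A_mmse:.3f}")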
Finding MMSE estimator: Example

Example:

Finding MMSE estimator: Example

Recall:
Finding MMSE estimator: Example

Example:

Likelihood of data:
Finding MMSE estimator: Example

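For the common textbook case of a Gaussian prior A ~ N(mu_A, sigma2_A) on the DC level (an assumption used only for this sketch; all parameter values are made up), the posterior is Gaussian and the MMSE estimator has the closed form Â = α x̄ + (1 − α) μ_A, where α = σ_A² / (σ_A² + σ²/N) and x̄ is the sample mean. A minimal check of the closed form against a numerical posterior mean:

import numpy as np

rng = np.random.default_rng(2)
N, sigma2 = 10, 1.0                    # data length and noise variance (assumed)
mu_A, sigma2_A = 0.5, 2.0              # Gaussian prior parameters (assumed)
A_true = rng.normal(mu_A, np.sqrt(sigma2_A))       # draw A from the assumed prior
x = A_true + rng.normal(0.0, np.sqrt(sigma2), N)   # x[n] = A + w[n]

# Closed-form MMSE estimate for the Gaussian-prior case
alpha = sigma2_A / (sigma2_A + sigma2 / N)
A_mmse = alpha * x.mean() + (1 - alpha) * mu_A

# Numerical posterior mean on a grid, for comparison
A_grid = np.linspace(mu_A - 10.0, mu_A + 10.0, 4001)
logpost = (-np.sum((x[:, None] - A_grid) ** 2, axis=0) / (2 * sigma2)
           - (A_grid - mu_A) ** 2 / (2 * sigma2_A))
post = np.exp(logpost - logpost.max())
post /= np.trapz(post, A_grid)
print(A_mmse, np.trapz(A_grid * post, A_grid))      # the two values agree closely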
Finding MMSE estimator: Example
Real life example:
MMSE for Bayesian general linear model

Example:
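A minimal numpy sketch of the MMSE estimator for a Bayesian general linear model, assuming the standard form x = Hθ + w with θ ~ N(μ_θ, C_θ) and w ~ N(0, C_w) independent of θ (the matrices and parameter values below are made-up illustrations):

import numpy as np

rng = np.random.default_rng(3)

# Assumed Bayesian linear model: x = H @ theta + w
N, p = 20, 2
H = np.column_stack([np.ones(N), np.arange(N, dtype=float)])   # example observation matrix
mu_theta = np.zeros(p)
C_theta = np.diag([4.0, 0.25])                                  # prior covariance (assumed)
C_w = 1.0 * np.eye(N)                                           # noise covariance (assumed)

theta_true = rng.multivariate_normal(mu_theta, C_theta)
x = H @ theta_true + rng.multivariate_normal(np.zeros(N), C_w)

# MMSE estimator: theta_hat = mu_theta + C_theta H^T (H C_theta H^T + C_w)^{-1} (x - H mu_theta)
S = H @ C_theta @ H.T + C_w
theta_hat = mu_theta + C_theta @ H.T @ np.linalg.solve(S, x - H @ mu_theta)
print("true:", theta_true, " estimate:", theta_hat)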
Concept of Bayes Risk

Recall: the MMSE estimator minimizes Bmse(θ̂) = E[(θ − θ̂)²], where the expectation is with respect to the joint pdf p(x, θ).

Let ε = θ − θ̂ denote the error for a particular realization of x and θ.

The MMSE estimator therefore minimizes E[C(ε)] with C(ε) = ε², where C(ε) is called the cost function and R = E[C(ε)] is called the Bayes risk.
Cost functions

Different cost functions yield different Bayesian estimators.

Common cost functions and estimators:

Cost function      Estimator
Quadratic          Mean of posterior (MMSE)
Absolute error     Median of posterior
Hit-or-miss        Mode of posterior (MAP)
Cost functions

Gaussian posterior: a Gaussian pdf is symmetric and unimodal, so its mean, median, and mode coincide; all three cost functions then lead to the same estimator.
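Conversely, for a skewed posterior the three estimators differ. A quick numerical illustration (the Gamma-shaped posterior below is an assumed example, not from the original slides):

import numpy as np
from scipy import stats

# Assumed skewed posterior purely for illustration: Gamma(shape=2, scale=1)
post = stats.gamma(a=2.0, scale=1.0)

mean = post.mean()             # quadratic cost   -> MMSE estimate
median = post.median()         # absolute error   -> median of posterior
mode = (2.0 - 1.0) * 1.0       # hit-or-miss      -> mode of Gamma(a, scale) = (a - 1) * scale
print(mean, median, mode)      # 2.0, about 1.68, 1.0 -- all different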
Estimator for absolute error cost function: minimizing E[|θ − θ̂|] leads to θ̂ equal to the median of the posterior pdf p(θ|x).
Estimator for 0-1 (hit-or-miss) cost function: as the width of the "hit" region shrinks, minimizing the Bayes risk leads to θ̂ equal to the mode of the posterior pdf, i.e. the MAP estimator.
Vector MMSE estimator

For a vector parameter θ, the MMSE estimator is θ̂ = E(θ|x); each component θ̂_i = E(θ_i|x) is the mean of its posterior, and the Bayesian MSE of each component is minimized.

Example:
Vector MMSE estimator
Maximum a posteriori estimator
Example:

Determine the MAP estimator of A


Recall:
Maximum a posteriori estimator

The MAP estimator is obtained by maximizing the posterior PDF, i.e.

Â_MAP = arg max over A of p(A|x) = arg max over A of p(x|A) p(A),

since the denominator of Bayes' rule does not depend on A.

Advantage of the MAP estimator: it is easy to determine since it does not require integration; only a maximization of p(x|A) p(A) is needed.
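A minimal sketch of computing a MAP estimate numerically (the Gaussian prior and all parameter values here are assumptions used only for illustration): maximize log[p(x|A) p(A)], with no integration needed.

import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)
N, sigma2 = 10, 1.0                               # data length and noise variance (assumed)
mu_A, sigma2_A = 0.0, 2.0                         # assumed Gaussian prior on A
x = 1.0 + rng.normal(0.0, np.sqrt(sigma2), N)     # data with true A = 1.0

def neg_log_posterior(A):
    # -log[p(x|A) p(A)] up to an additive constant
    return (np.sum((x - A) ** 2) / (2 * sigma2)
            + (A - mu_A) ** 2 / (2 * sigma2_A))

A_map = minimize_scalar(neg_log_posterior).x      # maximize the posterior by minimizing its negative log
print(A_map)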
Vector MAP estimator: θ̂_MAP = arg max over θ of p(θ|x), maximizing the posterior jointly over all components of θ.
Maximum a posteriori estimator

Example:

Find the MAP estimator for

Solution:

Recall:

Putting the derivative of the log posterior equal to zero gives the MAP estimator.
Maximum a posteriori estimator
Property of the MAP estimator: it does not commute over nonlinear transformations, although it does so for linear transformations. Suppose that we wish to estimate α = g(θ) for a nonlinear g. Is the MAP estimator of α given by α̂ = g(θ̂), where θ̂ is the MAP estimator of θ? No!
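A small numerical illustration (the Gaussian posterior and the transformation α = exp(θ) are assumptions used only for this sketch): the mode of the transformed posterior differs from the transform of the mode.

import numpy as np

# Assume a Gaussian posterior p(theta|x) with these made-up parameters
mu, sigma = 1.0, 0.8

theta_grid = np.linspace(mu - 6 * sigma, mu + 6 * sigma, 20001)
p_theta = np.exp(-(theta_grid - mu) ** 2 / (2 * sigma ** 2))

# Transform alpha = exp(theta); the pdf of alpha picks up a Jacobian factor 1/alpha
alpha_grid = np.exp(theta_grid)
p_alpha = p_theta / alpha_grid

theta_map = theta_grid[np.argmax(p_theta)]      # mode of p(theta|x), equal to mu
alpha_map = alpha_grid[np.argmax(p_alpha)]      # mode of p(alpha|x), equal to exp(mu - sigma^2)
print(np.exp(theta_map), alpha_map)             # g(theta_map) != alpha_map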
Linear Bayesian estimator
The optimal Bayesian estimators are often difficult to obtain in closed form and can be too computationally intensive in practice.
Under the jointly Gaussian assumption these estimators are easily found, but in general they are not.
When we are unable to make the Gaussian assumption, another approach must be used.
One approach is to retain the MMSE criterion but constrain the estimator to be linear.
This class of estimators, generally called Wiener filters in signal processing, is used extensively.
LMMSE estimator

Notation:

Assumption: a scalar parameter θ is to be estimated from the data set x = [x[0], x[1], ..., x[N-1]]^T; θ is modelled as a random variable.

Flexibility of the LMMSE approach: it does not assume any specific form for the joint pdf p(x, θ); only the first two moments are required.

Definition: the LMMSE estimator is the linear (affine) estimator θ̂ = Σ_{n=0}^{N-1} a_n x[n] + a_N that minimizes the Bayesian MSE E[(θ − θ̂)²].
LMMSE estimator

Deriving the optimal weighting coefficients: substitute θ̂ = Σ a_n x[n] + a_N into E[(θ − θ̂)²] and set the derivatives with respect to the coefficients to zero. This gives

a_N = E(θ) − Σ_{n=0}^{N-1} a_n E(x[n])   and   a = C_xx^(-1) C_xθ,

so that the LMMSE estimator is

θ̂ = E(θ) + C_θx C_xx^(-1) (x − E(x)),

where C_xx is the covariance matrix of x and C_θx = C_xθ^T is the cross-covariance of θ and x.
Example:
LMMSE estimator

Using Woodbury's identity (see Example 10.2 in the textbook):
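A minimal numpy sketch of the LMMSE formula θ̂ = E(θ) + C_θx C_xx^(-1) (x − E(x)) applied to the DC-level model (the zero-mean prior, the variances, and the data model below are assumptions for illustration), together with the scalar form obtained after applying Woodbury's identity:

import numpy as np

rng = np.random.default_rng(5)
N, sigma2, sigma2_A = 10, 1.0, 2.0                 # assumed noise and prior variances
A = rng.normal(0.0, np.sqrt(sigma2_A))             # zero-mean prior on A
x = A + rng.normal(0.0, np.sqrt(sigma2), N)        # x[n] = A + w[n]

# Second-order moments of the assumed model (E[A] = 0, E[x] = 0)
C_xx = sigma2_A * np.ones((N, N)) + sigma2 * np.eye(N)   # Cov(x)
C_Ax = sigma2_A * np.ones(N)                             # Cov(A, x)

A_lmmse = C_Ax @ np.linalg.solve(C_xx, x)                  # A_hat = C_Ax C_xx^{-1} x
A_scalar = sigma2_A / (sigma2_A + sigma2 / N) * x.mean()   # same estimate after Woodbury's identity
print(A, A_lmmse, A_scalar)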
Vector LMMSE estimator

Applying the scalar result to each component of a vector parameter θ gives θ̂ = E(θ) + C_θx C_xx^(-1) (x − E(x)), which minimizes the Bayesian MSE of each component.
LMMSE estimator properties
Signal processing example
Recall some definitions:
If a discrete random process x[n] is wide sense stationary (WSS), then it has a mean

E(x[n]) = μ,

which does not depend on n, and an autocorrelation function (ACF)

r_xx[k] = E(x[n] x[n + k]),

which depends only on the lag k between the two samples and not on their absolute positions.

Toeplitz matrix: a square Toeplitz matrix A is defined by [A]_ij = a_{i−j}, i.e. every element depends only on the difference of its indices, so each descending diagonal is constant.


Signal processing example

If, in addition, a_{−k} = a_k, then A is symmetric Toeplitz. The autocorrelation matrix of a WSS process, [R_xx]_ij = r_xx[i − j], is symmetric Toeplitz since r_xx[−k] = r_xx[k].


Signal processing example

Smoothing: given the noisy data x[n] = s[n] + w[n], n = 0, ..., N−1, where the signal s[n] and the noise w[n] are zero-mean, uncorrelated WSS processes, estimate the signal samples from all of the data.

Recall: the LMMSE estimator is of the form θ̂ = E(θ) + C_θx C_xx^(-1) (x − E(x)). With E(s) = 0 and E(x) = 0 this gives the Wiener smoother

ŝ = C_ss (C_ss + C_ww)^(-1) x,

where C_ss and C_ww are the symmetric Toeplitz covariance matrices formed from the signal and noise ACFs.
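A minimal sketch of the Wiener smoother under assumed signal and noise models (the exponential signal ACF r_ss[k] = σ_s² ρ^|k| and the white noise are assumptions used only for this illustration):

import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(6)
N = 200
sigma2_s, rho = 1.0, 0.95                 # assumed signal power and ACF decay
sigma2_w = 0.5                            # assumed noise variance

# Symmetric Toeplitz covariance matrices built from the ACFs
r_ss = sigma2_s * rho ** np.arange(N)     # r_ss[k] = sigma_s^2 * rho^|k|
C_ss = toeplitz(r_ss)
C_ww = sigma2_w * np.eye(N)

# One realization of x[n] = s[n] + w[n]
s = rng.multivariate_normal(np.zeros(N), C_ss)
x = s + rng.normal(0.0, np.sqrt(sigma2_w), N)

# Wiener smoother: s_hat = C_ss (C_ss + C_ww)^{-1} x
s_hat = C_ss @ np.linalg.solve(C_ss + C_ww, x)
print("raw MSE     :", np.mean((x - s) ** 2))
print("smoothed MSE:", np.mean((s_hat - s) ** 2))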
