Linear Algebra Part 2
Engineering
Course Introduction
Course Content
1. Vector spaces, subspaces, bases and dimensions, linear dependence and independence, vector products, orthogonal bases and orthogonal projections.
2. Linear operators and matrices: eigenvalues and eigenvectors, characteristic polynomial, diagonalization, Hermitian and unitary matrices, singular value decomposition.
3. Discrete and continuous random variables: distribution and density functions, conditional distributions and expectations, functions of random variables, moments, sequences of random variables.
4. Random processes: probabilistic structure; mean, autocorrelation and autocovariance functions; strict-sense and wide-sense stationary processes; power spectral density; LTI systems with a WSS process as the input; examples of random processes - white noise, Gaussian, Poisson and Markov processes.
Marks Distribution
• Assignments + Quiz: 10% + 15%
• Mid-term Exam: 35%
• End-Term Exam: 40%
Name of Books / Authors / Publishers (Year)
1. S. Axler, "Linear Algebra Done Right", 3rd Edn., Springer International Publishing, 2015.
2. G. Strang, "Linear Algebra and Its Applications", 4th Edn., Cengage Learning, 2007.
3. K. M. Hoffman and R. Kunze, "Linear Algebra", 2nd Edn., Prentice Hall India, 2015.
4. A. Papoulis and S. Pillai, "Probability, Random Variables and Stochastic Processes", 4th Edn., McGraw Hill, 2017.
5. H. Stark and J. W. Woods, "Probability and Random Processes with Applications to Signal Processing", 3rd Edn., Pearson India, 2001.
Eigenvalues and Eigenvectors
Usefulness of Eigenvalues
➢ In Matrix form:
➢ Note the following:
➢ First order equation
➢ Linear in unknowns
➢ Constant coefficients
➢ A is independent of time
➢ In Matrix form
➢ We get an eigenvalue problem
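As a minimal sketch of how the eigenvalue problem arises (assuming the system is written as du/dt = Au with a constant matrix A, which is what the bullets above describe), substitute a pure exponential solution:

```latex
% Sketch (assumed form of the system): from du/dt = Au to Ax = \lambda x.
\[
\frac{d\mathbf{u}}{dt} = A\mathbf{u},
\qquad
\text{try } \mathbf{u}(t) = e^{\lambda t}\mathbf{x}
\;\Longrightarrow\;
\lambda e^{\lambda t}\mathbf{x} = e^{\lambda t}A\mathbf{x}
\;\Longrightarrow\;
A\mathbf{x} = \lambda\mathbf{x}.
\]
```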
The solution of Ax = λx
➢ The number λ is an eigenvalue of A if and only if A − λI is singular.
Characteristic Polynomial
➢ The eigenvectors are the nonzero solutions of (A − λI)x = 0.
Summary of steps involved
➢ We can choose the free parameters in each eigenvector arbitrarily (any nonzero scaling gives the same eigenvector direction), as in the sketch below.
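A minimal numerical sketch of these steps: form the characteristic polynomial det(A − λI) = 0, solve for the eigenvalues, then solve (A − λI)x = 0 for each eigenvector. The matrix A here is an assumed example, not one from the slides.

```python
import numpy as np

# Assumed 2x2 example matrix (not from the slides).
A = np.array([[4.0, -5.0],
              [2.0, -3.0]])

# Step 1: characteristic polynomial det(A - lambda*I) = 0.
# For a 2x2 matrix it is lambda^2 - trace(A)*lambda + det(A) = 0.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
eigenvalues = np.roots(coeffs)
print("eigenvalues:", eigenvalues)          # expect 2 and -1 for this A

# Step 2: for each eigenvalue, find a nonzero solution of (A - lambda*I)x = 0.
for lam in eigenvalues:
    M = A - lam * np.eye(2)
    _, _, Vt = np.linalg.svd(M)
    x = Vt[-1]                               # direction spanning the null space of M
    print("lambda =", lam, " eigenvector (up to scaling):", x)

# Cross-check against the library routine.
vals, vecs = np.linalg.eig(A)
print("np.linalg.eig:", vals)
```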
Example
Eigenvalues and A
Diagonalization of a Matrix
Proof
➢ Put the eigenvectors x₁, …, xₙ in the columns of S and multiply: A acts on each column, giving Axᵢ = λᵢxᵢ.
➢ Split the product AS into S times the diagonal matrix Λ of eigenvalues.
➢ It is essential to keep this split in the correct order: SΛ, not ΛS.
AS = SΛ
S⁻¹AS = Λ, equivalently A = SΛS⁻¹
Remarks on Independence
1. If the matrix A has no repeated eigenvalues (the numbers λ₁, …, λₙ are all distinct), then its n eigenvectors are automatically independent. Therefore any matrix with distinct eigenvalues can be diagonalized.
2. The diagonalizing matrix S is not unique. An eigenvector x can be multiplied by any nonzero constant and remain an eigenvector, so we can multiply the columns of S by any nonzero constants and produce a new diagonalizing S.
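A small numerical sketch of both remarks, using an assumed matrix with distinct eigenvalues (so its eigenvector matrix S is invertible): it checks S⁻¹AS = Λ and shows that rescaling the columns of S still diagonalizes A.

```python
import numpy as np

# Assumed example with distinct eigenvalues (not from the slides).
A = np.array([[4.0, -5.0],
              [2.0, -3.0]])

vals, S = np.linalg.eig(A)       # columns of S are eigenvectors of A
Lam = np.diag(vals)              # Lambda: eigenvalues on the diagonal

# Remark 1: distinct eigenvalues -> independent eigenvectors -> S is invertible.
print(np.allclose(np.linalg.inv(S) @ A @ S, Lam))    # True

# Remark 2: S is not unique; scale each column by any nonzero constant.
S2 = S @ np.diag([3.0, -0.5])
print(np.allclose(np.linalg.inv(S2) @ A @ S2, Lam))  # still True
```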
Trivial Example:
4. Not all matrices possess n linearly independent
eigenvectors, so not all matrices are diagonalizable.
Positive Definiteness of a matrix
1. Consider the point x = 0, y = 0.
2. The first derivatives must vanish for a maximum or minimum.
3. The second derivatives identify whether it is a maximum or a minimum.
4. Higher-order terms do not decide between a maximum and a minimum, but they can prevent the solution from being a global minimum.
➢ Every quadratic form has a stationary point at (0,0).
➢ The third derivative comes into the picture when the second derivative fails to give a definite decision.
➢ This happens when the quadratic part is singular.
➢ When f(x,y) is strictly positive at all points except the stationary point (the bowl opens upward), it is called positive definite.
NO!
Singular Case: ac = b²
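A quick worked illustration of this borderline case, with example values chosen here (not taken from the slides):

```latex
% Singular case: f(x,y) = ax^2 + 2bxy + cy^2 with a = c = 1, b = 1, so ac = b^2.
\[
f(x,y) = x^2 + 2xy + y^2 = (x+y)^2 \;\ge\; 0,
\]
% f vanishes along the whole line y = -x, so the form is only positive
% semidefinite: a flat valley rather than a strict minimum at (0,0).
```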
The two-variable case can be extended to the multivariable case.
➢ There are only second-order terms.
➢ Its first derivatives are zero.
➢ The tangent is flat: it is a stationary point.
➢ We need to decide whether x = 0 is a minimum, a maximum, or a saddle point of the function f = xᵀAx.
Examples
Saddle Point
Saddle Point
Test for Positive Definiteness
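As a sketch of the standard tests (stated here as background, since the slide's own statement is not reproduced above): a symmetric matrix is positive definite exactly when all its eigenvalues are positive, equivalently when all its leading principal minors are positive (Sylvester's criterion), equivalently when a Cholesky factorization exists. A minimal numerical check of all three, with assumed example matrices:

```python
import numpy as np

def is_positive_definite(A, tol=1e-12):
    """Check positive definiteness of a symmetric matrix in three equivalent ways."""
    A = np.asarray(A, dtype=float)

    # 1) All eigenvalues positive (symmetric input -> eigvalsh).
    eig_ok = bool(np.all(np.linalg.eigvalsh(A) > tol))

    # 2) Sylvester's criterion: all leading principal minors positive.
    minors_ok = all(np.linalg.det(A[:k, :k]) > tol for k in range(1, A.shape[0] + 1))

    # 3) A Cholesky factorization exists only for positive definite matrices.
    try:
        np.linalg.cholesky(A)
        chol_ok = True
    except np.linalg.LinAlgError:
        chol_ok = False

    return eig_ok, minors_ok, chol_ok

# Assumed examples (not from the slides):
print(is_positive_definite([[2, 1], [1, 2]]))   # (True, True, True): positive definite
print(is_positive_definite([[1, 2], [2, 1]]))   # (False, False, False): indefinite (saddle)
```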
Singular Value Decomposition
➢ The columns of U (m by m) are eigenvectors of AAᵀ, and the columns of V (n by n) are eigenvectors of AᵀA. The r singular values on the diagonal of Σ (m by n) are the square roots of the nonzero eigenvalues of both AAᵀ and AᵀA.
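A small numerical check of these statements; the m-by-n matrix below is an assumed example, not one from the slides.

```python
import numpy as np

# Assumed m-by-n example (m = 2, n = 3), not from the slides.
A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

U, sigma, Vt = np.linalg.svd(A)          # A = U @ Sigma @ V^T
print("singular values:", sigma)          # expect 5 and 3 for this A

# Squares of the singular values = nonzero eigenvalues of A A^T (and of A^T A).
print(np.linalg.eigvalsh(A @ A.T)[::-1])  # matches sigma**2
print(np.linalg.eigvalsh(A.T @ A)[::-1])  # same nonzero values, plus a zero

# Rebuild the m-by-n Sigma and verify the factorization.
Sigma = np.zeros(A.shape)
Sigma[:len(sigma), :len(sigma)] = np.diag(sigma)
print(np.allclose(A, U @ Sigma @ Vt))     # True
```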
Example:
Applications of SVD
➢ Polar Decomposition
➢ Every non-zero complex number z = re^{iθ} can be identified with a 1 by 1 matrix.
➢ r corresponds to a positive definite matrix.
➢ e^{iθ} corresponds to an orthogonal matrix.
➢ e^{iθ} forms a unitary matrix. How?
➢ Q = UVᵀ is an orthogonal matrix. How?
➢ A = UΣVᵀ = (UVᵀ)(VΣVᵀ) = QS, with S = VΣVᵀ.
➢ In the complex case, S becomes Hermitian instead of symmetric and Q becomes unitary instead of orthogonal.
➢ In the invertible case, Σ and S are both positive definite.
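A minimal sketch of computing this polar decomposition from the SVD, with Q = UVᵀ and S = VΣVᵀ as above; the invertible matrix is an assumed example.

```python
import numpy as np

# Assumed invertible 2x2 example (not from the slides).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

U, sigma, Vt = np.linalg.svd(A)

Q = U @ Vt                       # orthogonal factor, Q = U V^T
S = Vt.T @ np.diag(sigma) @ Vt   # symmetric factor, S = V Sigma V^T

print(np.allclose(A, Q @ S))                 # A = Q S
print(np.allclose(Q.T @ Q, np.eye(2)))       # Q is orthogonal
print(bool(np.all(np.linalg.eigvalsh(S) > 0)))  # S is positive definite (A invertible)
```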
Recap (Least Squares Problem)
• Projecting b onto a subspace
Refer to Slide 64.
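As a quick reminder of the mechanics (a small assumed example, not the one on Slide 64): projecting b onto the column space of A means solving the normal equations AᵀA x̂ = Aᵀb, and p = A x̂ is the projection.

```python
import numpy as np

# Assumed overdetermined example (not from the slides): fit b by the columns of A.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equations: A^T A x_hat = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

p = A @ x_hat          # projection of b onto the column space of A
e = b - p              # error vector, orthogonal to the columns of A

print("x_hat:", x_hat)
print("projection p:", p)
print(np.allclose(A.T @ e, 0))                                    # orthogonality check
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))   # matches lstsq
```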
Practice Example