
Vietnam National University Ho Chi Minh City

Ho Chi Minh City University of Technology

LINEAR ALGEBRA
ASSIGNMENT REPORT
Lecturer: Hoàng Hải Hà
Class: CC10 – Group 5
MATLAB Project – Linear Algebra

MEMBERS LIST

1. Nguyễn Lê Cao Tướng (ID 2353300): MATLAB code for Tasks 1-5
2. Châu Hùng Hào (ID 2352291): theoretical background and conclusion for the project
3. Lưu Đại Hưng (ID 2352430): report writing and research on the related theory
4. Lê Mạnh Hưng (ID 2352428): theoretical background and research on the related theory
5. Trần Ngọc Bảo (ID 2352109): MATLAB code for Tasks 6-10
6. Bùi Quang Minh (ID 2352732): MATLAB code for Tasks 10-14


Contents
I. Theoretical Background
  1) Eigenvalues and eigenvectors of a matrix
     a) Basic theory
     b) Steps to find eigenvalues and eigenvectors of a matrix
  2) Orthogonal diagonalization
     a) Basic theory
     b) Orthogonal diagonalization of a symmetric matrix A
  3) SVD (Singular Value Decomposition)
     a) Basic theory
II. MATLAB Codes
  1) Keywords used in code
  2) Questions & Tasks
     a) Project Questions
     b) Tasks
III. Conclusion
IV. References


I. Theoretical Background:
1) Eigenvalues and eigenvectors of a matrix
a) Basic theory
Definition: Let A ∈ Mn(K). A number λ0 ∈ K is called an eigenvalue of the matrix A if there exists a nonzero vector X0 ≠ 0 such that AX0 = λ0X0. The vector X0 is then called an eigenvector of A corresponding to the eigenvalue λ0.
• Property 1.
Each eigenvector corresponds to exactly one eigenvalue. Indeed, suppose the square matrix A has an eigenvector x for two eigenvalues λ1, λ2; then Ax = λ1x = λ2x ⟺ (λ1 − λ2)x = 0 ⟺ λ1 = λ2, since x ≠ 0.
• Property 2.
If x is an eigenvector corresponding to the eigenvalue λ of the square matrix A, then kx (k ≠ 0) is also an eigenvector for λ: Ax = λx ⟹ A(kx) = λ(kx).
• Property 3
If λ is an eigenvalue of the square matrix A, then λ^n is an eigenvalue of the matrix A^n.
• Property 4
The eigenvalues of the square matrix A are the solutions of the equation det(A − λI) = 0.
Indeed, λ is an eigenvalue of A exactly when there exists x ≠ 0 with Ax = λx ⟺ (A − λI)x = 0. This homogeneous system of linear equations has a solution x ≠ 0 if and only if det(A − λI) = 0.
• Property 5
If the square matrix A has eigenvalue λ, then the set of eigenvectors corresponding to λ consists of the nonzero solutions of (A − λI)x = 0.
b) Steps to find eigenvalues and eigenvectors of a matrix.
❖ Step 1: Find the eigenvalues.
+ Set up the equation det(A − λI) = 0.
+ Compute the determinant and solve the equation.
+ The solutions of the equation are exactly the eigenvalues of A.
❖ Step 2: Find the eigenvectors.
+ For the eigenvalue λ1, solve the system of equations (A − λ1I)x = 0.


+ The nonzero solutions of this system are exactly the eigenvectors of A corresponding to the eigenvalue λ1.
+ Similarly, find the eigenvectors of A corresponding to the remaining eigenvalues.
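As a sanity check on these steps, here is a small Python sketch (an illustration added for this report, not part of the original lab): for a 2×2 matrix, det(A − λI) = 0 reduces to λ² − tr(A)λ + det(A) = 0, which can be solved directly.

```python
import math

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the characteristic polynomial
    det(A - lambda*I) = lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det          # discriminant; real roots if >= 0
    if disc < 0:
        raise ValueError("complex eigenvalues")
    r = math.sqrt(disc)
    return (tr + r) / 2, (tr - r) / 2

# The symmetric matrix from Example 3 below: A = [[2, -4], [-4, 17]]
print(eig2x2(2, -4, -4, 17))  # -> (18.0, 1.0)
```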

2) Orthogonal diagonalization
a) Basic theory
Definition 1: A matrix A ∈ Mn(R) is called a real symmetric matrix if A^T = A.
Example 1: Given the matrix

A = ( 1 2 3
      2 5 7
      3 7 4 )

Check that A^T = A. So A is a symmetric matrix: the entries of A are symmetric to each other across the main diagonal.
Definition 2: A matrix A ∈ Mn(R) is called an orthogonal matrix if A^(-1) = A^T.

- From the definition we have A·A^(-1) = A·A^T, so A·A^T = I. Thus, if the product of A and A^T is the identity matrix I, then A is an orthogonal matrix.
Proposition 1: Matrix A is orthogonal if and only if the column vector set (or row vector set)
of A is orthonormal.
Proof:
- Let A be an orthogonal matrix, i.e., A·A^T = I. Considering the product of the two matrices, row i of A multiplied by column j of A^T gives the entry in row i and column j of the identity matrix; and column j of A^T is row j of A.

We have A_i* · A_j* = 1 if i = j, and 0 if i ≠ j.

- It follows that the row vector set of A is orthonormal.
- Completely similarly, considering A^T·A = I shows that the column vector set of A is orthonormal.
- This proposition can be used to construct an arbitrary orthogonal matrix A of order n as follows:
a/ In R^n, choose a basis E.


b/ Apply the Gram-Schmidt process (if necessary) to orthogonalize E, obtaining an orthogonal basis F.
c/ Divide each vector in F by its length to get an orthonormal basis Q. Form the matrix A whose row vector set (or column vector set) is Q.
- Then A is an orthogonal matrix.
Example 2: In R3, choose basis E = {(1;1;1), (1;2;1), (1;1;2)} Using the Gram - Schmidt
orthogonalization process, we get the orthogonal set
F ={ (1;1;1), (1;-2;1), (-1;0;1) }
- Dividing each vector by its length, we get the orthonormal set

Q = { (1/√3)(1; 1; 1), (1/√6)(1; −2; 1), (1/√2)(−1; 0; 1) }

- Set up an orthogonal matrix whose column vector set (or row vector set) is Q:

A = ( 1/√3   1/√6  −1/√2
      1/√3  −2/√6    0
      1/√3   1/√6   1/√2 )
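The orthonormalization in Example 2 can be reproduced with a short Python sketch (an illustration added for this report, not part of the original lab). Note that Gram-Schmidt fixes the vectors only up to sign and scaling, so individual vectors may differ from the report's by a factor of −1.

```python
import math

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthogonalize the input vectors,
    then divide each by its length to get an orthonormal set."""
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    ortho = []
    for v in vectors:
        w = list(v)
        for u in ortho:
            coef = dot(v, u) / dot(u, u)              # projection coefficient
            w = [wi - coef * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    # normalize each orthogonal vector to unit length
    return [[x / math.sqrt(dot(w, w)) for x in w] for w in ortho]

E = [(1, 1, 1), (1, 2, 1), (1, 1, 2)]     # the basis E from Example 2
Q = gram_schmidt(E)
print(Q)   # up to sign, these are the vectors of the orthonormal set Q
```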

Definition 3: A real, square matrix A is called orthogonal diagonalizable if A = PDP-1 =PDPT,


where D is a diagonal matrix and P is an orthogonal matrix.
Theorem 1: Let A be a real symmetric matrix.
- The following statements are true.
1/ Eigenvalues of A are real numbers.
2/ A can always be orthogonally diagonalized.
3/ Two eigenvectors corresponding to different eigenvalues are perpendicular to each other.
Proposition 2: If a matrix A is orthogonally diagonalizable, then A is a symmetric matrix.
Proof:
- Suppose A can be orthogonally diagonalized. Then A = PDP^T.
- It follows that A^T = (PDP^T)^T = (P^T)^T D^T P^T = PDP^T = A, so A is a symmetric matrix.
- Thus, only real symmetric matrices can be orthogonally diagonalized.


b) Orthogonal diagonalization of symmetric matrix A


Step 1: Find the eigenvalues of A.
Step 2: Find an orthonormal basis of each eigenspace.
- To find an orthonormal basis of the eigenspace Eλk, follow these steps:
a/ Choose an arbitrary basis Ek of Eλk.
b/ Use the Gram-Schmidt process (if necessary) to find an orthogonal basis Fk.
c/ Dividing each vector in Fk by its length, we get an orthonormal basis Qk of Eλk.
Step 3: Conclusion. The matrix A is always orthogonally diagonalizable; that is, A = PDP^T, where the diagonal matrix D has the eigenvalues of A on its diagonal, and the columns of the orthogonal matrix P are the eigenvectors from the orthonormal bases found in Step 2.
Example 3: Orthogonal diagonalization of a real symmetric matrix:
A = (  2  −4
      −4  17 )
Step 1: Find the eigenvalues of A

A has two eigenvalues: λ1 = 1, λ2 = 18


Step 2: Find the orthonormal basis of the individual subspaces

• For λ1 = 1:

- Solve (A − λ1I)X = 0 ⇔ X = (4α; α)^T. A basis of Eλ₁ is (4; 1)^T.
- The orthonormal basis of Eλ₁ is (1/√17)(4; 1)^T.

• For λ2 = 18:

- Solve (A − λ2I)X = 0 ⇔ X = (α; −4α)^T. A basis of Eλ₂ is (1; −4)^T.
- The orthonormal basis of Eλ₂ is (1/√17)(1; −4)^T.

Step 3: Conclusion: The matrix A is orthogonally diagonalizable, A = PDP^T, in which:

D = ( 18  0
       0  1 )


P = (1/√17) (  1  4
              −4  1 )
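We can verify this factorization numerically. The following Python sketch (an illustration added for this report, not part of the original lab) multiplies out P·D·P^T for the matrices above and recovers A.

```python
import math

def matmul(X, Y):
    """Multiply two small dense matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

s = 1 / math.sqrt(17)
P = [[ 1 * s, 4 * s],
     [-4 * s, 1 * s]]        # columns: eigenvectors for lambda = 18 and 1
D = [[18, 0],
     [0, 1]]
Pt = [[P[j][i] for j in range(2)] for i in range(2)]   # transpose of P

A = matmul(matmul(P, D), Pt)
print([[round(x, 10) for x in row] for row in A])  # -> [[2.0, -4.0], [-4.0, 17.0]]
```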

3) SVD (Singular Value Decomposition):


a) Basic theory:
Let A be a matrix of size m×n. We first show that AA^T and A^TA share the same nonzero eigenvalues.

Assume that λ0 and X0 are an eigenvalue and a corresponding eigenvector of AA^T. Then:

AA^T X0 = λ0 X0
⟹ A^T AA^T X0 = λ0 A^T X0

Rewriting this as A^TA (A^T X0) = λ0 (A^T X0), we see that λ0 is also an eigenvalue of A^TA, with eigenvector A^T X0 (provided A^T X0 ≠ 0).

The singular value decomposition is a factorization of the form

A = U Σ V^T

where:

- U (m×m) and V (n×n) are orthogonal matrices;

- Σ (m×n) is a rectangular diagonal matrix with non-negative real numbers σ1, σ2, σ3, ..., σr on its diagonal, called the singular values of A; the number of nonzero singular values equals the rank of A.

If A is real, then U and V are also real.


Geometrically, the SVD applies three successive transformations:

- First, V^T rotates the right-singular vectors onto the standard basis.
- Next, Σ adds or removes dimensions and scales each axis by the corresponding singular value.
- Finally, U rotates the standard basis onto the left-singular vectors.


Now let us have a look at the two products AA^T and A^TA. We have:

AA^T = (UΣV^T)(UΣV^T)^T = UΣV^T VΣ^T U^T = UΣΣ^T U^T

and

A^TA = (UΣV^T)^T (UΣV^T) = VΣ^T U^T UΣV^T = VΣ^TΣ V^T

As U and V are orthogonal matrices, V^TV = U^TU = I.

It is clear that ΣΣ^T and Σ^TΣ are diagonal matrices containing the values σ1², σ2², σ3², ..., σr² on the diagonal; these values are also the eigenvalues of AA^T and A^TA, respectively. The columns of U are eigenvectors of AA^T, and each column of U is a left-singular vector of A. Similarly, each column of V is called a right-singular vector of A.
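This relationship is easy to check numerically. The NumPy sketch below (an illustration added for this report, not part of the original lab) uses the matrix from Task 2 and confirms that the squared singular values coincide with the eigenvalues of A^TA.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [-1.0, 1.0]])           # the matrix used in Task 2

U, s, Vt = np.linalg.svd(A)           # A = U @ diag(s) @ Vt

# Eigenvalues of A^T A (symmetric, so eigvalsh applies), sorted descending
# to match the descending order of the singular values returned by svd.
eigs = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

print(np.allclose(s**2, eigs))               # squared singular values = eigenvalues
print(np.allclose(U @ np.diag(s) @ Vt, A))   # the decomposition reconstructs A
```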

The formula A = UΣV^T can also be rewritten as a sum of rank-1 matrices:

A = σ1 u1 v1^T + σ2 u2 v2^T + ... + σr ur vr^T

Or for short:

A = Σ_{i=1}^{r} σi ui vi^T

Note that here σi is a scalar, while ui and vi^T are vectors.


Since σ1, σ2, σ3, ..., σr are non-negative real numbers in descending order, the first few values are typically large, whereas the values at the end are very small and almost equal to 0. Therefore, we can drop the last terms of the sum, which is denoted:

A ≈ Ak = Σ_{i=1}^{k} σi ui vi^T

Ak clearly has rank at most k.

If we keep only:

- the first k right-singular vectors, setting Vk^T equal to the first k rows of V^T (a k×n matrix);

- the first k left-singular vectors, setting Uk equal to the first k columns of U (an m×k matrix);

- the first k singular values (from the top), setting Σk equal to the first k rows and columns of Σ (a k×k matrix),

then we obtain the rank-k approximation

Ak = Uk Σk Vk^T

As k increases, the matrix Ak becomes closer and closer to the original matrix A.
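The rank-k truncation can be sketched in NumPy as follows (an illustration added for this report, not part of the original lab; the 8×6 test matrix is arbitrary). The approximation error shrinks as k grows and vanishes at full rank.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))       # an arbitrary 8x6 test matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

def rank_k(k):
    """Rank-k approximation A_k = U_k Sigma_k V_k^T."""
    return U[:, :k] * s[:k] @ Vt[:k, :]   # broadcasting scales column i of U by s[i]

errors = [np.linalg.norm(A - rank_k(k)) for k in range(1, 7)]
# The approximation error is non-increasing as k grows ...
assert all(e1 >= e2 for e1, e2 in zip(errors, errors[1:]))
# ... and at k = rank(A) the approximation is (numerically) exact.
assert np.allclose(rank_k(6), A)
print(errors)
```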


II. MATLAB Codes


1) Keywords used in code:
1. linspace() - generate linearly spaced vectors
2. subplot() - create multiple plots in a single figure window
3. hold on - retain the current plot and certain properties so that subsequent graphing commands add to the existing graph rather than replacing it
4. plot() - create 2D line plots
5. quiver() - draw arrows (vectors); additional arguments control the appearance of the arrows
6. axis equal - set the aspect ratio of the plot axes so that the data units are the same in every direction
7. title() - add a title to a plot or a subplot
8. hold off - turn off the hold state previously set by the 'hold on' command
9. disp() - display the value of a variable or a string
10. svd() - decompose a matrix into three other matrices
11. imread() - read an image from a file
12. figure - create a new figure window or make an existing figure window active
13. imshow() - display an image in a figure window
14. size() - determine the dimensions of an array or matrix
15. double() - convert data to double-precision floating-point numbers
16. clc - clear the Command Window
17. clearvars - clear all variables from the workspace
18. close all - close all figure windows
19. pause - temporarily halt the execution of a script or function
20. all() - check whether all elements in a logical array are true
21. abs() - compute the absolute value of each element in an array
22. isequal() - compare two variables for equality
23. round() - round the elements of an array to the nearest integer
24. uint8() - convert to the data type representing unsigned 8-bit integers
25. numel() - return the number of elements in an array
26. diag() - extract the diagonal elements of a matrix, or construct a diagonal matrix from a vector of elements
27. fprintf() - write formatted data to a file or to the Command Window
28. rand() - generate uniformly distributed random numbers in the interval [0,1]
29. ones() - create an array filled with ones


2) Questions & Tasks:


a) Project Questions: In singular value decomposition (SVD):

Q1: Does multiplication by the transpose of the matrix V result in a reflection of the plane?
Q2: Does multiplication by the matrix U produce a reflection of the plane?
Reply:
1. Multiplying by the transpose of the matrix V does not necessarily produce a reflection of the plane. In SVD, the matrix V (or its transpose) represents a rotation of the input space of the original matrix.
2. Similarly, multiplying by the matrix U does not necessarily produce a reflection of the plane. In SVD, the matrix U represents a rotation of the output space of the original matrix.
However, both U and V can include a reflection if they have determinant equal to -1. For an orthogonal matrix, the sign of the determinant indicates whether the transformation includes a reflection: if the determinant is 1, there is no reflection; if it is -1, there is a reflection. So to know for sure, you need to check the determinants of the matrices.
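A quick numerical check of this answer (an illustration added for this report, not part of the original lab): for the matrix A from Task 2, the determinants of U and V reveal whether either factor includes a reflection.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [-1.0, 1.0]])
U, s, Vt = np.linalg.svd(A)
V = Vt.T

# Orthogonal matrices always have determinant +1 (pure rotation)
# or -1 (rotation combined with a reflection).
for name, M in (("U", U), ("V", V)):
    d = np.linalg.det(M)
    assert np.isclose(abs(d), 1.0)
    print(name, "det =", round(d, 6), "->", "reflection" if d < 0 else "rotation only")
```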
b) Tasks:

➢ Task 1:
We will start with an illustration of the geometric meaning of singular value
decomposition. Let us look at a singular value decomposition of a 2 × 2 matrix. Open
the file lab16.m, locate the code cell %% 2x2 matrix, and add the following commands
%% 2x2 matrix
clear;
t=linspace(0,2*pi,100);
X=[cos(t);sin(t)];
subplot(2,2,1);
hold on;
plot(X(1,:),X(2,:),'b');
quiver(0,0,1,0,0,'r');
quiver(0,0,0,1,0,'g');
axis equal
title('Unit circle')
hold off;
This code will create a 2 × 100 matrix X = [x1 x2 · · · x100] whose columns xi are unit
vectors pointing in various directions. The plot will show a blue circle corresponding to
the endpoints of these vectors and two vectors of the standard basis on the plane e1 =
(1, 0)T and e2 = (0, 1)T. We use the function subplot which will create a plot containing
four subplots arranged in two rows and two columns. The plot above will occupy the
first “cell” of this plot. Observe the command quiver which draws a vector with the
beginning point given by the first two arguments ((0,0) in this case) and an ending
point given by the next two arguments ((1,0) or (0,1) in the code above).
Here is an example of our MATLAB code:


Result:

➢ Task 2:
Now in the M-file, define a variable A holding the matrix
A = [ 2, 1; -1, 1 ];
and compute the singular value decomposition of this matrix using the svd function:
[U,S,V] = svd(A);
Using the workspace window of the Matlab main environment, check out the matrices
U, S, V. Perform the commands
U’*U
V’*V
to ascertain that the matrices U and V are orthogonal. The output should produce 2 × 2
identity matrices.


MATLAB Code:

➢ Task 3:
Next, let us observe the geometric meaning of the individual matrices U, Σ, V (U, S, V
in our Matlab code) in the singular value decomposition. To do this, observe the
transformations induced by these matrices on a unit circle and the vectors of the
standard basis e1, e2. Let us start by multiplying the coordinates of the points of the
circle and the vectors e1, e2 by the matrix V. Execute the following code:
VX=V’*X;
subplot(2,2,2)
hold on;
plot(VX(1,:),VX(2,:),'b');
quiver(0,0,V(1,1),V(1,2),0,'r');
quiver(0,0,V(2,1),V(2,2),0,'g');
axis equal
title('Multiplied by matrix V^T')
hold off;
Observe that the matrix VX contains the vectors of the matrix X transformed by the
multiplication by the matrix VT. Since the matrices V and VT are orthogonal,
multiplication by the matrix VT is equivalent to rotation of a plane, possibly in
combination with a reflection along some straight line. This allows us to conjecture
that the image of the unit circle under this mapping will still be a unit circle, but the
vectors of the basis will be rotated and possibly switched in orientation.


MATLAB Code:

Result:

➢ Task 4:
Now, let us multiply the result from the previous step by the matrix Σ (S in the Matlab
code). Observe that since the matrix Σ is diagonal, multiplication by this matrix
geometrically means stretching the plane in two directions. To verify this, execute
the following Matlab code:
SVX = S*VX;
subplot(2,2,3);


hold on;
plot(SVX(1,:),SVX(2,:),'b');
quiver(0,0,S(1,1)*V(1,1),S(2,2)*V(1,2),0,'r');
quiver(0,0,S(1,1)*V(2,1),S(2,2)*V(2,2),0,'g');
axis equal
title('Multiplied by matrix \Sigma V^T')
hold off;
Observe that, as expected, the unit circle is stretched and becomes an ellipse. The images of the standard basis vectors are stretched as well.
MATLAB Code:

Result:


➢ Task 5:
Finally, multiply the results from the last step by the matrix U to obtain:
AX = U*SVX;
subplot(2,2,4)
hold on;
plot(AX(1,:),AX(2,:),'b');
quiver(0,0,U(1,1)*S(1,1)*V(1,1)+U(1,2)*S(2,2)*V(1,2),U(2,1)*S(1,1)*V(1,1)+...
U(2,2)*S(2,2)*V(1,2),0,'r');
quiver(0,0,U(1,1)*S(1,1)*V(2,1)+U(1,2)*S(2,2)*V(2,2),U(2,1)*S(1,1)*V(2,1)+...
U(2,2)*S(2,2)*V(2,2),0,'g');
axis equal
title('Multiplied by matrix U\Sigma V^T=A')
hold off;
Observe that the result is equivalent to multiplying the initial vector X by the matrix A.
Since the matrix U is orthogonal, then the multiplication by this matrix should result in
a rotation of the plane possibly combined with a reflection. Confirm this by observing
the images of the basis vectors.
MATLAB Code:


Result:

➢ Task 6:
If you answered yes to both Q1 and Q2 above, can you modify the matrices U and V in
such a way that no reflections of the plane occur? Produce the modified matrices U1
and V1 and confirm that U1*S*V1'=A. Observe that this shows that the singular value
decomposition is not unique: it is possible to modify the matrices U and V in such a
way that there is no plane reflection.

Firstly, we will create a matrix A:

Calculating , we obtain:

To find eigenvalues of matrix , we apply the equation :


From that:

or

We have: and
Therefore, diagonal matrix S is:

Now we find matrix V, we know that each column of V is an eigenvector of , so


we will find the eigenvectors of :

For , we have:

Solving the system of equations, we have: = -1 and = -0.3

Normalizing, we have: and

(1)

For , we have:


Solving the system of equations, we have: = -0.3 and =1

Normalizing, we have: and

(2)
From (1) and (2), we have matrix V:

Similarly, we will work with to find matrix U:


We obtain:

Based on the problem requirements, I will modify the two matrices U and V into U1
and V1 by negating the second column of each matrix:

(These matrices don’t cause reflection)

Next, we will check whether A= or not. We have:


We can also use MATLAB:


Result:

In this code, a tolerance value is used to decide whether the difference between U1*S*V1' and A is small enough to consider the matrices equal. Since the difference is smaller than the tolerance (i.e., effectively zero), U1 and V1 reproduce A without any reflection of the plane.
In conclusion, there exist matrices U1 and V1 such that the plane is not reflected; therefore, the singular value decomposition is not unique.
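The sign-flip trick described above is easy to verify numerically. The sketch below (an illustration added for this report, not the report's original MATLAB code) negates the second column of U and the second column of V together and checks that the product is unchanged.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [-1.0, 1.0]])
U, s, Vt = np.linalg.svd(A)
S = np.diag(s)

# Negate column 2 of U and row 2 of Vt together: the two sign changes
# cancel in the product U @ S @ Vt, so it still equals A.
flip = np.diag([1.0, -1.0])
U1, Vt1 = U @ flip, flip @ Vt

assert np.allclose(U1 @ S @ Vt1, A)   # still a valid decomposition of A
assert not np.allclose(U1, U)         # but with different factors
print("SVD factors are not unique")
```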


➢ Task 7:

We can check this by MATLAB easily:


Result:

➢ Task 8:
Loading the image by adding commands:
%% Image compression
clear;
ImJPG=imread('einstein.jpg');
figure; imshow(ImJPG);
[m,n]=size(ImJPG);


Result:


➢ Task 9:
Perform a singular value decomposition of the matrix ImJPG and save the output in the matrices UIm, SIm, and VIm.
MATLAB code:

Result:



➢ Task 10:

MATLAB code:

Result:

Notice that the singular values are decreasing and always remain ≥ 0.


➢ Task 11:


MATLAB Code:

Result:

➢ Task 12:
Observe that the singular value decomposition can also be used to smooth noisy data,
especially if the data contains patterns. Data smoothing is often necessary because all
measurements contain small errors, resulting in "noise". This noise usually determines
the smallest singular values of the matrix. Dropping these small values thus not only
saves storage space but also helps eliminate noise from the data.
Start a new code cell. Load the file checkers.pgm into Matlab and add some noise to
the resulting image matrix using the following code:
%% Noise filtering
clear;


ImJPG=imread('checkers.pgm');
[m,n]=size(ImJPG);
ImJPG_Noisy=double(ImJPG)+50*(rand(m,n)-0.5*ones(m,n));
figure;
imshow(ImJPG);
figure;
imshow(uint8(ImJPG_Noisy));
MATLAB Code:

Result:


➢ Task 13:
Compute the SVD of the matrix ImJPG_Noisy and save the resulting decomposition matrices as UIm, SIm, and VIm.
MATLAB Code:

Result:


➢ Task 14:
Compute the approximations of the initial image with k = 10, k = 30, and k = 50
singular values. Display the resulting approximations and compare them to the "noisy"
image. Observe that SVD significantly reduces the noise. Compare the images to the
initial image without noise. Observe also that even though SVD reduces the noise, it
also somewhat blurs the image.
MATLAB Code:

Result:
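The denoising idea of Tasks 12-14 can also be sketched in NumPy (an illustration under assumed parameters, not the report's MATLAB code; the checkerboard pattern and noise amplitude are stand-ins for checkers.pgm and the noise used above). A low-rank pattern plus noise, truncated back to low rank, ends up closer to the clean pattern than the noisy version.

```python
import numpy as np

rng = np.random.default_rng(0)

# A checkerboard-like pattern (rank <= 2) playing the role of checkers.pgm
x = np.where(np.arange(64) // 8 % 2 == 0, 0.0, 255.0)
clean = np.abs(x[:, None] - x[None, :])         # 64x64 checkerboard
noisy = clean + 50.0 * (rng.random((64, 64)) - 0.5)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
k = 2                                           # keep only the dominant components
denoised = U[:, :k] * s[:k] @ Vt[:k, :]

err_noisy = np.linalg.norm(noisy - clean)
err_denoised = np.linalg.norm(denoised - clean)
print(err_noisy, err_denoised)                  # truncation removes most of the noise
assert err_denoised < err_noisy
```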


III. Conclusion:
In conclusion, Singular Value Decomposition (SVD) is a crucial concept in linear
algebra, widely used in various scientific and engineering fields. It breaks down a
matrix into three simpler parts: two orthogonal matrices, U and V, and a diagonal
matrix, Σ, of singular values. This breakdown helps solve complex linear algebra
problems and provides deep insights into the original matrix's structure.
SVD's importance comes from its robustness and versatility. It is used in data
compression, noise reduction, and dimensionality reduction, such as in Principal
Component Analysis (PCA). In image processing, SVD allows efficient image
compression by keeping only the most significant data, reducing storage needs without
losing much quality.
This project has highlighted several key algebra concepts. First, understanding
orthogonality is crucial, as it helps preserve vector lengths and ensures numerical
stability in calculations. The orthogonal matrices U and V in SVD play a key role in
maintaining these properties, making the decomposition reliable.
Second, learning about singular values, which are related to the eigenvalues of the
matrices A^TA and AA^T, has shown how matrices transform space. Singular values measure
how much a matrix stretches different directions, helping us understand matrix rank
and condition number.
Finally, the project has emphasized the practical importance of matrix factorization and
diagonalization techniques. Seeing how SVD generalizes the concept of
eigendecomposition from square matrices to rectangular ones has been particularly
enlightening, showing how SVD applies to many real-world problems where data
matrices aren't always square.
In summary, studying SVD has demonstrated both its theoretical depth and its practical
usefulness in solving complex problems. The knowledge gained, especially about
orthogonality, singular values, and matrix factorization, is essential for advanced
studies and applications in linear algebra.


IV. References:
[1] (for Theory): Gilbert Strang (1980), Linear Algebra and its Applications – Second
Edition
[2] (for MATLAB): MATLAB Help Center. Link:
https://www.mathworks.com/help/?s_tid=gn_supp
[3] (for Theory & Conclusion): SVD Visualized, Singular Value Decomposition explained | SEE Matrix, Chapter 3. Link: https://youtu.be/vSczTbgc8Rc
