Linear Algebra Review - Yxw.cs.illinois.edu

Transcription

CS445 Computational Photography: Linear Algebra Review. By Yuanyi Zhong. Original slides by Yuan Shen.

- Goals of this session:
  - Know systems of linear equations in matrix form
  - Know basic linear algebra notations (vectors and matrices)
  - Know matrix properties (norm, inverse, pseudo-inverse, transpose, rank, eigenvalue, etc.)
  - Know how to set up systems of linear equations and solve them
  - Know how to get the SVD decomposition of a matrix (Eric on Friday)
  - Know how to use Jupyter notebook and numpy

- Reading Material for Linear Algebra:
  - Linear algebra PDF
  - YouTube channel (3Blue1Brown) with visualizations: https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw
  - CS 357 course website: ge/schedule/

- System of Linear Equations in Matrix Form

- System of Linear Equations in Matrix Form: interchanging rows has no effect on the solutions.
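
To make the matrix form concrete, here is a minimal numpy sketch; the 2×2 system below is my own illustrative example, not one from the slides. It solves $Ax = b$ and checks that interchanging rows leaves the solution unchanged.

```python
import numpy as np

# Coefficient matrix and right-hand side for the (illustrative) system
#   2x + 1y = 5
#   1x + 3y = 10
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)          # solves A x = b for a square, non-singular A
print(x)                           # -> [1. 3.]

# Interchanging the rows of [A | b] does not change the solution
perm = [1, 0]
x_perm = np.linalg.solve(A[perm], b[perm])
print(np.allclose(x, x_perm))      # -> True
```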

- Quiz: If a square matrix $A \in \mathbb{R}^{n \times n}$ has n linearly independent eigenvectors, which of the following is/are true?
  A. The matrix is full rank
  B. The matrix is invertible
  C. The matrix is diagonalizable
  D. The determinant of A is not equal to 0
  E. The number 0 is not an eigenvalue of A

- Basic Notations: The i-th element of a vector $x$ is denoted $x_i$. A matrix $A \in \mathbb{R}^{m \times n}$ has m rows and n columns.
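
A small numpy illustration of these notations; the particular vector and matrix are arbitrary examples of mine. Note that numpy uses 0-based indexing, while the slides write $x_i$ with 1-based indices.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])      # a vector in R^3; x[0] is the element x_1 (0-based indexing)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])    # a matrix in R^{2x3}

print(x[0])      # -> 1.0
print(A.shape)   # -> (2, 3): m = 2 rows, n = 3 columns
```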

Vector

- Vector Norm: A norm of a vector x is informally a measure of the "length" of the vector. In particular, the $\ell_2$-norm or Euclidean norm is $\|x\|_2 = \sqrt{\sum_{i=1}^{n} x_i^2}$.
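
A quick sketch of computing norms with numpy (the vector is an arbitrary example); `np.linalg.norm` defaults to the Euclidean norm.

```python
import numpy as np

x = np.array([3.0, 4.0])

# l2 (Euclidean) norm: square root of the sum of squared entries
print(np.linalg.norm(x))             # -> 5.0
print(np.sqrt(np.sum(x**2)))         # same value, computed explicitly

# other norms, for comparison
print(np.linalg.norm(x, 1))          # l1 norm -> 7.0
print(np.linalg.norm(x, np.inf))     # max norm -> 4.0
```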

- Vector-vector inner product: For the inner product $x^T y = \sum_i x_i y_i$, the vectors $x$ and $y$ must have the same dimension.

- Vector-vector outer product: For the outer product $x y^T$, the vectors $x$ and $y$ do not have to have the same dimension.
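
Both products can be checked in numpy as below; the vectors are illustrative picks of mine.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])
z = np.array([1.0, -1.0])

# Inner product: requires equal dimensions, returns a scalar
print(np.dot(x, y))        # -> 32.0  (1*4 + 2*5 + 3*6)

# Outer product: dimensions may differ, returns an m x n matrix
print(np.outer(x, z))      # -> 3 x 2 matrix with entries x_i * z_j
```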

Matrix

- Matrix Multiplication: Dimensions must match! If $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, then $C = AB \in \mathbb{R}^{m \times p}$.
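
A shape-checking sketch with numpy (dimensions chosen arbitrarily) showing that the inner dimensions must agree.

```python
import numpy as np

m, n, p = 2, 3, 4
A = np.ones((m, n))        # m x n
B = np.ones((n, p))        # n x p

C = A @ B                  # inner dimensions (n) must match
print(C.shape)             # -> (2, 4), i.e. m x p

# Mismatched inner dimensions raise an error:
# np.ones((2, 3)) @ np.ones((2, 4))  -> ValueError
```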

- Matrix Properties

- Matrix Transpose

- Matrix Rank. Definition: the number of linearly independent columns/rows of A.

- Matrix Inverse
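
Putting the transpose, rank, inverse, and pseudo-inverse together, here is a small numpy sketch; the matrices are illustrative examples, not taken from the slides.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(A.T)                          # transpose
print(np.linalg.matrix_rank(A))     # rank -> 2 (full rank)

A_inv = np.linalg.inv(A)            # inverse (square, non-singular A only)
print(np.allclose(A @ A_inv, np.eye(2)))   # -> True

# The pseudo-inverse also handles rectangular or rank-deficient matrices
B = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])          # 3 x 2, rank 1
print(np.linalg.pinv(B).shape)      # -> (2, 3)
```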

- Matrix eigenvalue and eigenvector: We usually normalize the eigenvector so that $\|v\|_2 = 1$.
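
A brief numpy check, with an arbitrary diagonal example matrix, that `np.linalg.eig` returns unit-norm eigenvectors satisfying $Av = \lambda v$.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

vals, vecs = np.linalg.eig(A)       # columns of `vecs` are the eigenvectors
print(vals)                         # -> [2. 3.]

# numpy returns eigenvectors scaled to unit l2 norm
print(np.linalg.norm(vecs[:, 0]))   # -> 1.0

# A v = lambda v for each eigenpair
print(np.allclose(A @ vecs[:, 0], vals[0] * vecs[:, 0]))   # -> True
```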

- Diagonalizable Matrices. Definition: For a matrix $A \in \mathbb{R}^{n \times n}$, if $A$ has n linearly independent eigenvectors, then $A$ is said to be diagonalizable. In other words, $A = U D U^{-1}$, where the columns of $U$ are the eigenvectors of $A$ and $D$ is a diagonal matrix of the corresponding eigenvalues. Also known as eigendecomposition or spectral decomposition.

- Diagonalizable Matrices (example from CS357, Spring 2019).
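
The eigendecomposition $A = U D U^{-1}$ can be verified numerically; the matrix below is an illustrative choice with distinct eigenvalues, so it is guaranteed to be diagonalizable.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

vals, U = np.linalg.eig(A)          # this A has 2 linearly independent eigenvectors
D = np.diag(vals)

# Eigendecomposition: A = U D U^{-1}
print(np.allclose(A, U @ D @ np.linalg.inv(U)))   # -> True
```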

- Quiz (revisited): If a square matrix $A \in \mathbb{R}^{n \times n}$ has n linearly independent eigenvectors, which of the statements A through E above is/are true?
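
One way to probe the quiz statements numerically is with a matrix of my own choosing that has n linearly independent eigenvectors but is singular; this sketch only checks that specific example, it is not a proof.

```python
import numpy as np

# A diagonal matrix: its eigenvectors are the standard basis vectors,
# i.e. n linearly independent eigenvectors, yet one eigenvalue is 0.
A = np.diag([1.0, 0.0])

vals, vecs = np.linalg.eig(A)
print(np.linalg.matrix_rank(vecs))          # -> 2: the eigenvectors are linearly independent
print(np.linalg.matrix_rank(A))             # -> 1: A is not full rank
print(np.linalg.det(A))                     # -> 0.0: the determinant can be zero
print(np.any(np.isclose(vals, 0.0)))        # -> True: 0 can be an eigenvalue
# np.linalg.inv(A) would raise LinAlgError here, since A is singular.
# Diagonalizability, however, does follow from having n independent eigenvectors.
```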

- Matrix Singular Value Decomposition. Definition: factorization of a real (or complex) matrix into three matrices, $A = U \Sigma V^T$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix. Orthogonal (orthonormal) matrix: a real square matrix whose columns and rows are orthogonal unit vectors; the inverse of an orthogonal matrix is its transpose. Applications: pseudoinverse, PCA, etc.
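
A minimal numpy sketch of computing the SVD (the matrix is an arbitrary example): `np.linalg.svd` returns $U$, the singular values, and $V^T$, and the factors can be checked for orthogonality and reconstruction.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, S, Vt = np.linalg.svd(A)                 # full SVD: U is 3x3, Vt is 2x2, S holds singular values
Sigma = np.zeros(A.shape)
Sigma[:len(S), :len(S)] = np.diag(S)

print(np.allclose(A, U @ Sigma @ Vt))       # -> True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(3)))      # -> True: U is orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(2)))    # -> True: V is orthogonal
```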

- How to calculate the decomposed matrices? Start from $A = U \Sigma V^T$, so $A^T = V \Sigma^T U^T$. Then $A^T A = V \Sigma^T U^T U \Sigma V^T = V \Sigma^T \Sigma V^T$ (using $U^T U = I$). Let $D = \Sigma^T \Sigma$, so $A^T A = V D V^T$. This is exactly the form of a diagonalization! It indicates that the columns of $V$ are the eigenvectors of $A^T A$, and the square roots of the entries of $D$ give the values of $\Sigma$.

- Matrix Singular Value Decomposition (not reduced): The columns of $U$ are the eigenvectors of $M M^T$. The diagonal entries of $S$ (the singular values) are the square roots of the eigenvalues of $M^T M$. The columns of $V$ are the eigenvectors of $M^T M$.
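
The connection between the SVD and the eigendecompositions of $M M^T$ and $M^T M$ can be verified numerically; the matrix below is an illustrative pick with distinct singular values, so the eigenvectors are determined up to sign.

```python
import numpy as np

M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])

U, S, Vt = np.linalg.svd(M)

# Squared singular values equal the (nonzero) eigenvalues of M^T M (and of M M^T)
eigvals_MtM = np.sort(np.linalg.eigvalsh(M.T @ M))[::-1]
print(np.allclose(S**2, eigvals_MtM[:len(S)]))    # -> True

# Columns of U are eigenvectors of M M^T (up to sign)
_, W = np.linalg.eigh(M @ M.T)
W = W[:, ::-1]                                    # eigh sorts ascending; reverse to match descending S
print(np.allclose(np.abs(U), np.abs(W)))          # -> True, up to sign
```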

- Goals of this session (recap):
  - Know systems of linear equations in matrix form
  - Know basic linear algebra notations (vectors and matrices)
  - Know matrix properties (norm, inverse, pseudo-inverse, transpose, rank, eigenvalue, etc.)
  - Know how to set up systems of linear equations and solve them
  - Know how to get the SVD decomposition of a matrix (Eric on Friday)
  - Know how to use Jupyter notebook and numpy