Linear Algebra & Geometry - Stanford University

Transcription

Linear Algebra & Geometry
Why is linear algebra useful in computer vision?
References:
- Any book on linear algebra!
- [HZ] – chapters 2, 4
Some of the slides in this lecture are courtesy of Prof. Octavia I. Camps, Penn State University.

Why is linear algebra useful in computer vision?
- Representation
  – 3D points in the scene
  – 2D points in the image
- Coordinates will be used to
  – Perform geometrical transformations
  – Associate 3D with 2D points
- Images are matrices of numbers
  – Find properties of these numbers

Agenda
1. Basic definitions and properties
2. Geometrical transformations
3. SVD and its applications

Vectors (i.e., 2D or 3D vectors)
P = [x, y, z]: a point in the 3D world
p = [x, y]: a point in the image

Vectors (i.e., 2D vectors)
A vector v = (x1, x2), at angle θ from the x1 axis.
Magnitude: ||v|| = sqrt(x1^2 + x2^2)
If ||v|| = 1, v is a UNIT vector.
v / ||v|| is a unit vector.
Orientation: θ = arctan(x2 / x1)
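
A quick NumPy sketch of these formulas (not part of the original slides; the vector values are illustrative):

    import numpy as np

    v = np.array([3.0, 4.0])          # v = (x1, x2)
    magnitude = np.linalg.norm(v)     # ||v|| = sqrt(x1^2 + x2^2) -> 5.0
    theta = np.arctan2(v[1], v[0])    # orientation in radians
    unit = v / magnitude              # v / ||v|| has magnitude 1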

Vector Addition
v + w: place the tail of w at the tip of v (parallelogram rule).
If v = (x1, y1) and w = (x2, y2), then v + w = (x1 + x2, y1 + y2).

Vector Subtraction
v - w: the vector from the tip of w to the tip of v.
If v = (x1, y1) and w = (x2, y2), then v - w = (x1 - x2, y1 - y2).

Scalar Product
a·v scales the magnitude of v by a while keeping its direction (reversing it if a < 0).
If v = (x1, x2), then a·v = (a·x1, a·x2).

Inner (dot) Product
v · w = ||v|| ||w|| cos(α), where α is the angle between v and w.
In components: v · w = x1·x2 + y1·y2.
The inner product is a SCALAR!
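
A minimal NumPy illustration of both forms of the dot product (values are illustrative, not from the slides):

    import numpy as np

    v = np.array([1.0, 0.0])
    w = np.array([1.0, 1.0])
    dot = v @ w                        # x1*x2 + y1*y2: a SCALAR
    cos_alpha = dot / (np.linalg.norm(v) * np.linalg.norm(w))
    alpha = np.arccos(cos_alpha)       # angle between v and w (pi/4 here)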

Orthonormal Basis
i = (1, 0) and j = (0, 1) are unit vectors along the x1 and x2 axes, with i ⊥ j.
Any vector v (to a point P) can be written as v = x1·i + x2·j.

Vector (cross) Product
u = v × w
The cross product is a VECTOR!
Magnitude: ||u|| = ||v × w|| = ||v|| ||w|| sin(α)
Orientation: u is orthogonal to both v and w, following the right-hand rule.

Vector Product Computation
i = (1, 0, 0), j = (0, 1, 0), k = (0, 0, 1)
||i|| = ||j|| = ||k|| = 1
i × j = k, k × i = j, j × k = i
x × y = (x2·y3 - x3·y2)·i + (x3·y1 - x1·y3)·j + (x1·y2 - x2·y1)·k
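
A short NumPy check of these identities (np.cross implements the component formula above; the example is mine):

    import numpy as np

    i = np.array([1.0, 0.0, 0.0])
    j = np.array([0.0, 1.0, 0.0])
    k = np.cross(i, j)                 # i x j = k = (0, 0, 1)
    # the result is orthogonal to both inputs, with magnitude
    # ||i|| * ||j|| * sin(alpha) = 1 here
    assert np.allclose(k, [0.0, 0.0, 1.0])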

Matrices
An n×m matrix:
A = [ a11 a12 ... a1m
      a21 a22 ... a2m
      ...
      an1 an2 ... anm ]
e.g., each entry aij can be a pixel's intensity value.
Sum: C = A + B with cij = aij + bij.
A and B must have the same dimensions!
Example: [1 2; 3 4] + [5 6; 7 8] = [6 8; 10 12]

Matrices
Product: C = A·B with cij = Σk aik·bkj.
A and B must have compatible dimensions: an n×m matrix can multiply an m×p matrix, giving an n×p result.
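
A small NumPy sketch of dimension compatibility (the shapes are illustrative):

    import numpy as np

    A = np.ones((2, 3))    # 2x3
    B = np.ones((3, 4))    # 3x4: columns of A match rows of B
    C = A @ B              # 2x4 result; c_ij = sum_k a_ik * b_kj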

Matrix Transpose
Definition: C (m×n) = A^T, where A is n×m and cij = aji.
Identities:
(A + B)^T = A^T + B^T
(A·B)^T = B^T·A^T
If A = A^T, then A is symmetric.

Matrix Determinant
A useful value computed from the elements of a square matrix A:
det [a11] = a11
det [a11 a12; a21 a22] = a11·a22 - a12·a21
det [a11 a12 a13; a21 a22 a23; a31 a32 a33]
  = a11·a22·a33 + a12·a23·a31 + a13·a21·a32
  - a13·a22·a31 - a23·a32·a11 - a33·a12·a21

Matrix Inverse
Does not exist for all matrices; it is necessary (but not sufficient) that the matrix is square.
A·A^-1 = A^-1·A = I
For A = [a11 a12; a21 a22]:
A^-1 = (1 / det A) · [a22 -a12; -a21 a11], provided det A ≠ 0.
If det A = 0, A does not have an inverse.
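
A NumPy sketch of the determinant and inverse for a 2x2 example (the matrix is illustrative; np.linalg.inv raises an error when det A = 0):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    d = np.linalg.det(A)               # a11*a22 - a12*a21 = 5
    A_inv = np.linalg.inv(A)           # only defined since det A != 0
    assert np.allclose(A @ A_inv, np.eye(2))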

2D Geometrical Transformations

2D Translation
P' = P + t

2D Translation Equation
P = (x, y), t = (tx, ty)
P' = P + t = (x + tx, y + ty)

2D Translation using Matrices
P' = [x + tx; y + ty] = [1 0 tx; 0 1 ty] · [x; y; 1]

Homogeneous Coordinates
Multiply the coordinates by a non-zero scalar and add an extra coordinate equal to that scalar. For example:
(x, y) → (c·x, c·y, c), with c ≠ 0

Back to Cartesian Coordinates
Divide by the last coordinate and eliminate it. For example:
(x, y, 1) → (x, y)
NOTE: in our example the scalar was 1.

2D Translation using Homogeneous Coordinates
P' = [x + tx; y + ty; 1] = [1 0 tx; 0 1 ty; 0 0 1] · [x; y; 1] = T · P
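
A minimal NumPy sketch of translation in homogeneous coordinates (tx, ty and the point are illustrative):

    import numpy as np

    tx, ty = 2.0, 1.0
    T = np.array([[1.0, 0.0, tx],
                  [0.0, 1.0, ty],
                  [0.0, 0.0, 1.0]])
    P = np.array([4.0, 5.0, 1.0])      # (x, y) in homogeneous coordinates
    P2 = T @ P                         # P' = (x + tx, y + ty, 1)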

Scaling
P' = S · P

Scaling Equation
P = (x, y), P' = (sx·x, sy·y)
P' = [sx 0; 0 sy] · P = S · P

Scaling & Translating
P' = S · P
P'' = T · P' = T · (S · P) = (T · S) · P = A · P

Scaling & Translating
A = T · S = [1 0 tx; 0 1 ty; 0 0 1] · [sx 0 0; 0 sy 0; 0 0 1] = [sx 0 tx; 0 sy ty; 0 0 1]

Translating & Scaling = Scaling & Translating?
No: matrix multiplication does not commute in general, so S · T ≠ T · S (see the sketch below).
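
A quick NumPy check of the question above (illustrative values): the two orders give different matrices, because matrix multiplication does not commute.

    import numpy as np

    S = np.diag([2.0, 3.0, 1.0])       # scale by (sx, sy) = (2, 3)
    T = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0]])    # translate by (tx, ty) = (1, 1)
    print(np.allclose(T @ S, S @ T))   # False: the order matters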

Rotation
P' = R · P

Rotation Equations
Counter-clockwise rotation by an angle θ:
x' = x·cos θ - y·sin θ
y' = x·sin θ + y·cos θ
P' = R · P, with R = [cos θ  -sin θ; sin θ  cos θ]
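
A NumPy sketch of the rotation equations (the 90-degree angle is my choice of example):

    import numpy as np

    theta = np.pi / 2                  # 90-degree counter-clockwise rotation
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    P = np.array([1.0, 0.0])
    P2 = R @ P                         # approximately (0, 1)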

Degrees of Freedom
R is 2×2, so it has 4 elements, but a 2D rotation has only 1 degree of freedom (the angle θ).
Note: R belongs to the category of normal matrices and satisfies many interesting properties, for example:
R · R^T = R^T · R = I
det(R) = 1

Rotation, Scaling & Translation
P' = (T · R · S) · P
If sx = sy, this is a similarity transformation!

Transformation in 2D
- Isometries
- Similarities
- Affinities
- Projective

Transformation in 2D
Isometries [Euclidean]:
- Preserve distances (and areas)
- 3 DOF
- Regulate the motion of rigid objects

Transformation in 2D
Similarities:
- Preserve:
  - ratios of lengths
  - angles
- 4 DOF

Transformation in 2D
Affinities:
P' = [a11 a12 tx; a21 a22 ty; 0 0 1] · P

Transformation in 2D
Affinities:
- Preserve:
  - parallel lines
  - ratios of areas
  - ratios of lengths on collinear lines
  - others
- 6 DOF

Transformation in 2D
Projective:
- 8 DOF
- Preserve:
  - cross ratio of 4 collinear points
  - collinearity
  - and a few others

Eigenvalues and Eigenvectors
An eigenvalue λ and eigenvector u satisfy
A·u = λ·u
where A is a square matrix.
- Multiplying u by A scales u by λ.

Eigenvalues and Eigenvectors
Rearranging the previous equation gives the system
A·u - λ·u = (A - λI)·u = 0
which has a non-trivial solution if and only if det(A - λI) = 0.
- The eigenvalues are the roots of this determinant, which is a polynomial in λ.
- Substitute each eigenvalue back into A·u = λ·u and solve to obtain the corresponding eigenvector.
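
A NumPy sketch of this procedure; np.linalg.eig returns the eigenvalues and eigenvectors together (the matrix is an illustrative example):

    import numpy as np

    A = np.array([[2.0, 0.0],
                  [0.0, 3.0]])
    lam, U = np.linalg.eig(A)          # eigenvalues; eigenvectors as columns of U
    u = U[:, 0]
    assert np.allclose(A @ u, lam[0] * u)   # A u = lambda u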

Singular Value Decomposition
U·Σ·V^T = A
where U and V are orthogonal matrices, and Σ is a diagonal matrix (of singular values). For example, see the sketch below.
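
A minimal NumPy sketch of the decomposition (the random matrix is only illustrative):

    import numpy as np

    A = np.random.rand(4, 3)
    U, s, Vt = np.linalg.svd(A)        # s holds the diagonal of Sigma
    Sigma = np.zeros((4, 3))
    Sigma[:3, :3] = np.diag(s)
    assert np.allclose(U @ Sigma @ Vt, A)   # A = U Sigma V^T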


A Numerical Example
Look at how the multiplication works out, left to right:
- Column 1 of U gets scaled by the first value from Σ.
- The resulting vector gets scaled by row 1 of V^T to produce a contribution to the columns of A.

A Numerical Example
Each product of (column i of U) · (value i from Σ) · (row i of V^T) produces a rank-one matrix that contributes to A; summing these contributions over i reconstructs A.

A Numerical Example
We can look at Σ to see that the first column has a large effect, while the second column has a much smaller effect in this example.

SVD Applications
- For this image, using only the first 10 of 300 singular values produces a recognizable reconstruction.
- So, SVD can be used for image compression.
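
A hedged sketch of rank-k compression via SVD, assuming a grayscale image stored as a 2D array (the helper name low_rank is mine, not from the slides):

    import numpy as np

    def low_rank(A, k):
        """Rank-k approximation: keep only the k largest singular values."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # e.g., low_rank(img, 10) keeps 10 of the singular values, giving a
    # recognizable but much more compactly representable image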

Principal Component Analysis
- Remember: the columns of U are the Principal Components of the data, i.e., the major patterns that can be added together to produce the columns of the original matrix.
- One use of this is to construct a matrix where each column is a separate data sample.
- Run SVD on that matrix, and look at the first few columns of U to see patterns that are common among the columns.
- This is called Principal Component Analysis (or PCA) of the data samples.

Principal Component Analysis
- Often, raw data samples have a lot of redundancy and patterns.
- PCA can allow you to represent data samples as weights on the principal components, rather than using the original raw form of the data.
- By representing each sample as just those weights, you can represent just the "meat" of what's different between samples.
- This minimal representation makes machine learning and other algorithms much more efficient.
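
A minimal PCA-via-SVD sketch following the slides' convention that each column of the matrix is one data sample (shapes and names are illustrative):

    import numpy as np

    X = np.random.rand(100, 20)              # 20 samples of dimension 100
    Xc = X - X.mean(axis=1, keepdims=True)   # center each feature (row)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = U[:, :3]                    # first 3 principal components
    weights = components.T @ Xc              # each sample as just 3 weights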