Linear Algebra - Part II: Projection, Eigendecomposition, SVD
Punit Shah
◮ Symmetric Matrix:
A = A^T
◮ Orthogonal Matrix:
A^T A = AA^T = I, equivalently A^{-1} = A^T
◮ L2 Norm:
||x||_2 = √(Σ_i x_i²)
◮ Diagonal Matrix: diag(v) denotes a square matrix with the entries of the vector v along its diagonal.
◮ Multiplying vector x by a diagonal matrix is efficient:
diag(v)x = v ⊙ x
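A quick NumPy sketch of this identity (the values of v and x below are just illustrative):

```python
import numpy as np

v = np.array([2.0, 3.0, 4.0])
x = np.array([1.0, 1.0, 1.0])

# Full matrix-vector multiply: builds an n x n matrix, O(n^2) work
full = np.diag(v) @ x

# Elementwise product: O(n) work, same result
fast = v * x

assert np.allclose(full, fast)
```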
◮ Inverting a square diagonal matrix is efficient:
diag(v)^{-1} = diag([1/v_1, . . . , 1/v_n]^T)
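The same shortcut in NumPy: inverting the diagonal matrix reduces to taking elementwise reciprocals (example values are illustrative):

```python
import numpy as np

v = np.array([2.0, 4.0, 5.0])

# General-purpose inverse: O(n^3)
slow = np.linalg.inv(np.diag(v))

# Reciprocal of each diagonal entry: O(n)
fast = np.diag(1.0 / v)

assert np.allclose(slow, fast)
```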
◮ Determinant: det(A) or |A|
◮ Measures how much multiplication by the matrix expands
or contracts the space.
det(AB) = det(A)det(B)
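A numerical check of the multiplicative property on random matrices (sizes and seed chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# The determinant is multiplicative: det(AB) = det(A) det(B)
lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)

assert np.isclose(lhs, rhs)
```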
If det(A) = 0, then:
◮ the columns of A are linearly dependent, so A is singular (not invertible).
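A small example of this: a matrix whose second column is a multiple of the first has determinant zero (the specific matrix is illustrative):

```python
import numpy as np

# Second column is twice the first: columns are linearly dependent
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

assert np.isclose(np.linalg.det(A), 0.0)
```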
◮ Eigendecomposition of a square matrix A:
A = V diag(λ) V^{-1}
where the columns of V are the eigenvectors of A and λ is the vector of eigenvalues.
◮ An eigenvector v of A satisfies:
Av = λv
where the scalar λ is the corresponding eigenvalue; eigenvectors are conventionally normalized so that ||v|| = 1.
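Checking Av = λv numerically for a small symmetric matrix (chosen for illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# w holds the eigenvalues; the columns of V are the eigenvectors
w, V = np.linalg.eig(A)

for i in range(len(w)):
    v = V[:, i]
    # Av = lambda * v
    assert np.allclose(A @ v, w[i] * v)
    # NumPy returns eigenvectors normalized to unit length
    assert np.isclose(np.linalg.norm(v), 1.0)
```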
Av = λv
Av − λv = 0
(A − λI)v = 0
◮ A nonzero solution for v exists only if A − λI is singular, i.e.:
det(A − λI) = 0
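This characteristic equation can be verified directly: for the symmetric matrix below (used purely as an example), the eigenvalues are 3 and 1, and det(A − λI) vanishes at each one:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The eigenvalues of this A are 3 and 1
for lam in (3.0, 1.0):
    # det(A - lambda*I) = 0 exactly at the eigenvalues
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
```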
A = Vdiag(λ)V−1
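Reassembling A from its eigendecomposition in NumPy (the matrix is an arbitrary diagonalizable example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

w, V = np.linalg.eig(A)

# Rebuild A as V diag(lambda) V^{-1}
A_rebuilt = V @ np.diag(w) @ np.linalg.inv(V)

assert np.allclose(A, A_rebuilt)
```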
◮ Singular Value Decomposition (SVD):
A = UDV^T
◮ If A is m × n, then U is m × m, D is m × n, and V is n × n.
◮ U and V are orthogonal, and D is diagonal with the singular values of A on its diagonal.
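A shape check with NumPy. Note that np.linalg.svd returns the singular values as a 1-D array s rather than the full m × n matrix D, so D has to be assembled before reconstructing A (the matrix below is a random example):

```python
import numpy as np

m, n = 4, 3
A = np.random.default_rng(1).standard_normal((m, n))

# full_matrices=True gives U (m x m) and V^T (n x n)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

assert U.shape == (m, m) and Vt.shape == (n, n)

# Embed s into an m x n matrix D, then A = U D V^T
D = np.zeros((m, n))
D[:n, :n] = np.diag(s)

assert np.allclose(A, U @ D @ Vt)
```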