Optimization Models: Exercises 2

This document outlines 5 exercises for a homework assignment on topics in linear algebra and least-squares problems: 1. Derive expressions for the trace and Frobenius norm of a symmetric positive semi-definite matrix in terms of its eigenvalues, and prove an inequality relating the trace, Frobenius norm, and rank. 2. Interpret the solution to a least-squares problem geometrically using the SVD of the data matrix. 3. Derive the singular values of an augmented matrix and its SVD. 4. Give expressions for induced matrix norms in terms of the entries of the matrix. 5. Show that solving a least-squares problem with noisy data in the expected-value sense is equivalent to adding an ℓ2 penalty.

EE 127 / EE 227AT

Spring 2016

Homework Assignment #2

Due date: Tue, 2/16/16, 11 AM. Please LaTeX your homework solution and submit the printout.

Exercise 1 (A lower bound on the rank) Let A ∈ S^n_+ be a symmetric, positive semi-definite matrix.

1. Show that the trace, trace A, and the Frobenius norm, ||A||_F, depend only on the eigenvalues of A, and express both in terms of the vector of eigenvalues.

2. Show that

   (trace A)^2 ≤ rank(A) ||A||_F^2.

3. Identify classes of matrices for which the corresponding lower bound on the rank is attained.
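
The identities in part 1 and the inequality in part 2 lend themselves to a quick numerical sanity check. The NumPy sketch below is purely illustrative; the test matrix, its size, and its rank are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)

# Random symmetric PSD matrix of rank r: A = B B^T with B of size n x r.
n, r = 8, 3
B = rng.standard_normal((n, r))
A = B @ B.T

eigvals = np.linalg.eigvalsh(A)     # eigenvalues of the symmetric matrix A
trace_A = np.trace(A)
fro_A = np.linalg.norm(A, 'fro')
rank_A = np.linalg.matrix_rank(A)

# Part 1: trace and Frobenius norm as functions of the eigenvalue vector.
print(np.isclose(trace_A, eigvals.sum()))
print(np.isclose(fro_A**2, (eigvals**2).sum()))

# Part 2: (trace A)^2 <= rank(A) * ||A||_F^2.
print(trace_A**2 <= rank_A * fro_A**2 + 1e-9)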

Exercise 2 (Geometry of least-squares) Consider a least-squares problem

p* = min_x ||Ax − y||_2,

where A ∈ R^{m,n}, y ∈ R^m. We assume that y ∉ R(A), so that p* > 0. Show that, at the optimum, the residual vector r = y − Ax is such that r^T y > 0 and A^T r = 0. Interpret the result geometrically. Hint: use the SVD of A. You can assume that m ≥ n and that A has full column rank.
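
The claimed properties of the optimal residual can be observed numerically before being proved. This is only a sketch: A is an arbitrary random matrix (full column rank with probability one) and y is random, which generically places it outside R(A) when m > n.

import numpy as np

rng = np.random.default_rng(1)

m, n = 10, 4
A = rng.standard_normal((m, n))      # full column rank with probability one
y = rng.standard_normal(m)           # generically not in R(A), since m > n

x_star, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares solution
r = y - A @ x_star                               # residual at the optimum

print(np.allclose(A.T @ r, 0))       # A^T r = 0
print(float(r @ y) > 0)              # r^T y > 0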

Exercise 3 (Singular values of augmented matrix) Let A ∈ R^{n,m}, with n ≥ m, have singular values σ_1, . . . , σ_m.

1. Show that the singular values of the (n + m) × m matrix

   Ã = [  A  ]
       [ I_m ]

   are σ̃_i = sqrt(1 + σ_i^2), i = 1, . . . , m.

2. Find an SVD of the matrix Ã.
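
The relation in part 1 is easy to check numerically; the sketch below uses an arbitrary random A and NumPy's SVD routine.

import numpy as np

rng = np.random.default_rng(2)

n, m = 7, 4
A = rng.standard_normal((n, m))

sigma = np.linalg.svd(A, compute_uv=False)    # singular values of A
A_tilde = np.vstack([A, np.eye(m)])           # the (n + m) x m matrix obtained by stacking A on I_m
sigma_tilde = np.linalg.svd(A_tilde, compute_uv=False)

print(np.allclose(sigma_tilde, np.sqrt(1 + sigma**2)))   # matches the claimed formula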

Exercise 4 (Some induced matrix norms) Let A ∈ R^{m×n} and q ∈ [1, ∞]. The vector norm x ↦ ||x||_q induces the matrix norm

||A||_q := sup_{x : ||x||_q = 1} ||Ax||_q.

Give expressions for the matrix norms induced by the vector norms (1) x ↦ ||x||_1 and (2) x ↦ ||x||_∞. Hint: your answers should be in terms of A_ij, the entries of A.
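
For intuition about the definition, the induced norm can be approximated from below by maximizing ||Ax||_q over many randomly drawn vectors normalized so that ||x||_q = 1. The sketch below does this and compares the resulting lower bound with NumPy's built-in induced norm; the matrix size and sample count are arbitrary choices, and the random search is not meant to be tight.

import numpy as np

rng = np.random.default_rng(3)

m, n = 5, 6
A = rng.standard_normal((m, n))

def sampled_induced_norm(A, q, samples=200_000):
    # Crude lower bound on sup_{||x||_q = 1} ||Ax||_q via random search.
    X = rng.standard_normal((samples, A.shape[1]))
    X /= np.linalg.norm(X, ord=q, axis=1, keepdims=True)   # normalize each row so ||x||_q = 1
    return np.linalg.norm(X @ A.T, ord=q, axis=1).max()

for q in (1, np.inf):
    lower_bound = sampled_induced_norm(A, q)
    induced = np.linalg.norm(A, ord=q)      # NumPy's induced matrix q-norm for q = 1, inf
    print(q, lower_bound, induced, lower_bound <= induced + 1e-9)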

Exercise 5 (Regularization for noisy data) Consider the least-squares problem

min_x ||Ax − y||_2^2

in which the data matrix A ∈ R^{m,n} is noisy. Our specific noise model assumes that each row a_i^T ∈ R^n has the form a_i = â_i + u_i, where the noise vector u_i ∈ R^n has zero mean and covariance matrix σ^2 I_n, with σ a measure of the size of the noise. The matrix A is therefore a function of the uncertain vector u, which we denote by A(u). We write Â for the matrix with rows â_i^T, i = 1, . . . , m. We replace the original problem with

min_x E_u{||A(u)x − y||_2^2},

where E_u denotes the expected value with respect to the random variable u. Show that this problem can be written as

min_x ||Âx − y||_2^2 + λ||x||_2^2,

where λ ≥ 0 is some regularization parameter, which you will determine. That is, ℓ_2-regularization can be used as a way to take into account uncertainties in the matrix A, in the expected-value sense. Hint: compute the expected value of ((â_i + u_i)^T x − y_i)^2 for a specific row index i.
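
The expectation suggested by the hint can be checked by Monte Carlo simulation. In the sketch below, λ is set to m σ^2, a candidate value stated only to make the comparison concrete (deriving the correct λ is the point of the exercise); the matrix sizes, noise level σ, and number of trials are arbitrary.

import numpy as np

rng = np.random.default_rng(4)

m, n, sigma = 20, 5, 0.3
A_hat = rng.standard_normal((m, n))   # noise-free data matrix
y = rng.standard_normal(m)
x = rng.standard_normal(n)            # any fixed x; the comparison is made pointwise in x

trials = 100_000
vals = np.empty(trials)
for t in range(trials):
    U = sigma * rng.standard_normal((m, n))       # rows u_i: zero mean, covariance sigma^2 I_n
    vals[t] = np.sum(((A_hat + U) @ x - y) ** 2)  # ||A(u) x - y||_2^2 for one noise draw

lam = m * sigma**2                                # candidate regularization parameter (an assumption)
penalized = np.sum((A_hat @ x - y) ** 2) + lam * np.sum(x**2)

print(vals.mean(), penalized)   # should agree up to Monte Carlo error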
