Optimization Models: Exercises 2
Spring 2016
Homework Assignment #2
Due date: Tue, 2/16/16, 11AM. Please typeset your homework solution in LaTeX and submit the printout.
Exercise 1 (A lower bound on the rank) Let $A \in \mathbb{S}^n_+$ be a symmetric, positive semidefinite matrix.
1. Show that the trace, $\operatorname{trace} A$, and the Frobenius norm, $\|A\|_F$, depend only on the eigenvalues of $A$, and express both in terms of the vector of eigenvalues.
2. Show that
$$(\operatorname{trace} A)^2 \le \operatorname{rank}(A)\,\|A\|_F^2.$$
3. Identify classes of matrices for which the corresponding lower bound on the rank is attained.
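The inequality in part 2 can be sanity-checked numerically before proving it. The sketch below (the random PSD matrix, its dimensions, and the scaled identity are arbitrary choices, not part of the exercise) verifies the bound on a random instance and exhibits one matrix for which it holds with equality:

```python
import numpy as np

# Sanity check (not a proof) of (trace A)^2 <= rank(A) * ||A||_F^2
# for a randomly generated symmetric PSD matrix A = B B^T.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 3))
A = B @ B.T                      # PSD; rank 3 almost surely

lhs = np.trace(A) ** 2
rhs = np.linalg.matrix_rank(A) * np.linalg.norm(A, "fro") ** 2
print(lhs <= rhs)                # the bound holds

# One equality instance: a scaled identity, trace^2 = n^2 c^2 = n * (n c^2).
C = 2.0 * np.eye(4)
print(np.isclose(np.trace(C) ** 2,
                 np.linalg.matrix_rank(C) * np.linalg.norm(C, "fro") ** 2))
```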
Consider the least-squares problem
$$p^* = \min_x \|y - Ax\|_2,$$
where $A \in \mathbb{R}^{m,n}$, $y \in \mathbb{R}^m$. We assume that $y \notin \mathcal{R}(A)$, so that $p^* > 0$. Show that, at optimum, the residual vector $r = y - Ax$ satisfies $r^\top y > 0$ and $A^\top r = 0$. Interpret the result geometrically. Hint: use the SVD of $A$. You may assume that $m \ge n$ and that $A$ has full column rank.
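Assuming the problem is the least-squares problem $\min_x \|y - Ax\|_2$, the two optimality properties can be observed numerically (the dimensions and random data below are arbitrary choices):

```python
import numpy as np

# At the least-squares optimum, the residual r = y - A x satisfies
# A^T r = 0 (orthogonality to range(A)) and r^T y > 0 when y is not in range(A).
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3))          # m >= n; full column rank almost surely
y = rng.standard_normal(6)               # almost surely not in range(A)

x, *_ = np.linalg.lstsq(A, y, rcond=None)
r = y - A @ x

print(np.allclose(A.T @ r, 0))           # residual is orthogonal to the columns of A
print(r @ y > 0)                         # since r^T y = r^T (A x + r) = ||r||^2 > 0
```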
Exercise 4 (Some induced matrix norms) Let $A \in \mathbb{R}^{m \times n}$, $q \in [1, \infty]$. The vector norm $x \mapsto \|x\|_q$ induces the matrix norm
$$\|A\|_q = \max_{x \neq 0} \frac{\|Ax\|_q}{\|x\|_q}.$$
Give expressions for the matrix norms induced by the vector norms 1. $x \mapsto \|x\|_1$ and 2. $x \mapsto \|x\|_\infty$. Hint: your answers should be in terms of $A_{ij}$, the entries of $A$.
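A brute-force check of the induced-norm definition, without using the closed-form answers: sample many random directions $x$ and compare the ratio $\|Ax\|_q / \|x\|_q$ against NumPy's built-in induced norms (the matrix and the sample count are arbitrary choices):

```python
import numpy as np

# Every sampled ratio ||A x||_q / ||x||_q is bounded above by the induced norm
# ||A||_q, which NumPy computes via np.linalg.norm(A, ord=q) for q = 1, inf.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 4))

for q in (1, np.inf):
    X = rng.standard_normal((4, 20000))   # 20000 random nonzero directions
    ratios = (np.linalg.norm(A @ X, ord=q, axis=0)
              / np.linalg.norm(X, ord=q, axis=0))
    print(q, ratios.max() <= np.linalg.norm(A, ord=q) + 1e-9)
```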
Consider a least-squares problem
$$\min_x \|Ax - y\|_2^2,$$
in which the data matrix $A \in \mathbb{R}^{m,n}$ is noisy. Our specific noise model assumes that each row $a_i^\top \in \mathbb{R}^n$ has the form $a_i = \hat a_i + u_i$, where the noise vector $u_i \in \mathbb{R}^n$ has zero mean and covariance matrix $\sigma^2 I_n$, with $\sigma$ a measure of the size of the noise. Therefore, the matrix $A$ is now a function of the uncertain vector $u$, which we denote by $A(u)$. We write $\hat A$ for the matrix with rows $\hat a_i^\top$, $i = 1, \dots, m$. We replace the original problem with
$$\min_x \; \mathbb{E}_u\left[\|A(u)x - y\|_2^2\right],$$
where $\mathbb{E}_u$ denotes the expected value with respect to the random variable $u$. Show that this problem can be written as
$$\min_x \; \|\hat A x - y\|_2^2 + \lambda \|x\|_2^2,$$
where $\lambda \ge 0$ is some regularization parameter, which you will determine. That is, $\ell_2$ regularization can be used as a way to take into account uncertainties in the matrix $A$, in the expected-value sense. Hint: compute the expected value of $((\hat a_i + u_i)^\top x - y_i)^2$ for a specific row index $i$.
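The hint can also be explored empirically. In the Monte Carlo sketch below (the sizes $m$, $n$, the noise level $\sigma$, the test point $x$, and the trial count are all arbitrary choices), the estimated expectation minus the nominal residual turns out to be proportional to $\|x\|_2^2$; the constant of proportionality is the $\lambda$ you are asked to determine:

```python
import numpy as np

# Monte Carlo estimate of E_u ||(A_hat + U) x - y||^2 for row-wise noise with
# zero mean and covariance sigma^2 I_n, compared with the noiseless residual.
rng = np.random.default_rng(3)
m, n, sigma = 50, 4, 0.3
A_hat = rng.standard_normal((m, n))
y = rng.standard_normal(m)
x = rng.standard_normal(n)                        # a fixed test point

trials = 20000
U = sigma * rng.standard_normal((trials, m, n))   # one noise matrix per trial
residuals = (A_hat + U) @ x - y                   # shape: (trials, m)
mc_expectation = np.mean(np.sum(residuals ** 2, axis=1))

nominal = np.linalg.norm(A_hat @ x - y) ** 2
lam_estimate = (mc_expectation - nominal) / np.linalg.norm(x) ** 2
print(lam_estimate)                               # compare with your derived lambda
```

Repeating the experiment with a different $x$ gives roughly the same `lam_estimate`, which is the empirical signature of the $\lambda \|x\|_2^2$ penalty term.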