Linear and Nonlinear Programming
consist of the vectors ei. Then any vector y in M can be written as y = Ez for some
z ∈ E^{n−m} and, of course, LEz represents the action of L on such a vector. To project
this result back into M and express the result in terms of the basis e1, e2, ..., e_{n−m},
we merely multiply by E^T. Thus E^T LEz is the vector whose components give the
representation in terms of the basis; and, correspondingly, the (n − m) × (n − m)
matrix E^T LE is the matrix representation of L restricted to M.
The eigenvalues of L restricted to M can be found by determining the eigen-
values of E^T LE. These eigenvalues are independent of the particular orthonormal
basis E.
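As a numerical sketch of this construction (the matrices L and h below are illustrative, not taken from the text), one can build an orthonormal basis E for M = {y : hy = 0} from the SVD of h and check that the eigenvalues of E^T LE do not depend on which orthonormal basis is chosen:

```python
import numpy as np

# Illustrative data, not from the text: n = 3, m = 1, so dim M = 2.
L = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])     # a symmetric matrix on E^3
h = np.array([[1.0, 1.0, 1.0]])     # M = {y : h y = 0}

# Columns of E form an orthonormal basis e1, ..., e_{n-m} of M = null(h);
# the last n - m right singular vectors of h span the null space.
_, _, Vt = np.linalg.svd(h)
E = Vt[1:].T                        # 3 x 2, with E^T E = I

LM = E.T @ L @ E                    # matrix representation of L restricted to M
eigs = np.linalg.eigvalsh(LM)

# Basis independence: replacing E by EQ (Q orthogonal) gives Q^T (E^T L E) Q,
# a similar matrix, so the eigenvalues are unchanged.
Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((2, 2)))
eigs2 = np.linalg.eigvalsh((E @ Q).T @ L @ (E @ Q))
assert np.allclose(eigs, eigs2)
```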
Example 1. In the last section we considered a problem whose first-order necessary
conditions are

1 + λx1 = 0
2x2 + x3 + λx2 = 0
x2 + 4x3 + λx3 = 0.

These are satisfied by x1 = 1, x2 = x3 = 0, λ = −1, and at this point

    ⎡ −1 0 0 ⎤
L = ⎢  0 1 1 ⎥
    ⎣  0 1 3 ⎦

with

M = {y : y1 = 0}.
In this case M is the subspace spanned by the second two basis vectors in E^3 and
hence the restriction of L to M can be found by taking the corresponding submatrix
of L. Thus, in this case,
E^T LE = ⎡ 1 1 ⎤
         ⎣ 1 3 ⎦
The eigenvalues of L_M are thus λ = 2 ± √2, and L_M is positive definite.
Since L_M is positive definite, we conclude that the point found is a
relative minimum point. This example illustrates that, in general, the restriction of
L to M can be thought of as a submatrix of L, although it can be read directly from
the original matrix only if the subspace M is spanned by a subset of the original
basis vectors.
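The arithmetic of the example can be checked numerically; this sketch uses only the 2 × 2 restricted matrix E^T LE with rows (1, 1) and (1, 3) from the example:

```python
import numpy as np

# The restricted matrix E^T L E from Example 1.
ETLE = np.array([[1.0, 1.0],
                 [1.0, 3.0]])

eigs = np.linalg.eigvalsh(ETLE)       # eigenvalues in ascending order
expected = np.array([2.0 - np.sqrt(2.0), 2.0 + np.sqrt(2.0)])
assert np.allclose(eigs, expected)    # λ = 2 ± √2
assert np.all(eigs > 0)               # L_M is positive definite
```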
Bordered Hessians
The above approach for determining the eigenvalues of L projected onto M is quite
direct and relatively simple. There is another approach, however, that is useful
in some theoretical arguments and convenient for simple applications. It is based
on constructing matrices and determinants of order n + m rather than n − m, so
the dimension is increased rather than reduced.
Let us first characterize all vectors orthogonal to M. M itself is the set of all x
satisfying hx = 0. A vector z is orthogonal to M if z^T x = 0 for all x ∈ M. It is not
hard to show that z is orthogonal to M if and only if z = h^T w for some w ∈ E^m.
The proof that this is sufficient follows from the calculation z^T x = w^T hx = 0.
The proof of necessity follows from the Duality Theorem of Linear Programming
(see Exercise 6).
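Both directions of this characterization can be checked numerically (a sketch with randomly generated data; h is assumed, as is almost surely the case, to have full rank m):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 2
h = rng.standard_normal((m, n))       # illustrative constraint matrix, rank m

# Sufficiency: z = h^T w is orthogonal to every x in M = null(h).
w = rng.standard_normal(m)
z = h.T @ w
_, _, Vt = np.linalg.svd(h)
N = Vt[m:].T                          # orthonormal basis of M
assert np.allclose(N.T @ z, 0)        # z ⊥ M

# Necessity: any z2 orthogonal to M lies in range(h^T).  Build such a z2 by
# stripping the M-component from an arbitrary vector, then check that the
# orthogonal projector onto range(h^T) leaves it fixed.
v = rng.standard_normal(n)
z2 = v - N @ (N.T @ v)                # z2 ⊥ M by construction
P = h.T @ np.linalg.solve(h @ h.T, h) # projector onto range(h^T)
assert np.allclose(P @ z2, z2)        # so z2 = h^T w for some w
```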
Now we may explicitly characterize an eigenvector of L_M. The vector x is
such an eigenvector if it satisfies these two conditions: (1) x belongs to M, and (2)
Lx = λx + z, where z is orthogonal to M. These conditions are equivalent, in view
of the characterization of z, to

hx = 0
Lx = λx + h^T w.
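These two conditions can be verified numerically for each eigenvector of L_M: with x = Ez for an eigenvector z of E^T LE, the residual Lx − λx must lie in the range of h^T (a sketch with random illustrative data, h assumed full rank):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 1
A = rng.standard_normal((n, n))
L = (A + A.T) / 2                  # illustrative symmetric L
h = rng.standard_normal((m, n))    # illustrative constraint matrix, rank m

_, _, Vt = np.linalg.svd(h)
E = Vt[m:].T                       # orthonormal basis of M = null(h)
lams, Z = np.linalg.eigh(E.T @ L @ E)

for lam, zvec in zip(lams, Z.T):
    x = E @ zvec                   # eigenvector of L_M, expressed in E^n
    assert np.allclose(h @ x, 0)   # condition (1): x belongs to M
    r = L @ x - lam * x            # must be orthogonal to M, i.e. = h^T w
    w, *_ = np.linalg.lstsq(h.T, r, rcond=None)
    assert np.allclose(h.T @ w, r) # condition (2): Lx = λx + h^T w
```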