5.3 ORTHOGONAL TRANSFORMATIONS AND ORTHOGONAL MATRICES
Definition 5.3.1 Orthogonal transformations and orthogonal matrices
A linear transformation $T$ from $\mathbb{R}^n$ to $\mathbb{R}^n$ is called orthogonal if it preserves the length of vectors:
$$\|T(\vec{x})\| = \|\vec{x}\|, \quad \text{for all } \vec{x} \text{ in } \mathbb{R}^n.$$
If $T(\vec{x}) = A\vec{x}$ is an orthogonal transformation, we say that $A$ is an orthogonal matrix.
EXAMPLE 1 The rotation
$$T(\vec{x}) = \begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{bmatrix} \vec{x}$$
is an orthogonal transformation, since a rotation preserves the length of every vector.
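A quick numerical sketch of Example 1 (using NumPy; the angle $\varphi = 0.7$ and the test vector are arbitrary illustrative choices):

# Check numerically that a rotation matrix preserves the length of a vector.
import numpy as np

phi = 0.7                                  # arbitrary rotation angle
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])

x = np.array([3.0, -4.0])                  # arbitrary test vector
print(np.linalg.norm(A @ x))               # 5.0 (up to rounding) ...
print(np.linalg.norm(x))                   # ... equal to the length of x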
EXAMPLE 2 Reflection
Consider a subspace $V$ of $\mathbb{R}^n$. For a vector $\vec{x}$ in $\mathbb{R}^n$, the vector $R(\vec{x}) = 2\,\mathrm{proj}_V\vec{x} - \vec{x}$ is called the reflection of $\vec{x}$ in $V$ (see Figure 1). Show that reflections are orthogonal transformations.
Solution
We can write
$$R(\vec{x}) = \mathrm{proj}_V\vec{x} + (\mathrm{proj}_V\vec{x} - \vec{x})$$
and
$$\vec{x} = \mathrm{proj}_V\vec{x} + (\vec{x} - \mathrm{proj}_V\vec{x}).$$
By the Pythagorean theorem, we have
$$\|R(\vec{x})\|^2 = \|\mathrm{proj}_V\vec{x}\|^2 + \|\mathrm{proj}_V\vec{x} - \vec{x}\|^2 = \|\mathrm{proj}_V\vec{x}\|^2 + \|\vec{x} - \mathrm{proj}_V\vec{x}\|^2 = \|\vec{x}\|^2.$$
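The computation in Example 2 can be checked numerically. The sketch below assumes an orthonormal basis of $V$ stored in the columns of a matrix Q, so that $\mathrm{proj}_V\vec{x} = Q(Q^T\vec{x})$ (this anticipates Fact 5.3.10 below); the subspace and the vector are arbitrary illustrative choices.

# Reflect x in a subspace V of R^3 and check that the length is preserved.
import numpy as np

Q = np.array([[1.0, 0.0],                  # orthonormal basis of V (here the
              [0.0, 1.0],                  # xy-plane), stored as columns
              [0.0, 0.0]])

x = np.array([2.0, -1.0, 5.0])
proj_V_x = Q @ (Q.T @ x)                   # orthogonal projection of x onto V
R_x = 2 * proj_V_x - x                     # reflection of x in V

print(np.linalg.norm(R_x), np.linalg.norm(x))   # both equal sqrt(30)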
Fact 5.3.2 Orthogonal transformations preserve orthogonality
Consider an orthogonal transformation $T$ from $\mathbb{R}^n$ to $\mathbb{R}^n$. If the vectors $\vec{v}$ and $\vec{w}$ in $\mathbb{R}^n$ are orthogonal, then so are $T(\vec{v})$ and $T(\vec{w})$.
Proof
By the theorem of Pythagoras, we have to show that
$$\|T(\vec{v}) + T(\vec{w})\|^2 = \|T(\vec{v})\|^2 + \|T(\vec{w})\|^2.$$
Let's see:
$$\|T(\vec{v}) + T(\vec{w})\|^2 = \|T(\vec{v} + \vec{w})\|^2 \quad (T \text{ is linear})$$
$$= \|\vec{v} + \vec{w}\|^2 \quad (T \text{ is orthogonal})$$
$$= \|\vec{v}\|^2 + \|\vec{w}\|^2 \quad (\vec{v} \text{ and } \vec{w} \text{ are orthogonal})$$
$$= \|T(\vec{v})\|^2 + \|T(\vec{w})\|^2 \quad (T \text{ is orthogonal}).$$
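A numerical illustration of Fact 5.3.2 (the rotation angle and the orthogonal pair $\vec{v}$, $\vec{w}$ are arbitrary choices):

# An orthogonal transformation maps orthogonal vectors to orthogonal vectors.
import numpy as np

phi = 1.1
A = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])   # an orthogonal matrix

v = np.array([1.0, 2.0])
w = np.array([-2.0, 1.0])                     # v . w = 0

print(np.dot(v, w))                           # 0.0
print(np.dot(A @ v, A @ w))                   # 0.0 (up to rounding)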
Fact 5.3.3
a. A linear transformation $T$ from $\mathbb{R}^n$ to $\mathbb{R}^n$ is orthogonal if and only if the vectors $T(\vec{e}_1), T(\vec{e}_2), \ldots, T(\vec{e}_n)$ form an orthonormal basis of $\mathbb{R}^n$.
b. An $n \times n$ matrix $A$ is orthogonal if and only if its columns form an orthonormal basis of $\mathbb{R}^n$.
Proof Part (a):
$\Rightarrow$ If $T$ is orthogonal, then, by definition, the $T(\vec{e}_i)$ are unit vectors, and by Fact 5.3.2, since $\vec{e}_1, \vec{e}_2, \ldots, \vec{e}_n$ are orthogonal, $T(\vec{e}_1), T(\vec{e}_2), \ldots, T(\vec{e}_n)$ are orthogonal.
$\Leftarrow$ Conversely, suppose that $T(\vec{e}_1), T(\vec{e}_2), \ldots, T(\vec{e}_n)$ form an orthonormal basis of $\mathbb{R}^n$. Consider a vector
$$\vec{x} = x_1\vec{e}_1 + x_2\vec{e}_2 + \cdots + x_n\vec{e}_n$$
in $\mathbb{R}^n$. Then,
$$\|T(\vec{x})\|^2 = \|x_1 T(\vec{e}_1) + x_2 T(\vec{e}_2) + \cdots + x_n T(\vec{e}_n)\|^2 = x_1^2 + x_2^2 + \cdots + x_n^2 = \|\vec{x}\|^2,$$
where the middle step uses the Pythagorean theorem for the orthonormal vectors $T(\vec{e}_i)$. Thus $T$ preserves length and is therefore orthogonal.
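A numerical illustration of Fact 5.3.3: the matrix below has orthonormal columns, so the transformation $\vec{x} \mapsto A\vec{x}$ preserves length (the matrix and the test vector are arbitrary choices):

# A matrix with orthonormal columns preserves the length of vectors.
import numpy as np

A = np.array([[0.0,  0.0, 1.0],
              [1.0,  0.0, 0.0],
              [0.0, -1.0, 0.0]])   # the columns are orthonormal

x = np.array([1.0, 2.0, 2.0])
print(np.linalg.norm(A @ x))       # 3.0
print(np.linalg.norm(x))           # 3.0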
Fact 5.3.4 Products and inverses of orthogonal matrices
a. The product $AB$ of two orthogonal $n \times n$ matrices $A$ and $B$ is orthogonal.
b. The inverse $A^{-1}$ of an orthogonal $n \times n$ matrix $A$ is orthogonal.
Proof
In part (a), the linear transformation $T(\vec{x}) = AB\vec{x}$ preserves length, because
$$\|T(\vec{x})\| = \|A(B\vec{x})\| = \|B\vec{x}\| = \|\vec{x}\|.$$
Figure 4 illustrates property (a).
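A numerical sketch of Fact 5.3.4, checking length preservation for the product and the inverse of rotation matrices (the angles and the test vector are arbitrary choices):

# Products and inverses of orthogonal matrices preserve length.
import numpy as np

def rotation(phi):
    return np.array([[np.cos(phi), -np.sin(phi)],
                     [np.sin(phi),  np.cos(phi)]])

A, B = rotation(0.4), rotation(1.3)
x = np.array([1.0, -2.0])

print(np.linalg.norm(A @ B @ x), np.linalg.norm(x))              # equal: AB is orthogonal
print(np.linalg.norm(np.linalg.inv(A) @ x), np.linalg.norm(x))   # equal: A^(-1) is orthogonal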
The Transpose of a Matrix
EXAMPLE 4 Consider the orthogonal matrix
$$A = \frac{1}{7}\begin{bmatrix} 2 & 6 & 3 \\ 3 & 2 & -6 \\ 6 & -3 & 2 \end{bmatrix},$$
and let $B$ be the matrix whose $ij$th entry is the $ji$th entry of $A$ (the roles of rows and columns are reversed). Compute $BA$.
Solution
$$BA = \frac{1}{49}\begin{bmatrix} 2 & 3 & 6 \\ 6 & 2 & -3 \\ 3 & -6 & 2 \end{bmatrix}\begin{bmatrix} 2 & 6 & 3 \\ 3 & 2 & -6 \\ 6 & -3 & 2 \end{bmatrix} = \frac{1}{49}\begin{bmatrix} 49 & 0 & 0 \\ 0 & 49 & 0 \\ 0 & 0 & 49 \end{bmatrix} = I_3.$$
Since $BA = I_3$, the matrix $B$ is the inverse of $A$.
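The computation above is easy to verify numerically:

# B is obtained from A by reversing rows and columns, and BA = I_3.
import numpy as np

A = np.array([[2.0,  6.0,  3.0],
              [3.0,  2.0, -6.0],
              [6.0, -3.0,  2.0]]) / 7.0
B = A.T                                # rows and columns reversed

print(np.allclose(B @ A, np.eye(3)))   # True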
Definition 5.3.5 The transpose of a matrix;
symmetric and skew-symmetric matrices
Consider an $m \times n$ matrix $A$. The transpose $A^T$ of $A$ is the $n \times m$ matrix whose $ij$th entry is the $ji$th entry of $A$: the roles of rows and columns are reversed.
We say that a square matrix $A$ is symmetric if $A^T = A$, and $A$ is called skew-symmetric if $A^T = -A$.
EXAMPLE 5 If $A = \begin{bmatrix} 1 & 2 & 3 \\ 9 & 7 & 5 \end{bmatrix}$, then $A^T = \begin{bmatrix} 1 & 9 \\ 2 & 7 \\ 3 & 5 \end{bmatrix}$.
EXAMPLE 6 The symmetric $2 \times 2$ matrices are those of the form $A = \begin{bmatrix} a & b \\ b & c \end{bmatrix}$, for example, $A = \begin{bmatrix} 1 & 2 \\ 2 & 3 \end{bmatrix}$.
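A small NumPy illustration of Definition 5.3.5 and Examples 5 and 6:

# Transposing a matrix, and testing a matrix for symmetry.
import numpy as np

A = np.array([[1, 2, 3],
              [9, 7, 5]])
print(A.T)                          # the 3 x 2 transpose from Example 5

S = np.array([[1, 2],
              [2, 3]])
print(np.array_equal(S.T, S))       # True: S is symmetric
print(np.array_equal(S.T, -S))      # False: S is not skew-symmetric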
Note that the transpose of a (column) vector $\vec{v}$ is a row vector: If
$$\vec{v} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}, \quad \text{then} \quad \vec{v}^T = \begin{bmatrix} 1 & 2 & 3 \end{bmatrix}.$$
Fact 5.3.6
If $\vec{v}$ and $\vec{w}$ are two (column) vectors in $\mathbb{R}^n$, then
$$\vec{v} \cdot \vec{w} = \vec{v}^T \vec{w}.$$
For example,
$$\begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} \cdot \begin{bmatrix} 1 \\ -1 \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 2 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ -1 \\ 1 \end{bmatrix} = 2.$$
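A short NumPy illustration of Fact 5.3.6, treating $\vec{v}$ and $\vec{w}$ as $3 \times 1$ column vectors:

# The dot product v . w equals the matrix product v^T w (a 1 x 1 matrix).
import numpy as np

v = np.array([[1], [2], [3]])             # column vectors of shape (3, 1)
w = np.array([[1], [-1], [1]])

print(np.dot(v.flatten(), w.flatten()))   # 2, the dot product
print((v.T @ w)[0, 0])                    # 2, as a 1 x 1 matrix product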
Fact 5.3.7
Consider an $n \times n$ matrix $A$. The matrix $A$ is orthogonal if (and only if) $A^T A = I_n$ or, equivalently, if $A^{-1} = A^T$.
Proof
To justify this fact, write A in terms of its
columns:
$$A = \begin{bmatrix} | & | & & | \\ \vec{v}_1 & \vec{v}_2 & \cdots & \vec{v}_n \\ | & | & & | \end{bmatrix}.$$
Then,
$$A^T A = \begin{bmatrix} - & \vec{v}_1^{\,T} & - \\ - & \vec{v}_2^{\,T} & - \\ & \vdots & \\ - & \vec{v}_n^{\,T} & - \end{bmatrix} \begin{bmatrix} | & | & & | \\ \vec{v}_1 & \vec{v}_2 & \cdots & \vec{v}_n \\ | & | & & | \end{bmatrix} = \begin{bmatrix} \vec{v}_1 \cdot \vec{v}_1 & \vec{v}_1 \cdot \vec{v}_2 & \cdots & \vec{v}_1 \cdot \vec{v}_n \\ \vec{v}_2 \cdot \vec{v}_1 & \vec{v}_2 \cdot \vec{v}_2 & \cdots & \vec{v}_2 \cdot \vec{v}_n \\ \vdots & \vdots & \ddots & \vdots \\ \vec{v}_n \cdot \vec{v}_1 & \vec{v}_n \cdot \vec{v}_2 & \cdots & \vec{v}_n \cdot \vec{v}_n \end{bmatrix}.$$
This product is $I_n$ if (and only if) $\vec{v}_i \cdot \vec{v}_i = 1$ for all $i$ and $\vec{v}_i \cdot \vec{v}_j = 0$ for $i \neq j$, that is, if (and only if) the columns of $A$ form an orthonormal basis of $\mathbb{R}^n$, meaning that $A$ is orthogonal.
Summary 5.3.8 Orthogonal matrices
Consider an $n \times n$ matrix $A$. Then the following statements are equivalent:
1. $A$ is an orthogonal matrix.
2. The transformation $L(\vec{x}) = A\vec{x}$ preserves length, that is, $\|A\vec{x}\| = \|\vec{x}\|$ for all $\vec{x}$ in $\mathbb{R}^n$.
3. The columns of $A$ form an orthonormal basis of $\mathbb{R}^n$.
4. $A^T A = I_n$.
5. $A^{-1} = A^T$.
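A numerical check of these equivalent conditions for the orthogonal matrix $A$ of Example 4 (the test vector is an arbitrary choice):

# All characterizations of orthogonality hold simultaneously for A.
import numpy as np

A = np.array([[2.0,  6.0,  3.0],
              [3.0,  2.0, -6.0],
              [6.0, -3.0,  2.0]]) / 7.0

x = np.array([1.0, 4.0, -2.0])                     # arbitrary test vector
print(np.isclose(np.linalg.norm(A @ x),
                 np.linalg.norm(x)))               # length is preserved
print(np.allclose(np.linalg.norm(A, axis=0), 1))   # columns are unit vectors
print(np.allclose(A.T @ A, np.eye(3)))             # A^T A = I_3
print(np.allclose(np.linalg.inv(A), A.T))          # A^(-1) = A^T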
Fact 5.3.9 Properties of the transpose
a. If $A$ is an $m \times n$ matrix and $B$ an $n \times p$ matrix, then
$$(AB)^T = B^T A^T.$$
b. If an $n \times n$ matrix $A$ is invertible, then so is $A^T$, and
$$(A^T)^{-1} = (A^{-1})^T.$$
c. For any matrix $A$,
$$\operatorname{rank}(A) = \operatorname{rank}(A^T).$$
Proof
a. Compare entries: the $ij$th entry of $(AB)^T$ is the $ji$th entry of $AB$, which is the dot product of the $j$th row of $A$ and the $i$th column of $B$; the $ij$th entry of $B^T A^T$ is the dot product of the $i$th row of $B^T$ (the $i$th column of $B$) and the $j$th column of $A^T$ (the $j$th row of $A$). The two entries agree.
b. We know that
$$AA^{-1} = I_n.$$
Transposing both sides and using part (a), we find that
$$(AA^{-1})^T = (A^{-1})^T A^T = I_n,$$
so $(A^{-1})^T$ is the inverse of $A^T$.
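A numerical spot-check of Fact 5.3.9 on randomly chosen matrices (the sizes $3 \times 3$ and $3 \times 4$ and the random seed are arbitrary; the square matrix is invertible for this seed):

# (AB)^T = B^T A^T,  (A^T)^(-1) = (A^(-1))^T,  rank(A) = rank(A^T).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 4))

print(np.allclose((A @ B).T, B.T @ A.T))                       # part (a)
print(np.allclose(np.linalg.inv(A.T), np.linalg.inv(A).T))     # part (b)
print(np.linalg.matrix_rank(B) == np.linalg.matrix_rank(B.T))  # part (c)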
Consider a line $L$ in $\mathbb{R}^n$ spanned by a unit vector $\vec{v}_1$. Recall that
$$\mathrm{proj}_L\vec{x} = (\vec{v}_1 \cdot \vec{x})\,\vec{v}_1.$$
Since $\vec{v}_1 \cdot \vec{x}$ is a scalar, we can write
$$\mathrm{proj}_L\vec{x} = \vec{v}_1(\vec{v}_1 \cdot \vec{x}) = \vec{v}_1\vec{v}_1^{\,T}\vec{x} = M\vec{x}, \quad \text{where } M = \vec{v}_1\vec{v}_1^{\,T}.$$
More generally, if $V$ is a subspace of $\mathbb{R}^n$ with orthonormal basis $\vec{v}_1, \ldots, \vec{v}_m$, then
$$\mathrm{proj}_V\vec{x} = \vec{v}_1\vec{v}_1^{\,T}\vec{x} + \cdots + \vec{v}_m\vec{v}_m^{\,T}\vec{x} = (\vec{v}_1\vec{v}_1^{\,T} + \cdots + \vec{v}_m\vec{v}_m^{\,T})\vec{x} = \begin{bmatrix} | & & | \\ \vec{v}_1 & \cdots & \vec{v}_m \\ | & & | \end{bmatrix} \begin{bmatrix} - & \vec{v}_1^{\,T} & - \\ & \vdots & \\ - & \vec{v}_m^{\,T} & - \end{bmatrix} \vec{x}.$$
Fact 5.3.10 The matrix of an orthogonal
projection
Consider a subspace $V$ of $\mathbb{R}^n$ with orthonormal basis $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_m$. The matrix of the orthogonal projection onto $V$ is
$$AA^T, \quad \text{where } A = \begin{bmatrix} | & | & & | \\ \vec{v}_1 & \vec{v}_2 & \cdots & \vec{v}_m \\ | & | & & | \end{bmatrix}.$$
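A numerical sketch of Fact 5.3.10; the orthonormal basis of the (hypothetical) subspace $V$ below is an arbitrary illustrative choice:

# With an orthonormal basis of V in the columns of A, the matrix A A^T
# is the matrix of the orthogonal projection onto V.
import numpy as np

v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v2 = np.array([0.0, 0.0, 1.0])
A = np.column_stack([v1, v2])            # orthonormal basis of V as columns

P = A @ A.T                              # projection matrix onto V
x = np.array([3.0, 1.0, 2.0])
print(P @ x)                             # [2. 2. 2.], the projection of x onto V
print(np.allclose(P @ (P @ x), P @ x))   # True: projecting twice changes nothing

Note that $A^T A = I_m$ here (the columns of $A$ are orthonormal), while $AA^T$ is the projection matrix; the two coincide only when $m = n$.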