3.IV. Matrix Operations
3.IV.1. Sums and Scalar Products
Theorem: The sum of two linear maps is represented by the sum of the representing matrices, and a scalar multiple of a linear map is represented by the scalar multiple of the representing matrix.
Proof: Do it yourself.
Example 1.1: Suppose f, g : R^2 → R^3 are represented with respect to the bases B, D by

F = \operatorname{Rep}_{B,D}(f) = \begin{pmatrix} 1 & 3 \\ 2 & 0 \\ 1 & 0 \end{pmatrix}_{B,D} \qquad G = \operatorname{Rep}_{B,D}(g) = \begin{pmatrix} 0 & 0 \\ 1 & 2 \\ 2 & 4 \end{pmatrix}_{B,D}

Then, for \operatorname{Rep}_B(v) = \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}_B,

\operatorname{Rep}_D(f(v)) = \begin{pmatrix} 1 & 3 \\ 2 & 0 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} v_1 + 3v_2 \\ 2v_1 \\ v_1 \end{pmatrix}_D \qquad \operatorname{Rep}_D(g(v)) = \begin{pmatrix} 0 & 0 \\ 1 & 2 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ v_1 + 2v_2 \\ 2v_1 + 4v_2 \end{pmatrix}_D

\operatorname{Rep}_D((f+g)(v)) = \operatorname{Rep}_D(f(v)) + \operatorname{Rep}_D(g(v)) = \begin{pmatrix} v_1 + 3v_2 \\ 3v_1 + 2v_2 \\ 3v_1 + 4v_2 \end{pmatrix}_D = \begin{pmatrix} 1 & 3 \\ 3 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}_B

while

F + G = \begin{pmatrix} 1 & 3 \\ 2 & 0 \\ 1 & 0 \end{pmatrix} + \begin{pmatrix} 0 & 0 \\ 1 & 2 \\ 2 & 4 \end{pmatrix} = \begin{pmatrix} 1 & 3 \\ 3 & 2 \\ 3 & 4 \end{pmatrix}

⇒ \operatorname{Rep}_{B,D}(f + g) = F + G.
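A minimal numeric check of Example 1.1 in NumPy (the matrices are the F and G above; the test vector is arbitrary):

import numpy as np

F = np.array([[1, 3], [2, 0], [1, 0]])   # representing matrix of f
G = np.array([[0, 0], [1, 2], [2, 4]])   # representing matrix of g
v = np.array([5, -2])                    # an arbitrary Rep_B(v)

# Applying the maps and adding agrees with adding the matrices first.
assert np.array_equal(F @ v + G @ v, (F + G) @ v)
print(F + G)    # [[1 3] [3 2] [3 4]]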
Exercises 3.IV.1
1. The trace of a square matrix is the sum of the entries on the main
diagonal (the 1,1 entry plus the 2,2 entry, etc.; we will see the significance of
the trace in Chapter Five).
Show that trace(H + G) = trace(H) + trace(G).
Is there a similar result for scalar multiplication?
3.IV.2. Matrix Multiplication
Lemma 2.1: A composition of linear maps is linear, i.e.,
if h : V → W and g : W → U are linear maps,
then g ∘ h : V → U is a linear map.
Definition: The matrix-multiplicative product of the m×r matrix G and the r×n matrix H is the m×n matrix P = GH whose i, j entry is

p_{ij} = g_{i1} h_{1j} + g_{i2} h_{2j} + \cdots + g_{ir} h_{rj} = \begin{pmatrix} g_{i1} & g_{i2} & \cdots & g_{ir} \end{pmatrix} \begin{pmatrix} h_{1j} \\ h_{2j} \\ \vdots \\ h_{rj} \end{pmatrix}

(row i of G times column j of H); in particular, column j of P is G times column j of H: P_j = G H_j.
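The entry formula is just a row-by-column dot product; a minimal NumPy sketch (product_entry is an illustrative helper, not from the text):

import numpy as np

def product_entry(G, H, i, j):
    # p_ij = (row i of G) dot (column j of H)
    return G[i, :] @ H[:, j]

G = np.array([[1, 1], [0, 1], [1, 0]])
H = np.array([[4, 6, 8, 2], [5, 7, 9, 3]])
assert product_entry(G, H, 0, 2) == (G @ H)[0, 2]   # both give 17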
Theorem 2.6:
A composition of linear maps is represented by the matrix product of the representatives.
Proof:
Let h : V^n → W^r and g : W^r → U^m be linear maps with representations
\operatorname{Rep}_{B,C}(h) = H and \operatorname{Rep}_{C,D}(g) = G.
Write w = H \operatorname{Rep}_B(v), so that w_k = \sum_{j=1}^{n} h_{kj} v_j. Then the i-th component of G(H \operatorname{Rep}_B(v)) is

\bigl( G(H \operatorname{Rep}_B(v)) \bigr)_i = \sum_{k=1}^{r} g_{ik} w_k = \sum_{k=1}^{r} g_{ik} \sum_{j=1}^{n} h_{kj} v_j = \sum_{j=1}^{n} \Bigl( \sum_{k=1}^{r} g_{ik} h_{kj} \Bigr) v_j = \sum_{j=1}^{n} (GH)_{ij} v_j

⇒ \operatorname{Rep}_D(g ∘ h\,(v)) = (GH) \operatorname{Rep}_B(v) for all v ∈ V.
Examples 2.2, 2.4:
Let h : R^4 → R^2 and g : R^2 → R^3 with bases B ⊂ R^4, C ⊂ R^2, D ⊂ R^3, so that

H = \operatorname{Rep}_{B,C}(h) = \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix}_{B,C} \qquad G = \operatorname{Rep}_{C,D}(g) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix}_{C,D}

⇒ F = \operatorname{Rep}_{B,D}(g ∘ h) = GH = \begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix} = \begin{pmatrix} 9 & 13 & 17 & 5 \\ 5 & 7 & 9 & 3 \\ 4 & 6 & 8 & 2 \end{pmatrix}_{B,D}

Let v_B = \begin{pmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{pmatrix}_B. Then

w_C = H v_B = \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{pmatrix} = \begin{pmatrix} 4v_1 + 6v_2 + 8v_3 + 2v_4 \\ 5v_1 + 7v_2 + 9v_3 + 3v_4 \end{pmatrix}_C

x_D = (GH) v_B = \begin{pmatrix} 9 & 13 & 17 & 5 \\ 5 & 7 & 9 & 3 \\ 4 & 6 & 8 & 2 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{pmatrix} = \begin{pmatrix} 9v_1 + 13v_2 + 17v_3 + 5v_4 \\ 5v_1 + 7v_2 + 9v_3 + 3v_4 \\ 4v_1 + 6v_2 + 8v_3 + 2v_4 \end{pmatrix}_D
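A quick NumPy check that composing the maps matches multiplying the representatives (a sketch using the H and G above; the test vector is arbitrary):

import numpy as np

H = np.array([[4, 6, 8, 2], [5, 7, 9, 3]])   # representing matrix of h
G = np.array([[1, 1], [0, 1], [1, 0]])       # representing matrix of g
v = np.array([1, 2, 3, 4])                   # an arbitrary Rep_B(v)

# g(h(v)) computed in two stages equals (GH) applied to v in one stage.
assert np.array_equal(G @ (H @ v), (G @ H) @ v)
print(G @ H)    # [[ 9 13 17  5] [ 5  7  9  3] [ 4  6  8  2]]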
Matrix multiplication is not commutative:

\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} = \begin{pmatrix} 19 & 22 \\ 43 & 50 \end{pmatrix} \qquad \text{but} \qquad \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 23 & 34 \\ 31 & 46 \end{pmatrix}

Example 2.10:

\begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \begin{pmatrix} 1 & 2 & 0 \\ 3 & 4 & 0 \end{pmatrix} = \begin{pmatrix} 23 & 34 & 0 \\ 31 & 46 & 0 \end{pmatrix} \qquad \text{but} \qquad \begin{pmatrix} 1 & 2 & 0 \\ 3 & 4 & 0 \end{pmatrix} \begin{pmatrix} 5 & 6 \\ 7 & 8 \end{pmatrix} \text{ is not defined.}
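These products are easy to confirm numerically; a minimal NumPy sketch:

import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
print(A @ B)    # [[19 22] [43 50]]
print(B @ A)    # [[23 34] [31 46]]  -- AB != BA in general

C = np.array([[1, 2, 0], [3, 4, 0]])
print(B @ C)    # defined: a (2x2)(2x3) product gives a 2x3 result
# C @ B raises ValueError: a (2x3)(2x2) product is not defined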
Theorem 2.12:
Matrix products, if defined, are associative
(FG)H = F(GH)
and distributive over matrix addition
F(G + H) = FG + FH and (G + H)F = GF + HF.
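A numeric illustration of Theorem 2.12 (a sketch with random integer matrices, so the equalities are exact):

import numpy as np

rng = np.random.default_rng(0)
F = rng.integers(-5, 5, (3, 3))
G = rng.integers(-5, 5, (3, 3))
H = rng.integers(-5, 5, (3, 3))

assert np.array_equal((F @ G) @ H, F @ (G @ H))      # associativity
assert np.array_equal(F @ (G + H), F @ G + F @ H)    # left distributivity
assert np.array_equal((G + H) @ F, G @ F + H @ F)    # right distributivity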
3.IV.3. Mechanics of Matrix Multiplication

Lemma 3.7: ( Row i of A ) B = ( Row i of AB ) and A ( Column j of B ) = ( Column j of AB ).

For instance, in the product of Example 2.4, row 2 of the result comes from row 2 of the left factor:

\begin{pmatrix} 0 & 1 \end{pmatrix} \begin{pmatrix} 4 & 6 & 8 & 2 \\ 5 & 7 & 9 & 3 \end{pmatrix} = \begin{pmatrix} 5 & 7 & 9 & 3 \end{pmatrix} = \text{row 2 of } \begin{pmatrix} 9 & 13 & 17 & 5 \\ 5 & 7 & 9 & 3 \\ 4 & 6 & 8 & 2 \end{pmatrix}

and column 1 of the result comes from column 1 of the right factor:

\begin{pmatrix} 1 & 1 \\ 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 4 \\ 5 \end{pmatrix} = \begin{pmatrix} 9 \\ 5 \\ 4 \end{pmatrix} = \text{column 1 of } \begin{pmatrix} 9 & 13 & 17 & 5 \\ 5 & 7 & 9 & 3 \\ 4 & 6 & 8 & 2 \end{pmatrix}
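Lemma 3.7 in NumPy (a sketch with the matrices of Example 2.4):

import numpy as np

A = np.array([[1, 1], [0, 1], [1, 0]])
B = np.array([[4, 6, 8, 2], [5, 7, 9, 3]])

# (Row i of A) B = (Row i of AB), here with i = 2 (index 1)
assert np.array_equal(A[1, :] @ B, (A @ B)[1, :])
# A (Column j of B) = (Column j of AB), here with j = 1 (index 0)
assert np.array_equal(A @ B[:, 0], (A @ B)[:, 0])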
Definition 3.2: Unit Matrix
A matrix with all zeroes except for a one in the i, j entry is an i, j unit matrix.

Acting from the left, an i, j unit matrix copies row j of the other matrix into row i of the result; an entry other than one also rescales the copied row:

\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 8 & 9 & 4 \\ 5 & 6 & 7 \end{pmatrix} = \begin{pmatrix} 5 & 6 & 7 \\ 0 & 0 & 0 \end{pmatrix} \qquad \begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 8 & 9 & 4 \\ 5 & 6 & 7 \end{pmatrix} = \begin{pmatrix} 2 \cdot 8 & 2 \cdot 9 & 2 \cdot 4 \\ 0 & 0 & 0 \end{pmatrix}

Acting from the right, an i, j unit matrix copies column i into column j:

\begin{pmatrix} 5 & 6 & 7 \\ 8 & 9 & 4 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 5 \\ 0 & 8 \end{pmatrix} \qquad \begin{pmatrix} 5 & 6 & 7 \\ 8 & 9 & 4 \end{pmatrix} \begin{pmatrix} 0 & 2 \\ 0 & 0 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 2 \cdot 5 \\ 0 & 2 \cdot 8 \end{pmatrix}

Sums of unit matrices act in combination:

\begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 8 & 9 & 4 \\ 5 & 6 & 7 \end{pmatrix} = \begin{pmatrix} 8+5 & 9+6 & 4+7 \\ 0 & 0 & 0 \end{pmatrix} \qquad \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 8 & 9 & 4 \\ 5 & 6 & 7 \end{pmatrix} = \begin{pmatrix} 8 & 9 & 4 \\ 5 & 6 & 7 \\ 0 & 0 & 0 \end{pmatrix}
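Unit-matrix mechanics in NumPy (a sketch; U copies a row from the left, V copies a column from the right):

import numpy as np

M = np.array([[8, 9, 4], [5, 6, 7]])

U = np.array([[0, 1], [0, 0]])             # 1,2 unit matrix, 2x2
print(U @ M)    # [[5 6 7] [0 0 0]]  -- row 2 copied into row 1

V = np.array([[0, 1], [0, 0], [0, 0]])     # 1,2 unit matrix, 3x2
print(M @ V)    # [[0 8] [0 5]]      -- column 1 copied into column 2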
Definition 3.8: Main Diagonal
The main diagonal (or principal diagonal or diagonal) of a square matrix
goes from the upper left to the lower right.
Definition 3.9: Identity Matrix
The n×n identity matrix has ones on the main diagonal and zeroes elsewhere, (I_{n \times n})_{ij} = \delta_{ij}, e.g.,

I_{3 \times 3} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}

A diagonal matrix A_{n \times n} = \operatorname{diag}(a_{11}, a_{22}, \ldots, a_{nn}) has (A_{n \times n})_{ij} = a_{ii} \delta_{ij}:

A_{n \times n} = \begin{pmatrix} a_{11} & 0 & \cdots & 0 \\ 0 & a_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & a_{nn} \end{pmatrix}
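In NumPy, np.eye and np.diag build these matrices directly (a minimal sketch):

import numpy as np

I3 = np.eye(3, dtype=int)     # 3x3 identity
D = np.diag([2, 5, 7])        # diag(2, 5, 7)
A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])

assert np.array_equal(I3 @ A, A)   # the identity changes nothing
print(D @ A)    # row i of A scaled by the i-th diagonal entry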
Definition 3.14: Permutation Matrix
A permutation matrix is square and is all zeros except for a single one in each
row and column.
From the left (right) these matrices permute rows (columns).
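For instance, from the left a permutation matrix shuffles rows, and from the right it shuffles columns (a minimal NumPy sketch):

import numpy as np

P = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]])   # a permutation matrix
A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])

print(P @ A)    # rows permuted: [[4 5 6] [7 8 9] [1 2 3]]
print(A @ P)    # columns permuted instead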
The elementary reduction matrices result from applying a single Gaussian row operation to an identity matrix:

(1) I \xrightarrow{k \rho_i} M_i(k) for k ≠ 0, e.g., M_2(3) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 3 & 0 \\ 0 & 0 & 1 \end{pmatrix}

(2) I \xrightarrow{\rho_i \leftrightarrow \rho_j} P_{i,j} for i ≠ j, e.g., P_{2,3} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}

(3) I \xrightarrow{k \rho_i + \rho_j} C_{i,j}(k) for i ≠ j, e.g., C_{2,3}(3) = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 3 & 1 \end{pmatrix}

Multiplying from the left by an elementary reduction matrix performs the corresponding row operation:

C_{2,3}(3) A = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 3 & 1 \end{pmatrix} \begin{pmatrix} a & b \\ c & d \\ e & f \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \\ 3c + e & 3d + f \end{pmatrix}
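The worked product above, checked numerically (a sketch; the letters a..f replaced with a concrete matrix):

import numpy as np

C23 = np.array([[1, 0, 0], [0, 1, 0], [0, 3, 1]])   # C_{2,3}(3)
A = np.array([[1, 2], [3, 4], [5, 6]])              # a=1, b=2, c=3, d=4, e=5, f=6

print(C23 @ A)   # [[1 2] [3 4] [14 18]] since 3c+e = 14 and 3d+f = 18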
Gauss's method and Gauss-Jordan reduction can therefore be accomplished by multiplying on the left by a single matrix, the product of elementary reduction matrices.
Corollary 3.22:
For any matrix H there are elementary reduction matrices R_1, \ldots, R_r
such that R_r R_{r-1} \cdots R_1 H is in reduced echelon form.
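A small sketch of Corollary 3.22: the three elementary matrices below carry H to reduced echelon form, and their product is the single reducing matrix R (the specific H and steps are illustrative):

import numpy as np

H = np.array([[0.0, 2.0], [1.0, 3.0]])

P12 = np.array([[0, 1], [1, 0]])      # swap rows 1 and 2
M2  = np.array([[1, 0], [0, 0.5]])    # multiply row 2 by 1/2
C21 = np.array([[1, -3], [0, 1]])     # add -3 times row 2 to row 1

R = C21 @ M2 @ P12                    # product of elementary reduction matrices
print(R @ H)                          # [[1. 0.] [0. 1.]] -- reduced echelon form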
Exercises 3.IV.3
1. The need to take linear combinations of rows and columns in tables of
numbers arises often in practice. For instance, this is a map of part of Vermont
and New York.
2. The trace of a square matrix is the sum of the entries on its diagonal (its
significance appears in Chapter Five). Show that trace(GH) = trace(HG).
3. A square matrix is a Markov matrix if each entry is between zero and one
and the sum along each row is one. Prove that a product of Markov matrices
is Markov.
3.IV.4. Inverses
Example 4.1:
Let π : R^3 → R^2 be the projection map

\begin{pmatrix} x \\ y \\ z \end{pmatrix} \mapsto \begin{pmatrix} x \\ y \end{pmatrix}

and η : R^2 → R^3 be the embedding

\begin{pmatrix} x \\ y \end{pmatrix} \mapsto \begin{pmatrix} x \\ y \\ 0 \end{pmatrix}

The composition π ∘ η : R^2 → R^2 is the identity map on R^2:

\begin{pmatrix} x \\ y \end{pmatrix} \mapsto \begin{pmatrix} x \\ y \\ 0 \end{pmatrix} \mapsto \begin{pmatrix} x \\ y \end{pmatrix}

so π is a left inverse map of η and η is a right inverse map of π.

The composition η ∘ π : R^3 → R^3 is not the identity map on R^3:

\begin{pmatrix} x \\ y \\ z \end{pmatrix} \mapsto \begin{pmatrix} x \\ y \end{pmatrix} \mapsto \begin{pmatrix} x \\ y \\ 0 \end{pmatrix}

In fact, π has no left inverse and η has no right inverse.
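The same computation with representing matrices (a minimal NumPy sketch; the standard bases are assumed):

import numpy as np

Pi  = np.array([[1, 0, 0], [0, 1, 0]])    # projection R^3 -> R^2
Eta = np.array([[1, 0], [0, 1], [0, 0]])  # embedding  R^2 -> R^3

print(Pi @ Eta)   # the 2x2 identity: pi is a left inverse of eta
print(Eta @ Pi)   # diag(1, 1, 0), not the 3x3 identity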
The zero map has neither left nor right inverse.
Corollary 4.12:
The inverse of the 2×2 matrix \begin{pmatrix} a & b \\ c & d \end{pmatrix} exists and equals

\frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}

if and only if ad − bc ≠ 0.
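Checking the 2x2 formula against NumPy's general inverse (a sketch with an arbitrary invertible matrix):

import numpy as np

a, b, c, d = 1.0, 2.0, 3.0, 4.0           # ad - bc = -2, nonzero
H = np.array([[a, b], [c, d]])

H_inv = (1 / (a * d - b * c)) * np.array([[d, -b], [-c, a]])
assert np.allclose(H_inv, np.linalg.inv(H))
print(H_inv)    # [[-2.   1. ] [ 1.5 -0.5]]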
Exercises 3.IV.4
1. Show that if T^4 is the zero matrix, then (I - T)^{-1} = I + T + T^2 + T^3.
Generalize.
3. Prove: if the sum of the elements in each row of a square matrix is k, then the sum of
the elements in each row of the inverse matrix is 1/k.