Matrix Groups
Among the most important examples of groups are groups of matrices. The textbook
briefly discusses groups of matrices in Chapter 2, and then largely forgets about them.
These notes remedy this omission.
Note that the matrices in a matrix group must be square (to be invertible), and
must all have the same size. Thus there are 2 × 2 matrix groups, 3 × 3 matrix groups,
4 × 4 matrix groups, and so forth. The size of the matrices is sometimes referred to
as the degree of the matrix group.
Because the group operation for a matrix group is matrix multiplication, the
identity element of a matrix group is always the n × n identity matrix, and inverses
in a matrix group are just the usual inverse matrices.
The most important matrix groups are the general linear groups. If F is a field,
the general linear group GL(n, F) is the group of all invertible n × n matrices with
entries in F, under matrix multiplication.
Since a matrix is invertible if and only if its determinant is nonzero, GL(n, F) can
also be defined as the group of all n × n matrices with entries in F having nonzero
determinant. (Note that the determinant of a matrix with entries in F is by definition
an element of F.)
EXAMPLE 1  GL(2, Z2)
If F is an infinite field such as R or C, then the general linear group GL(n, F) has
infinite order. However, if F is a finite field, then GL(n, F) is a finite group.
For example, consider all 2 × 2 matrices over the field Z2. There are sixteen such
matrices, since each of the four entries can be either 0 or 1. Of these sixteen matrices,
exactly six of them have nonzero determinant:
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad
\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad
\begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}, \quad
\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}, \quad
\begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \quad
\begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}.
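The count of six can be confirmed by brute force. The following Python sketch (an
illustration added here, using only the standard library) lists all sixteen matrices
and keeps those whose determinant is nonzero in Z2:

    # Enumerate all 2x2 matrices over Z_2 and keep the invertible ones.
    from itertools import product

    invertible = []
    for a, b, c, d in product([0, 1], repeat=4):
        det = (a * d - b * c) % 2      # determinant computed modulo 2
        if det != 0:
            invertible.append(((a, b), (c, d)))

    print(len(invertible))             # 6, the order of GL(2, Z2)
    for rows in invertible:
        print(rows)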
The general linear group GL(n, F) is the most general matrix group, in the same
way that Sn is the most general permutation group. In particular, every matrix group
is just a subgroup of some GL(n, F).
The most important subgroup of GL(n, F) is the special linear group SL(n, F),
which consists of all n × n matrices over F with determinant 1.
It is not hard to see that the set of all diagonal matrices in GL(n, F) forms a subgroup.
We shall verify this using the Two-Step Subgroup Test.
1. First, observe that the product of two diagonal matrices is again diagonal, with
the diagonal entries multiplied entrywise:

\begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}
\begin{pmatrix} \mu_1 & 0 & \cdots & 0 \\ 0 & \mu_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \mu_n \end{pmatrix}
=
\begin{pmatrix} \lambda_1\mu_1 & 0 & \cdots & 0 \\ 0 & \lambda_2\mu_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n\mu_n \end{pmatrix}

In particular, the product of two invertible diagonal matrices is again an invertible
diagonal matrix.
2. Second, the inverse of an invertible diagonal matrix is again diagonal: the inverse
of the diagonal matrix with entries \lambda_1, \ldots, \lambda_n (all nonzero) is the
diagonal matrix with entries \lambda_1^{-1}, \ldots, \lambda_n^{-1}.
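For a concrete check, the following sketch (using numpy, with arbitrarily chosen
nonzero diagonal entries standing in for the λi and μi, and an ad hoc helper
is_diagonal) verifies both steps numerically in the 3 × 3 case:

    import numpy as np

    D1 = np.diag([2.0, -1.0, 5.0])     # diagonal entries lambda_1, lambda_2, lambda_3
    D2 = np.diag([3.0, 4.0, -0.5])     # diagonal entries mu_1, mu_2, mu_3

    def is_diagonal(M):
        # True when every off-diagonal entry of M is (numerically) zero.
        return np.allclose(M, np.diag(np.diag(M)))

    print(is_diagonal(D1 @ D2))            # True: closure under multiplication
    print(is_diagonal(np.linalg.inv(D1)))  # True: closure under inverses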
Linear Transformations
For the remainder of these notes, we will only be considering vectors and matrices
over the real numbers.
Let Rn denote n-dimensional Euclidean space, i.e. the set of all column vectors
with n components. For example, R2 is the Euclidean plane, and R3 is three-
dimensional Euclidean space.
If A is an n × n matrix and v ∈ Rn, the product Av is again an element of Rn.
This lets us think of any n × n matrix as a function from Rn to itself. Specifically,
the linear transformation corresponding to A is the function T : Rn → Rn defined by
T(v) = Av
for all v ∈ Rn.
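As a small illustration (added here; the 90-degree rotation matrix A and the helper
T below are chosen only for this sketch), the correspondence between a matrix and a
function can be written directly with numpy:

    import numpy as np

    A = np.array([[0.0, -1.0],
                  [1.0,  0.0]])        # rotation of the plane by 90 degrees

    def T(v):
        # The linear transformation corresponding to A.
        return A @ v

    print(T(np.array([1.0, 0.0])))     # [0. 1.]   e1 is sent to e2
    print(T(np.array([0.0, 1.0])))     # [-1. 0.]  e2 is sent to -e1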
You may be surprised that the reflection, dilation, and rotation in the last
example were linear transformations. This brings up several questions. First, is every
reflection, dilation, or rotation of the plane a linear transformation? And what about
other sorts of transformations, such as translations?
The following theorem places some restriction on the possible linear
transformations.
Geometrically, this theorem says that any linear transformation must fix the origin
in Rn. Thus a translation cannot be a linear transformation, and a reflection, dilation,
or rotation can only be a linear transformation if it fixes the origin.
Our next theorem gives a complete geometric classification of linear
transformations. This is a surprisingly difficult theorem to prove, and we shall not
attempt to do so here.
Using this theorem, we can immediately see that many familiar geometric
transformations are in fact linear transformations. These include:
Any reflection of R3 across a plane through the origin, or more generally any
reflection of Rn across an (n − 1)-dimensional subspace.
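For instance, the sketch below (using numpy; the angle theta and the standard matrix
formula for a planar reflection are supplied here for illustration) checks that the
reflection of R2 across the line through the origin at angle theta fixes the origin,
fixes the line, and squares to the identity:

    import numpy as np

    theta = np.pi / 6                                        # any angle will do
    R = np.array([[np.cos(2 * theta),  np.sin(2 * theta)],
                  [np.sin(2 * theta), -np.cos(2 * theta)]])  # reflection across the line

    on_line = np.array([np.cos(theta), np.sin(theta)])  # a point on the line
    print(np.allclose(R @ on_line, on_line))            # True: points on the line are fixed
    print(np.allclose(R @ R, np.eye(2)))                # True: reflecting twice is the identity
    print(R @ np.zeros(2))                              # [0. 0.]: the origin is fixed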
Transformation Groups
As we have seen, some of the most important geometric transformations of Rn are
linear transformations. As a result, the symmetry group of a geometric object can
sometimes be thought of as a group of linear transformations.
The following theorem can be very helpful for working with transformation groups.
PROOF Recall that T_A(v) = Av and T_B(v) = Bv for any vector v ∈ Rn. Then
(T_A ∘ T_B)(v) = T_A(T_B(v)) = T_A(Bv) = A(Bv) = (AB)v
for all v ∈ Rn, which proves that T_A ∘ T_B is the linear transformation corresponding
to AB.
For part (2), let T_{A⁻¹} be the linear transformation corresponding to A⁻¹, and let
T_I be the linear transformation corresponding to the n × n identity matrix. Since
AA⁻¹ = A⁻¹A = I, it follows from part (1) that T_A ∘ T_{A⁻¹} = T_{A⁻¹} ∘ T_A = T_I. But T_I
is the identity function on Rn, and therefore T_{A⁻¹} is the inverse function for T_A.
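A quick numerical sanity check of part (1) (a sketch using numpy with random 3 × 3
matrices; not part of the proof itself) is:

    import numpy as np

    rng = np.random.default_rng(0)
    A, B = rng.random((3, 3)), rng.random((3, 3))
    v = rng.random(3)

    T_A = lambda v: A @ v              # the transformation corresponding to A
    T_B = lambda v: B @ v              # the transformation corresponding to B

    # Composing T_A with T_B agrees with the transformation corresponding to AB.
    print(np.allclose(T_A(T_B(v)), (A @ B) @ v))   # True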
This theorem lets us work with transformation groups as though they were groups
of matrices. Indeed, every transformation group is isomorphic to a group of matrices.
that
T\begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}
\quad\text{and}\quad
T\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \end{pmatrix},
so the matrix for T is \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.
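In code, the same method reads as follows (a sketch using numpy; the helper matrix_of
is introduced here for illustration, and the transformation chosen is the coordinate
swap from the example): the columns of the matrix are the images of the standard
basis vectors.

    import numpy as np

    def matrix_of(T, n):
        # Assemble the n x n matrix whose i-th column is T applied to the i-th
        # standard basis vector.
        return np.column_stack([T(np.eye(n)[:, i]) for i in range(n)])

    T = lambda v: np.array([v[1], v[0]])   # swaps the two coordinates
    print(matrix_of(T, 2))                 # [[0. 1.]
                                           #  [1. 0.]]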
The method of Theorem 5 also works for linear transformations in three
dimensions.
Using the method of Theorem 5, we can find matrices corresponding to the
elements of any transformation group. In most cases, the fastest approach is to use
Theorem 5 to find matrices corresponding to the elements of a generating set, and
then multiply to obtain the remaining matrices.
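As a sketch of this approach (using numpy; the generators chosen here, a 90-degree
rotation and a reflection across the x-axis, generate the symmetry group of a square),
one can multiply out a generating set until no new matrices appear:

    import numpy as np

    r = np.array([[0, -1], [1, 0]])        # rotation by 90 degrees
    s = np.array([[1, 0], [0, -1]])        # reflection across the x-axis
    generators = [r, s]

    elements = [np.eye(2, dtype=int)]      # start from the identity matrix
    frontier = list(elements)
    while frontier:
        new = []
        for M in frontier:
            for G in generators:
                P = M @ G
                if not any(np.array_equal(P, E) for E in elements):
                    elements.append(P)
                    new.append(P)
        frontier = new

    print(len(elements))                   # 8, the order of the symmetry group of the square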