Module 1 Theory of Matrices
• In general, 𝐴𝐵 ≠ 𝐵𝐴.
• 𝐴𝐵 = 0 does not imply that 𝐴 = 0 or 𝐵 = 0.
• 𝐴𝐵 = 𝐴𝐶 does not imply that 𝐵 = 𝐶.
• Distributive Law holds for Matrix multiplication
𝐴(𝐵 + 𝐶) = 𝐴𝐵 + 𝐴𝐶, (𝐴 + 𝐵)𝐶 = 𝐴𝐶 + 𝐵𝐶
• 𝐴² = 𝐴 × 𝐴, 𝐴³ = 𝐴² × 𝐴, i.e. (𝐴ᵐ)ⁿ = 𝐴ᵐⁿ
• 𝐴ᵐ𝐴ⁿ = 𝐴ᵐ⁺ⁿ
• (𝐴′)′ = 𝐴
• (𝐴𝐵)′ = 𝐵′𝐴′
• (𝐴 + 𝐵)′ = 𝐴′ + 𝐵′
• (𝐴𝐵)⁻¹ = 𝐵⁻¹𝐴⁻¹
• |𝐴𝐵| = |𝐴| ⋅ |𝐵|
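The cautions and identities above can be checked numerically. A minimal sketch with numpy, using small matrices chosen here purely for illustration (they are not from the slides):

```python
import numpy as np

# Matrices chosen (as an assumed example) to exhibit the cautions above.
A = np.array([[1, -1],
              [1, -1]])
B = np.array([[1, 1],
              [1, 1]])

# In general AB != BA:
print(np.array_equal(A @ B, B @ A))           # False for these A, B

# AB = 0 even though neither A nor B is the zero matrix:
print(A @ B)                                   # [[0 0], [0 0]]

# (AB)' = B'A'  and  |AB| = |A|.|B| hold for any square A, B:
M = np.array([[2, 3], [1, 4]])
N = np.array([[0, 1], [5, 2]])
print(np.array_equal((M @ N).T, N.T @ M.T))    # True
print(np.isclose(np.linalg.det(M @ N), np.linalg.det(M) * np.linalg.det(N)))  # True
```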
Elementary Row and Column Operations
Rank of a Matrix
A matrix is said to be of rank r if
1. There is at least one minor of order r which is not equal to zero, and
2. Every minor of order (r + 1) is equal to zero.
Rank is denoted by 𝜌(𝐴).
        ⎡1 2 3⎤
Ex. 𝐴 = ⎢2 4 6⎥
        ⎣3 6 9⎦
Note that |A| = 0 and every minor of order 2 × 2 is also 0. Hence, the largest order of its non-vanishing minor is 1, so 𝜌(𝐴) = 1.
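The rank of the example matrix can be confirmed numerically:

```python
import numpy as np

# The example matrix from above: every 2x2 minor vanishes, so the rank is 1.
A = np.array([[1, 2, 3],
              [2, 4, 6],
              [3, 6, 9]])
print(np.linalg.matrix_rank(A))  # 1
```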
Solution of system of Linear Equations
AX = B (m equations, n unknowns), where A is the coefficient matrix.
• If 𝜌([𝐴 ∶ 𝐵]) = 𝜌(𝐴), the system AX = B is consistent.
• If 𝜌([𝐴 ∶ 𝐵]) ≠ 𝜌(𝐴), the system AX = B is inconsistent.
• Note that different values of ‘t’ will yield different solutions (when the system has infinitely many solutions).
• An inconsistent system of linear equations has no solution.
• Follow the chart above for the solution of a system of linear equations.
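The rank test in the chart can be sketched as a small helper. The systems below are assumed examples, not taken from the slides:

```python
import numpy as np

def is_consistent(A, B):
    """System AX = B is consistent iff rank([A : B]) == rank(A)."""
    aug = np.column_stack([A, B])
    return np.linalg.matrix_rank(aug) == np.linalg.matrix_rank(A)

A = np.array([[1, 1],
              [2, 2]])
print(is_consistent(A, np.array([3, 6])))  # True: second equation is twice the first
print(is_consistent(A, np.array([3, 7])))  # False: the two equations contradict
```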
Linear Dependence and Independence
By Triangle law, 𝑐⃗ = 𝑎⃗ + 𝑏⃗
Set {𝑎⃗, 𝑏⃗, 𝑐⃗} is linearly dependent.
Set {𝑎⃗, 𝑏⃗} is linearly independent.
Definition
Linearly Independent: The system of n vectors {𝑥₁, 𝑥₂, 𝑥₃, …, 𝑥ₙ} is said to be linearly independent if every relation of the type 𝑐₁𝑥₁ + 𝑐₂𝑥₂ + ⋯ + 𝑐ₙ𝑥ₙ = 0 has only the solution 𝑐₁ = 𝑐₂ = ⋯ = 𝑐ₙ = 0.
Linearly Dependent: The system of n vectors {𝑥₁, 𝑥₂, 𝑥₃, …, 𝑥ₙ} is said to be linearly dependent if some relation of the type 𝑐₁𝑥₁ + 𝑐₂𝑥₂ + ⋯ + 𝑐ₙ𝑥ₙ = 0 holds with some 𝑐ᵢ ≠ 0.
Remarks
• 𝑐₁𝑥₁ + 𝑐₂𝑥₂ + ⋯ + 𝑐ₙ𝑥ₙ = 0 is a linear equation with 𝑐₁, 𝑐₂, 𝑐₃, …, 𝑐ₙ as unknowns.
• If vectors 𝑥₁, 𝑥₂, 𝑥₃, …, 𝑥ₙ are linearly dependent, then in the equation 𝑐₁𝑥₁ + 𝑐₂𝑥₂ + ⋯ + 𝑐ₙ𝑥ₙ = 0, by definition some 𝑐ₖ ≠ 0.
• By simplification we get
  𝑐ₖ𝑥ₖ = −𝑐₁𝑥₁ − 𝑐₂𝑥₂ − ⋯ − 𝑐ₙ𝑥ₙ (the k-th term omitted on the right)
  𝑥ₖ = −(𝑐₁/𝑐ₖ)𝑥₁ − (𝑐₂/𝑐ₖ)𝑥₂ − ⋯ − (𝑐ₙ/𝑐ₖ)𝑥ₙ
• If the vectors are dependent, then there exists a relation between the given vectors.
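Dependence can be tested by comparing the rank of the stacked vectors with their count. A sketch using assumed example vectors, where the third is deliberately built as a combination of the first two:

```python
import numpy as np

def is_dependent(vectors):
    """Vectors are linearly dependent iff the rank of their stack is less than their count."""
    M = np.vstack(vectors)
    return np.linalg.matrix_rank(M) < len(vectors)

x1 = np.array([1, 0, 2])
x2 = np.array([0, 1, 1])
x3 = x1 + 2 * x2          # a relation between the vectors exists by construction
print(is_dependent([x1, x2, x3]))  # True
print(is_dependent([x1, x2]))      # False
```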
(Figures: a linearly independent set of vectors; and a linearly dependent set, for which a relation between the vectors is found.)
Linear Transformation
A Linear Transformation is a map from ‘n’-dimensional space to itself, generally represented by
𝑦₁ = 𝑎₁₁𝑥₁ + 𝑎₁₂𝑥₂ + ⋯ + 𝑎₁ₙ𝑥ₙ
𝑦₂ = 𝑎₂₁𝑥₁ + 𝑎₂₂𝑥₂ + ⋯ + 𝑎₂ₙ𝑥ₙ
⋮
𝑦ₙ = 𝑎ₙ₁𝑥₁ + 𝑎ₙ₂𝑥₂ + ⋯ + 𝑎ₙₙ𝑥ₙ
(Figure: a matrix S is applied to the vectors [1, 0] and [0, 1]; S transforms each and every vector on the plane.)
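The equations above are exactly the matrix product y = Ax. A small sketch, with an assumed 2 × 2 matrix, also shows that the columns of A are the images of the unit vectors:

```python
import numpy as np

# An assumed example transformation y = A x on the plane.
A = np.array([[2, 1],
              [0, 3]])
e1 = np.array([1, 0])
e2 = np.array([0, 1])
print(A @ e1)  # [2 0] -- the first column of A
print(A @ e2)  # [1 3] -- the second column of A
```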
Matrix Representation of Linear Transformation
Linearly Transformed
(Using some matrix ‘A’)
• Observe the blue and red vectors before and after the transformation. The blue vector has the same direction even after the transformation, whereas the red vector changes its direction.
Eigen Value and Eigen Vectors
Remember
Blue vector?
Graphically, an Eigen vector is one that doesn’t change its direction, and the Eigen value is the extent by which the Eigen vector changes its length.
Characteristic Equation
• The Characteristic Equation is also called the characteristic polynomial because |𝐴 − 𝜆𝐼| = 0 is a polynomial of degree equal to the order of the square matrix A.
• For a 2 × 2 matrix
      ⎡𝑎₁₁ 𝑎₁₂⎤
  𝐴 = ⎣𝑎₂₁ 𝑎₂₂⎦
  the characteristic polynomial is given by
  𝜆² − 𝑆₁𝜆 + |𝐴| = 0, where 𝑆₁ = sum of diagonal elements = Trace(A).
• For a 3 × 3 matrix
      ⎡𝑎₁₁ 𝑎₁₂ 𝑎₁₃⎤
  𝐴 = ⎢𝑎₂₁ 𝑎₂₂ 𝑎₂₃⎥
      ⎣𝑎₃₁ 𝑎₃₂ 𝑎₃₃⎦
  it is 𝜆³ − 𝑆₁𝜆² + 𝑆₂𝜆 − |𝐴| = 0, where 𝑆₁ = sum of diagonal elements = Trace(A), and 𝑆₂ = sum of the minors of order two of the diagonal elements:
  𝑆₂ = |𝑎₂₂ 𝑎₂₃; 𝑎₃₂ 𝑎₃₃| + |𝑎₁₁ 𝑎₁₃; 𝑎₃₁ 𝑎₃₃| + |𝑎₁₁ 𝑎₁₂; 𝑎₂₁ 𝑎₂₂|
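The 3 × 3 shortcut can be verified against numpy's characteristic polynomial. The matrix below is an assumed example:

```python
import numpy as np

# Build the coefficients lam^3 - S1 lam^2 + S2 lam - |A| from the shortcut.
A = np.array([[2, 1, 0],
              [1, 3, 1],
              [0, 1, 2]])
S1 = np.trace(A)
# S2: sum of the 2x2 minors of the diagonal elements.
minor = lambda i: np.linalg.det(np.delete(np.delete(A, i, axis=0), i, axis=1))
S2 = sum(minor(i) for i in range(3))
coeffs = [1, -S1, S2, -np.linalg.det(A)]
print(np.round(coeffs, 6))
print(np.round(np.poly(A), 6))  # numpy's characteristic polynomial: same coefficients
```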
Method
(To find Eigen values and vectors for a given matrix ‘A’)
1. Write the characteristic polynomial for ‘A’, i.e. |𝐴 − 𝜆𝐼| = 0 (use the shortcuts for 2 × 2 and 3 × 3 matrices).
2. The roots of the characteristic polynomial are the Eigen values (say 𝜆₁, 𝜆₂, 𝜆₃, …).
3. For each Eigen value 𝜆ᵢ, we can find the Eigen vectors by solving the system of linear equations
   (𝐴 − 𝜆ᵢ𝐼)𝑋 = 0
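The three steps above, done numerically with numpy on an assumed example matrix:

```python
import numpy as np

# Eigen values are the roots of |A - lam I| = 0; numpy returns them directly.
A = np.array([[4, 1],
              [2, 3]])
eigvals, eigvecs = np.linalg.eig(A)
print(np.sort(eigvals))  # [2. 5.]
# Each column v of eigvecs satisfies (A - lam I) v = 0, i.e. A v = lam v:
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))  # True
```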
Brain Teaser
Can the same Eigen vector correspond to two different Eigen values?
If no, then why? If yes, can you find one?
HINT: USE THE BASIC DEFINITION OF EIGEN VECTOR
Can you quickly check the Eigen values for
⎡0 −1⎤
⎣1  0⎦ ?
Any conclusion?
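A quick numerical check, hinting at the conclusion: this real matrix (rotation by 90°) has no real Eigen values, only the complex pair ±i:

```python
import numpy as np

# The rotation-by-90-degrees matrix: no vector on the real plane keeps its direction.
A = np.array([[0, -1],
              [1,  0]])
print(np.linalg.eig(A)[0])  # complex pair: i and -i
```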
As homework, try finding the Eigen vectors for 𝜆 = 1 & 2.
Interesting facts about this particular example:
1. Repeated Eigen value.
2. Two free variables.
3. The matrix is symmetric.
⎡−𝑟 − 𝑠⎤     ⎡−1⎤     ⎡−1⎤
⎢   𝑟  ⎥ = 𝑟 ⎢ 1 ⎥ + 𝑠 ⎢ 0 ⎥
⎣   𝑠  ⎦     ⎣ 0 ⎦     ⎣ 1 ⎦
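A two-parameter family like this arises whenever a repeated Eigen value leaves two free variables. The symmetric matrix below is an assumed example with this pattern (the slide's actual matrix is not shown): (A − I)x = 0 reduces to x₁ + x₂ + x₃ = 0, whose solutions are exactly [−r − s, r, s]:

```python
import numpy as np

# Assumed symmetric example with repeated Eigen value 1:
# (A - I)x = 0 gives x1 + x2 + x3 = 0, i.e. the family r[-1,1,0] + s[-1,0,1].
A = np.array([[2, 1, 1],
              [1, 2, 1],
              [1, 1, 2]])
v1 = np.array([-1, 1, 0])
v2 = np.array([-1, 0, 1])
print(np.allclose(A @ v1, 1 * v1))  # True: Eigen vector for lam = 1
print(np.allclose(A @ v2, 1 * v2))  # True: an independent Eigen vector, same lam
```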
⎡𝑋⎤   ⎡ cos 𝜃   sin 𝜃   0   −𝑢 ⎤ ⎡𝑥⎤
⎢𝑌⎥ = ⎢−sin 𝜃   cos 𝜃   0   −𝑣 ⎥ ⎢𝑦⎥
⎣𝑍⎦   ⎣   0       0     1   −𝑤 ⎦ ⎢𝑧⎥
                                  ⎣1⎦
• If the axis of rotation is changed, then only the rotation matrix (first three columns) in the above expression will change. Everything else remains the same.
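The homogeneous-coordinate expression above can be sketched as a small function (a minimal illustration, assuming rotation about the z-axis as written):

```python
import numpy as np

def transform(theta, u, v, w, point):
    """Rotate about the z-axis and shift by (-u, -v, -w), as in the 3x4 expression above."""
    c, s = np.cos(theta), np.sin(theta)
    M = np.array([[ c,  s, 0, -u],
                  [-s,  c, 0, -v],
                  [ 0,  0, 1, -w]])
    x, y, z = point
    return M @ np.array([x, y, z, 1])

# Rotating (1, 0, 0) by 90 degrees with no translation gives (0, -1, 0):
print(np.round(transform(np.pi / 2, 0, 0, 0, (1, 0, 0)), 6))
```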
Examples:
Exercises: