

International J. of Math. Sci. & Engg. Appls. (IJMSEA)
ISSN 0973-9424, Vol. 5 No. V (September, 2011), pp. 389-401

DETERMINANT FOR NON-SQUARE MATRICES

M. ARUNKUMAR, S. MURTHY AND G. GANAPATHY

Abstract
In this paper, the authors generalize the concept of the determinant from square
matrices to non-square matrices. We also discuss the properties of the non-square
determinant, and using these we investigate the inverse of a non-square matrix.

1. Introduction
The term determinant was introduced by Gauss in 1801 while discussing quadratic
forms. He used the term because the determinant determines the properties of a
quadratic form. We know that the area of the triangle with vertices (x1, y1), (x2, y2)
and (x3, y3) is

    (1/2)[x1(y2 − y3) + x2(y3 − y1) + x3(y1 − y2)].                        (1.1)

Similarly, the condition for a second degree equation in x and y to represent a pair of
straight lines is

    abc + 2fgh − af^2 − bg^2 − ch^2 = 0.                                   (1.2)

To minimize the difficulty of remembering expressions of this type, mathematicians
developed the idea of representing them in determinant form.

−−−−−−−−−−−−−−−−−−−−−−−−−−−−

Key Words and Phrases : Matrix, Determinant of matrix.

2010 AMS Subject Classification : 26A33.

© http://www.ascent-journals.com


The above expressions (1.1) and (1.2) can be represented in the form

          | x1  y1  1 |            | a  h  g |
    (1/2) | x2  y2  1 |    and    | h  b  f | = 0.
          | x3  y3  1 |            | g  f  c |

Again, if we eliminate x, y, z from the three equations

    a1 x + b1 y + c1 z = 0,    a2 x + b2 y + c2 z = 0,    a3 x + b3 y + c3 z = 0,

we obtain a1(b2 c3 − b3 c2) − b1(a2 c3 − a3 c2) + c1(a2 b3 − a3 b2) = 0.


This can be written as

    | a1  b1  c1 |
    | a2  b2  c2 | = 0.
    | a3  b3  c3 |
Thus a determinant is a particular type of expression written in a special concise
form. Note that the quantities are arranged in the form of a square between two vertical
lines. This arrangement is called a determinant.
1.1 Determinants for Square Matrix
Definition 1.1 [2] : To every square matrix A of order n with entries as real or complex
numbers, we can associate a number called determinant of matrix A and is denoted by
|A| or det(A).
Thus the determinant is formed by the elements of A and is said to be the determinant
of the matrix A.

           [ a11  a12 ]               | a11  a12 |
    If A = [ a21  a22 ] , then |A| =  | a21  a22 | = a11 a22 − a12 a21.

Definition 1.2 [2] : Let |A| = |(aij)| be the determinant of a square matrix of order n.
The minor of an arbitrary element aij is the determinant obtained by deleting the i-th
row and j-th column in which the element aij stands. The minor of aij is denoted by
Mij .
Definition 1.3 [2] : The cofactor is a signed minor. The cofactor of aij is denoted by
Aij and is defined as Aij = (−1)i+j Mij .
 
Definition 1.4 [2] : Let

        [ a11  a12  a13 ]
    A = [ a21  a22  a23 ]
        [ a31  a32  a33 ]

then its determinant is defined as |A| = a11 M11 − a12 M12 + a13 M13.
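The cofactor expansion in Definition 1.4 can be transcribed in a few lines of Python (an illustrative sketch, not part of the paper; the function names are ours):

```python
def det2(a, b, c, d):
    # 2x2 determinant | a  b ; c  d |
    return a * d - b * c

def det3(A):
    # Definition 1.4: |A| = a11*M11 - a12*M12 + a13*M13,
    # where M1j is the determinant left after deleting row 1 and column j
    M11 = det2(A[1][1], A[1][2], A[2][1], A[2][2])
    M12 = det2(A[1][0], A[1][2], A[2][0], A[2][2])
    M13 = det2(A[1][0], A[1][1], A[2][0], A[2][1])
    return A[0][0] * M11 - A[0][1] * M12 + A[0][2] * M13

print(det3([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -3
```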

Definition 1.5 [2] : A square matrix A is said to be singular if |A| = 0; otherwise it is
said to be non-singular.
1.2 Properties of Determinants for Square Matrix
The following properties are true for determinant of square matrices of any order.
Theorem 1.6 [1] : The value of a determinant is not changed when we interchange rows
into columns and columns into rows; that is, |A| = |A^T| for any square matrix A.
Theorem 1.7 [1] [2] : If any two rows (columns) of a determinant are interchanged,
then the determinant changes in sign but its numerical value is unaltered.
Theorem 1.8 [1] : If any two rows (columns) of a determinant are identical, then the
value of the determinant is zero.
Theorem 1.9 [1] : If every element in a row (or column) of a determinant is multiplied
by a constant k, then the value of the determinant is multiplied by k.
Theorem 1.10 [1] : If every element in any row (column) can be expressed as the
sum of two quantities, then the given determinant can be expressed as the sum of two
determinants of the same order, with the elements of the remaining rows (columns) of
both being the same.
Theorem 1.11 [1] : A determinant is unaltered when to each element of any row
(column) are added the corresponding elements of several other rows (columns) multiplied
respectively by constant factors.
Throughout the history of matrices, mathematicians have been interested in the value
of the determinant for square matrices only: the definition of the determinant and its
properties are stated only for square matrices, and there is no definition of a determinant
for non-square matrices. To fill this gap, we introduce a new concept called the non-square
determinant, that is, a determinant for non-square matrices, and we investigate the
inverse of non-square matrices.

2. Non Square Determinant


Definition 2.1 : Let A be a non-square matrix of order m × n. If n > m, then the
matrix A is called a horizontal matrix; otherwise A is called a vertical matrix.
Definition 2.2 : To every non square matrix A of order m × n with entries as real
or complex numbers, we can associate a number called determinant of matrix A and is
denoted by |A| or det(A).

If A = [ a11  a12  a13  · · ·  a1n ], then

    |A| = a11 − a12 + a13 − · · · + (−1)^(1+n) a1n = Σ_{i=1}^{n} (−1)^(1+i) a1i.      (2.1)

 
If

        [ a11 ]
        [ a21 ]
    A = [ a31 ] ,
        [  ⋮  ]
        [ am1 ]

then |A| = a11 − a21 + a31 − · · · + (−1)^(m+1) am1 = Σ_{i=1}^{m} (−1)^(i+1) ai1.

If

    A = [ a11  a12  · · ·  a1n ] ,
        [ a21  a22  · · ·  a2n ]

then

    |A| = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} (−1)^(pij) | a1i  a1j | ,
                                                 | a2i  a2j |

and if

        [ a11  a12 ]
        [ a21  a22 ]
    A = [  ⋮    ⋮  ] ,
        [ an1  an2 ]

then

    |A| = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} (−1)^(pij) | ai1  ai2 | ,
                                                 | aj1  aj2 |

where

    pij = j − i/2 + (−1)^i/4 + 3/4   if n is even;
    pij = j − i/2 − (−1)^i/4 + 9/4   if n is odd.
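The formulas of Definition 2.2 are directly computable. Below is a hypothetical Python transcription (the names `det_1xn`, `det_2xn` and `p` are ours; the exponent pij is taken as printed, and the 2 × 2 blocks in the horizontal case are read over columns i and j of the two rows):

```python
from fractions import Fraction

def det_1xn(row):
    # equation (2.1): alternating sum along a single row
    return sum((-1) ** i * a for i, a in enumerate(row))

def p(i, j, n):
    # exponent p_ij from Definition 2.2, transcribed as printed
    if n % 2 == 0:
        val = j - Fraction(i, 2) + Fraction((-1) ** i, 4) + Fraction(3, 4)
    else:
        val = j - Fraction(i, 2) - Fraction((-1) ** i, 4) + Fraction(9, 4)
    assert val.denominator == 1  # p_ij is always an integer
    return int(val)

def det_2xn(A):
    # |A| for a 2 x n horizontal matrix via the double sum in Definition 2.2
    n = len(A[0])
    total = 0
    for i in range(1, n):
        for j in range(i + 1, n + 1):
            block = A[0][i - 1] * A[1][j - 1] - A[0][j - 1] * A[1][i - 1]
            total += (-1) ** p(i, j, n) * block
    return total

print(det_2xn([[1, 2], [3, 4]]))  # -2, the familiar 2x2 determinant
```

For n = 2 this reduces to a11 a22 − a12 a21, so the definition is consistent with the square case.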

Definition 2.3 : Let |A| = |(aij )| be a non square determinant of order m × n. The
minor of an arbitrary element aij is the determinant obtained by deleting the i-th row
and j-th column in which the element aij stands. The minor of aij is denoted by Mij .
Definition 2.4 : The cofactor of non square matrix is a signed minor. The cofactor of
aij is denoted by Aij and is defined as

Aij = (−1)i+j Mij . (2.2)

 
Definition 2.5 : Let

        [ a11  a12  a13  · · ·  a1n ]
    A = [ a21  a22  a23  · · ·  a2n ]
        [ a31  a32  a33  · · ·  a3n ]

then its determinant is defined as

    |A| = a11 M11 − a12 M12 + a13 M13 − · · · + (−1)^(1+n) a1n M1n = Σ_{i=1}^{n} (−1)^(1+i) a1i M1i.    (2.3)

 
Also, if

        [ a11  a12  a13 ]
        [ a21  a22  a23 ]
    A = [ a31  a32  a33 ]
        [  ⋮    ⋮    ⋮  ]
        [ am1  am2  am3 ]

then its determinant is defined as

    |A| = a11 M11 − a21 M21 + a31 M31 − · · · + (−1)^(m+1) am1 Mm1 = Σ_{i=1}^{m} (−1)^(i+1) ai1 Mi1.    (2.4)
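Definitions 2.2 and 2.5 together suggest a recursive evaluation: expand a horizontal matrix along its first row as in (2.3), a vertical matrix along its first column as in (2.4), and bottom out at single rows and columns as in (2.1). A sketch in Python (the name `nsdet` and the recursion strategy are our reading of the definitions, not code from the paper):

```python
def nsdet(A):
    """Non-square determinant via the expansions (2.1), (2.3) and (2.4)."""
    m, n = len(A), len(A[0])
    if m == 1:  # single row, equation (2.1)
        return sum((-1) ** j * a for j, a in enumerate(A[0]))
    if n == 1:  # single column
        return sum((-1) ** i * row[0] for i, row in enumerate(A))
    if n >= m:  # horizontal (or square): expand along the first row, (2.3)
        return sum((-1) ** j * A[0][j] *
                   nsdet([row[:j] + row[j + 1:] for row in A[1:]])
                   for j in range(n))
    # vertical: expand along the first column, (2.4)
    return sum((-1) ** i * A[i][0] *
               nsdet([row[1:] for k, row in enumerate(A) if k != i])
               for i in range(m))

print(nsdet([[1, 1, 2, 0], [1, 2, 1, 2], [3, 4, 1, 2]]))  # -4
```

On square matrices the `n >= m` branch is the usual first-row cofactor expansion, so `nsdet` agrees with the classical determinant there.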

Definition 2.6 : A non-square matrix A is said to be singular if |A| = 0; otherwise it is
said to be non-singular.
The following properties are true for non-square determinants of any order, but we are
going to prove them only for non-square determinants of order 3 × 4 and 4 × 3.
Theorem 2.7 : The value of a non-square determinant is unchanged when we interchange
rows into columns and columns into rows; that is, |A| = |A^T| for any non-square
matrix A.
Proof : Consider the matrix

        [ a11  a12  a13  a14 ]
    A = [ a21  a22  a23  a24 ] .
        [ a31  a32  a33  a34 ]
By the definition of |A|, we have

    |A| = a11 M11 − a12 M12 + a13 M13 − a14 M14

        = a11 [(a22 a33 − a23 a32) − (a22 a34 − a24 a32) + (a23 a34 − a24 a33)]
        − a12 [(a21 a33 − a23 a31) − (a21 a34 − a24 a31) + (a23 a34 − a24 a33)]
        + a13 [(a21 a32 − a22 a31) − (a21 a34 − a24 a31) + (a22 a34 − a24 a32)]
        − a14 [(a21 a32 − a22 a31) − (a21 a33 − a23 a31) + (a22 a33 − a23 a32)]

        = a11 a22 a33 − a11 a23 a32 − a11 a22 a34 + a11 a24 a32 + a11 a23 a34
        − a11 a24 a33 − a12 a21 a33 + a12 a23 a31 + a12 a21 a34 − a12 a24 a31
        − a12 a23 a34 + a12 a24 a33 + a13 a21 a32 − a13 a22 a31 − a13 a21 a34
        + a13 a24 a31 + a13 a22 a34 − a13 a24 a32 − a14 a21 a32 + a14 a22 a31
        + a14 a21 a33 − a14 a23 a31 − a14 a22 a33 + a14 a23 a32 .              (2.5)

Let us interchange the rows and columns of A. We have

            | a11  a21  a31 |
    |A^T| = | a12  a22  a32 |
            | a13  a23  a33 |
            | a14  a24  a34 |

          = a11 M11 − a12 M21 + a13 M31 − a14 M41

          = a11 [(a22 a33 − a23 a32) − (a22 a34 − a24 a32) + (a23 a34 − a24 a33)]
          − a12 [(a21 a33 − a23 a31) − (a21 a34 − a24 a31) + (a23 a34 − a24 a33)]
          + a13 [(a21 a32 − a22 a31) − (a21 a34 − a24 a31) + (a22 a34 − a24 a32)]
          − a14 [(a21 a32 − a22 a31) − (a21 a33 − a23 a31) + (a22 a33 − a23 a32)]

          = a11 a22 a33 − a11 a23 a32 − a11 a22 a34 + a11 a24 a32 + a11 a23 a34
          − a11 a24 a33 − a12 a21 a33 + a12 a23 a31 + a12 a21 a34 − a12 a24 a31
          − a12 a23 a34 + a12 a24 a33 + a13 a21 a32 − a13 a22 a31 − a13 a21 a34
          + a13 a24 a31 + a13 a22 a34 − a13 a24 a32 − a14 a21 a32 + a14 a22 a31
          + a14 a21 a33 − a14 a23 a31 − a14 a22 a33 + a14 a23 a32 .            (2.6)

From equations (2.5) and (2.6) we obtain the proof of the theorem.  □
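Theorem 2.7 can be spot-checked numerically. The sketch below re-implements the recursive expansion of this section (our reading of Definitions 2.2 and 2.5; `nsdet` is an illustrative helper, not the authors' code) and compares |A| with |A^T| on a sample 3 × 4 matrix:

```python
def nsdet(A):
    # recursive non-square determinant: first-row expansion when wide,
    # first-column expansion when tall, alternating sums for vectors
    m, n = len(A), len(A[0])
    if m == 1:
        return sum((-1) ** j * a for j, a in enumerate(A[0]))
    if n == 1:
        return sum((-1) ** i * r[0] for i, r in enumerate(A))
    if n >= m:
        return sum((-1) ** j * A[0][j] *
                   nsdet([r[:j] + r[j + 1:] for r in A[1:]]) for j in range(n))
    return sum((-1) ** i * A[i][0] *
               nsdet([r[1:] for k, r in enumerate(A) if k != i]) for i in range(m))

A = [[1, 1, 2, 0], [1, 2, 1, 2], [3, 4, 1, 2]]
At = [list(col) for col in zip(*A)]  # transpose: a 4 x 3 vertical matrix
print(nsdet(A), nsdet(At))  # -4 -4
```

Both evaluations agree, as Theorem 2.7 asserts.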
Theorem 2.8 : If any two rows of a horizontal matrix determinant are interchanged,
then the horizontal matrix determinant changes in sign but its numerical value is
unaltered.
Proof : Consider the horizontal matrix

        [ a11  a12  a13  a14 ]
    A = [ a21  a22  a23  a24 ] .
        [ a31  a32  a33  a34 ]
By the definition of |A|, we have

|A| = a11 M11 − a12 M12 + a13 M13 − a14 M14


= a11 a22 a33 − a11 a23 a32 − a11 a22 a34 + a11 a24 a32 + a11 a23 a34
−a11 a24 a33 − a12 a21 a33 + a12 a23 a31 + a12 a21 a34 − a12 a24 a31
−a12 a23 a34 + a12 a24 a33 + a13 a21 a32 − a13 a22 a31 − a13 a21 a34
+a13 a24 a31 + a13 a22 a34 − a13 a24 a32 − a14 a21 a32 + a14 a22 a31
+a14 a21 a33 − a14 a23 a31 − a14 a22 a33 + a14 a23 a32 . (2.7)

Now, let B be the non square determinant obtained from A by interchanging the first
and second rows. Then

|B| = a21 M11 − a22 M12 + a23 M13 − a24 M14


= −[a11 a22 a33 − a11 a23 a32 − a11 a22 a34 + a11 a24 a32 + a11 a23 a34
−a11 a24 a33 − a12 a21 a33 + a12 a23 a31 + a12 a21 a34 − a12 a24 a31
−a12 a23 a34 + a12 a24 a33 + a13 a21 a32 − a13 a22 a31 − a13 a21 a34
+a13 a24 a31 + a13 a22 a34 − a13 a24 a32 − a14 a21 a32 + a14 a22 a31
+a14 a21 a33 − a14 a23 a31 − a14 a22 a33 + a14 a23 a32 ]
= −|A|. (2.8)

Hence the proof is complete.  □


Theorem 2.9 : If any two rows of a horizontal matrix determinant are identical, then
the value of the horizontal matrix determinant is zero.
Proof : Let |A| be the determinant value of the horizontal matrix A, and assume that
the first two rows are identical. Interchanging the first two rows of A gives, by Theorem
2.8, the value −|A|. But since the first two rows are identical, the matrix is unchanged
by the interchange, so we get the same |A|. That is, |A| = −|A|, hence |A| = 0. This
completes the proof of the theorem.  □
Theorem 2.10 : If every element in a row of a horizontal matrix determinant is multiplied
by a constant k, then the value of the horizontal matrix determinant is multiplied by k.
Proof : Consider the horizontal matrix

        [ a11  a12  a13  a14 ]
    A = [ a21  a22  a23  a24 ] .
        [ a31  a32  a33  a34 ]

Let us multiply the first row of A by a constant k. Thus we get a new matrix

        [ ka11  ka12  ka13  ka14 ]
    B = [ a21   a22   a23   a24  ] .
        [ a31   a32   a33   a34  ]

Then |B| = ka11 M11 − ka12 M12 + ka13 M13 − ka14 M14 = k|A|. Hence the proof of the
theorem is complete.  □
Theorem 2.11 : If every element in any row of a horizontal matrix can be expressed
as the sum of two quantities, then the given horizontal matrix determinant can be
expressed as the sum of two horizontal matrix determinants of the same order, with the
elements of the remaining rows of both being the same.

Proof : Consider the horizontal matrix

        [ α + a11   β + a12   γ + a13   δ + a14 ]
    A = [   a21       a22       a23       a24   ] .
        [   a31       a32       a33       a34   ]

Then we have

    |A| = (α + a11) M11 − (β + a12) M12 + (γ + a13) M13 − (δ + a14) M14

        = (α M11 − β M12 + γ M13 − δ M14) + (a11 M11 − a12 M12 + a13 M13 − a14 M14)

          |  α    β    γ    δ  |   | a11  a12  a13  a14 |
        = | a21  a22  a23  a24 | + | a21  a22  a23  a24 | .
          | a31  a32  a33  a34 |   | a31  a32  a33  a34 |

Hence the proof.  □


Theorem 2.12 : A horizontal matrix determinant is unaltered when to each element of
any row are added the corresponding elements of several other rows multiplied respectively
by constant factors.
Proof : Consider the horizontal matrix

        [ a11  a12  a13  a14 ]
    A = [ a21  a22  a23  a24 ] .
        [ a31  a32  a33  a34 ]

Let B be the determinant obtained when to the elements of the first row of A are added
those of the second and third rows multiplied by l and m, respectively. Then

        [ s1   s2   s3   s4  ]
    B = [ a21  a22  a23  a24 ] ,
        [ a31  a32  a33  a34 ]

where si = a1i + l a2i + m a3i for all i = 1, 2, 3, 4. Using Theorem 2.11, we have

          | a11  a12  a13  a14 |   | la21  la22  la23  la24 |
    |B| = | a21  a22  a23  a24 | + | a21   a22   a23   a24  |
          | a31  a32  a33  a34 |   | a31   a32   a33   a34  |

            | ma31  ma32  ma33  ma34 |
          + | a21   a22   a23   a24  | .
            | a31   a32   a33   a34  |

Again using Theorems 2.10 and 2.9, we have

    |B| = |A| + l(0) + m(0) = |A|.

Hence the proof is complete.  □
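Theorems 2.8 to 2.12 can likewise be spot-checked on a 3 × 4 example, again using our recursive reading of the definitions (the helper `nsdet` below is illustrative, not from the paper):

```python
def nsdet(A):
    # recursive non-square determinant (first-row / first-column expansion)
    m, n = len(A), len(A[0])
    if m == 1:
        return sum((-1) ** j * a for j, a in enumerate(A[0]))
    if n == 1:
        return sum((-1) ** i * r[0] for i, r in enumerate(A))
    if n >= m:
        return sum((-1) ** j * A[0][j] *
                   nsdet([r[:j] + r[j + 1:] for r in A[1:]]) for j in range(n))
    return sum((-1) ** i * A[i][0] *
               nsdet([r[1:] for k, r in enumerate(A) if k != i]) for i in range(m))

A = [[1, 1, 2, 0], [1, 2, 1, 2], [3, 4, 1, 2]]    # |A| = -4

swapped = [A[1], A[0], A[2]]                      # Theorem 2.8: sign flips
repeated = [A[0], A[0], A[2]]                     # Theorem 2.9: value is zero
scaled = [[3 * x for x in A[0]], A[1], A[2]]      # Theorem 2.10: factor of 3
combo = [[a + 2 * b + 5 * c                       # Theorem 2.12: unchanged
          for a, b, c in zip(A[0], A[1], A[2])], A[1], A[2]]

print(nsdet(A), nsdet(swapped), nsdet(repeated), nsdet(scaled), nsdet(combo))
# -4 4 0 -12 -4
```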



The following Theorems 2.13 to 2.17 are immediate consequences of Theorems 2.8 to
2.12 for vertical matrices.
Theorem 2.13 : If any two columns of vertical matrix determinant are interchanged,
then the vertical matrix determinant changes in sign but its numerical value is unaltered.
Theorem 2.14 : If any two columns of vertical matrix determinant are identical, then
the value of the vertical matrix determinant is zero.
Theorem 2.15 : If every element in a column of vertical matrix determinant is mul-
tiplied by a constant k, then the value of the vertical matrix determinant is multiplied
by k.
Theorem 2.16 : If every element in any column of a vertical matrix can be expressed
as the sum of two quantities, then the given vertical matrix determinant can be
expressed as the sum of two vertical matrix determinants of the same order, with the
elements of the remaining columns of both being the same.
Theorem 2.17 : A vertical matrix determinant is unaltered when to each element of
any column are added the corresponding elements of several other columns multiplied
respectively by constant factors.

3. Inverse for Non Square Matrix


In this section, the authors discuss the inverse for a non-square matrix of any order.
We recall the fundamental requirement for the existence of the inverse of a square
matrix, namely that it must be non-singular.
Definition 3.1 : Let A = (aij ) be a non square matrix. The transpose of the cofactor
matrix of A is called the adjoint matrix of A. It is denoted by adj(A).
Now for the non square matrix, we introduce the new concept “Left inverse” and
“Right inverse” using the following definitions.
Definition 3.2 : A non-singular non-square matrix A has a left inverse if there exists
a matrix A_L^(-1) such that A_L^(-1) A = I, where I denotes the identity matrix.

Definition 3.3 : A non-singular non-square matrix A has a right inverse if there exists
a matrix A_R^(-1) such that A A_R^(-1) = I, where I denotes the identity matrix.

The following Lemma is the immediate consequence of the above definitions and the
definition of inverse for square matrix.
Lemma 3.4 : For any non-singular square matrix A, the left inverse and right inverse
exist and both equal the inverse of A; that is,

    A_L^(-1) = A_R^(-1) = A^(-1) = (1/|A|) adj(A).                        (3.1)

Theorems 3.5 and 3.7 below establish the existence of the right inverse and the left
inverse for certain non-square matrices.
Theorem 3.5 : Every non-singular horizontal matrix A has a right inverse A_R^(-1)
such that

    A_R^(-1) = (1/|A|) adj(A).                                            (3.2)
Proof : We are going to prove this theorem for a horizontal matrix of order 3 × 4.
Consider the horizontal matrix

        [ a11  a12  a13  a14 ]
    A = [ a21  a22  a23  a24 ] .
        [ a31  a32  a33  a34 ]

Then the determinant value of the matrix A is

|A| = a11 a22 a33 − a11 a23 a32 − a11 a22 a34 + a11 a24 a32 + a11 a23 a34
−a11 a24 a33 − a12 a21 a33 + a12 a23 a31 + a12 a21 a34 − a12 a24 a31
−a12 a23 a34 + a12 a24 a33 + a13 a21 a32 − a13 a22 a31 − a13 a21 a34
+a13 a24 a31 + a13 a22 a34 − a13 a24 a32 − a14 a21 a32 + a14 a22 a31
+a14 a21 a33 − a14 a23 a31 − a14 a22 a33 + a14 a23 a32 . (3.3)

Now, the minor of a11 is

M11 = a22 a33 − a23 a32 − a22 a34 + a24 a32 + a23 a34 − a24 a33 .

The minor of a12 is

M12 = a21 a33 − a23 a31 − a21 a34 + a24 a31 + a23 a34 − a24 a33 .

The minor of a13 is

M13 = a21 a32 − a22 a31 − a21 a34 + a24 a31 + a22 a34 − a24 a32 .

The minor of a14 is

M14 = a21 a32 − a22 a31 − a21 a33 + a23 a31 + a22 a33 − a23 a32 .

The minor of a21 is

M21 = a12 a33 − a13 a32 − a12 a34 + a14 a32 + a13 a34 − a14 a33 .

The minor of a22 is

M22 = a11 a33 − a13 a31 − a11 a34 + a14 a31 + a13 a34 − a14 a33 .

The minor of a23 is

M23 = a11 a32 − a12 a31 − a11 a34 + a14 a31 + a12 a34 − a14 a32 .

The minor of a24 is

M24 = a11 a32 − a12 a31 − a11 a33 + a13 a31 + a12 a33 − a13 a32 .

The minor of a31 is

M31 = a12 a23 − a13 a22 − a12 a24 + a14 a22 + a13 a24 − a14 a23 .

The minor of a32 is

M32 = a11 a23 − a13 a21 − a11 a24 + a14 a21 + a13 a24 − a14 a23 .

The minor of a33 is

M33 = a11 a22 − a12 a21 − a11 a24 + a14 a21 + a12 a24 − a14 a22 .

The minor of a34 is

M34 = a11 a22 − a12 a21 − a11 a23 + a13 a21 + a12 a23 − a13 a22 .

Hence we have the matrix

                                            [  M11  −M12   M13  −M14 ]^T
    A_R^(-1) = (1/|A|) adj(A) = (1/|A|)     [ −M21   M22  −M23   M24 ]    .
                                            [  M31  −M32   M33  −M34 ]

Clearly, the matrix A_R^(-1) is the right inverse of A because

                          [ a11  a12  a13  a14 ]   [  M11  −M12   M13  −M14 ]^T
    A A_R^(-1) = (1/|A|)  [ a21  a22  a23  a24 ] × [ −M21   M22  −M23   M24 ]    = I.  □
                          [ a31  a32  a33  a34 ]   [  M31  −M32   M33  −M34 ]

  2
1 1 2 0
Example 3.6 : The matrix A =  1 2 1 2  as a right inverse
3 4 1 2

− 12 − 54 1
 
4
 
1 3 1
 
 
 2 4 4 
A−1
R = .
1 1
− 41
 
 
 2 4 
 
− 12 1
4 − 41

It is easy to verify that AA−1


R = I.
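Example 3.6 can be checked with exact arithmetic. The sketch below rebuilds A_R^(-1) from the minors of Theorem 3.5 using Python's `fractions` module (the helper names `det2x3` and `minor` and the list-based layout are ours):

```python
from fractions import Fraction

A = [[1, 1, 2, 0], [1, 2, 1, 2], [3, 4, 1, 2]]

def det2x3(M):
    # 2 x 3 horizontal determinant by first-row expansion
    (p, q, r), (s, t, u) = M
    return p * (t - u) - q * (s - u) + r * (s - t)

def minor(i, j):
    # delete row i and column j of A, leaving a 2 x 3 matrix
    rows = [r for k, r in enumerate(A) if k != i]
    return det2x3([r[:j] + r[j + 1:] for r in rows])

detA = sum((-1) ** j * A[0][j] * minor(0, j) for j in range(4))  # -4

# adjoint = transpose of the signed cofactor matrix, here 4 x 3
adj = [[(-1) ** (i + j) * minor(i, j) for i in range(3)] for j in range(4)]
A_R_inv = [[Fraction(c, detA) for c in row] for row in adj]

# A * A_R_inv should be the 3 x 3 identity
prod = [[sum(A[i][k] * A_R_inv[k][j] for k in range(4)) for j in range(3)]
        for i in range(3)]
print(prod == [[1, 0, 0], [0, 1, 0], [0, 0, 1]])  # True
```

The computed `A_R_inv` reproduces the fractional entries listed in the example.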
The next theorem, a consequence of Theorem 3.5, shows the existence of the left inverse
for a non-singular vertical matrix.
Theorem 3.7 : Every non-singular vertical matrix A has a left inverse A_L^(-1) such
that

    A_L^(-1) = (1/|A|) adj(A).                                            (3.4)

References

[1] Howard Anton, Elementary Linear Algebra, Fourth edition, John Wiley & Sons,
    New York.
[2] Suddhendu Biswas, Topics in Algebra of Matrices, Academic Publications, Delhi,
    India, 1984.
[3] Vivian Shaw Groza, College Algebra, Saunders College, Philadelphia, 1980.
[4] M. L. Santiaga, Modern Algebra, Arul Publication, Madras, 1988.
[5] Garrett Birkhoff and Saunders Mac Lane, A Survey of Modern Algebra, Third
    edition, The Macmillan Company, New York, 1965.
[6] N. Murugasamy and M. Pichaikkannu, Mathematics Higher Secondary - Second
    Year, Tamilnadu Textbook Corporation, Madras, 1996.

M. Arunkumar,
Department of Mathematics, Government Arts College,
Tiruvannamalai - 606 603, TamilNadu, India
E-mail: annarun2002@yahoo.co.in

S. Murthy,
Department of Mathematics, Government Arts College for Men,
Krishnagiri-635 001, Tamil Nadu, India
E-mail: smurthy07@yahoo.co.in

G. Ganapathy,
Department of Mathematics, Sacred Heart College,
Tirupattur - 635 601, TamilNadu, India
E-mail: ganagandhi@yahoo.co.in
