


UNIVERSITY OF GUJRAT

Assignment # 01

Course code: Math-319


Course title: Linear Algebra
Instructor: Dr. Shahida Bashir
Submitted By:
Mehvish 20011509-094

Department of Mathematics
Topic: History of Linear Algebra

In order to unfold the history of linear algebra, it is important
that we first determine what linear algebra is. The definition
offered here is not a complete and comprehensive answer, but
rather a broad one that loosely wraps itself around the subject.
First, linear algebra is the study of a certain algebraic
structure called a vector space (BYU). Second, linear algebra is
the study of linear sets of equations and their transformation
properties. Finally, it is the branch of mathematics charged with
investigating the properties of finite-dimensional vector spaces
and linear mappings between such spaces. We now discuss the
history of linear algebra as it relates to linear sets of equations,
their transformations, and vector spaces.
Around 4000 years ago, the people of Babylon knew how to
solve a simple 2 x 2 system of linear equations with two
unknowns. Around 200 BC, the Chinese published the Nine
Chapters on the Mathematical Art, in which they displayed the
ability to solve a 3 x 3 system of equations (Perotti). The simple
equation ax + b = 0 is an ancient question worked on by people
from all walks of life. The power and progress in linear algebra,
however, did not come to fruition until the late 17th century. The
emergence of the subject came from determinants, values
connected to a square matrix, studied in the late 17th century by
Leibniz, one of the founders of calculus. Lagrange followed with
his work on Lagrange multipliers, a way to "characterize the
maxima and minima of multivariate functions" (Darkwing). More
than fifty years after Leibniz, Cramer presented his ideas for
solving systems of linear equations based on determinants
(Darkwing). Interestingly enough, Cramer provided no proof for
solving an n x n system. As we can see, linear algebra grew more
relevant with the emergence of calculus, even though its
foundational equation ax + b = 0 dates back centuries.
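To make Cramer's determinant-based approach concrete, here is
the rule written out for a 2 x 2 system (a standard statement,
added for illustration rather than taken from the essay's sources).
For the system

    ax + by = e
    cx + dy = f,

Cramer's rule gives x = (ed - bf) / (ad - bc) and
y = (af - ec) / (ad - bc), where ad - bc is the determinant of the
coefficient matrix and must be nonzero for a unique solution.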

Euler brought to light the idea that a system of equations does
not necessarily have to have a solution (Perotti). He recognized
the need for conditions to be placed
upon unknown variables in order to find a solution. The initial
work up until this period mainly dealt with the concept of
unique solutions and square matrices where the number of
equations matched the number of unknowns. With the turn into
the 19th century, Gauss introduced a procedure for solving a
system of linear equations. His work dealt mainly with the
linear equations themselves and had yet to bring in the idea of
matrices or their notation. His efforts handled systems with
differing numbers of equations and variables, going beyond the
traditional pre-19th-century works of Euler, Leibniz, and
Cramer. Gauss's work is now summed up in the term Gaussian
elimination. This method combines, swaps, or scales rows in
order to eliminate variables from certain equations. Once some
variables are determined, one then uses back substitution to
find the remaining unknowns.
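As a rough sketch of the procedure just described, the row
operations and back substitution of Gaussian elimination can be
written in a few lines of Python (the function name, the partial
pivoting detail, and the example system are illustrative choices,
not part of the historical record):

    def gaussian_elimination(A, b):
        """Solve the square system A x = b by row reduction."""
        n = len(A)
        # Build the augmented matrix [A | b] so row operations
        # act on both sides of each equation at once.
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            # Swap in the row with the largest entry in this
            # column (partial pivoting) to avoid dividing by zero.
            pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[pivot] = M[pivot], M[col]
            # Combine rows to eliminate this variable from all
            # rows below the pivot row.
            for r in range(col + 1, n):
                factor = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= factor * M[col][c]
        # Back substitution: solve for the unknowns from the
        # last row upward.
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            known = sum(M[r][c] * x[c] for c in range(r + 1, n))
            x[r] = (M[r][n] - known) / M[r][r]
        return x

    # Example: the kind of simple 2 x 2 system the Babylonians
    # could solve (2x + y = 5, x - y = 1).
    print(gaussian_elimination([[2.0, 1.0], [1.0, -1.0]], [5.0, 1.0]))
    # -> [2.0, 1.0]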
As mentioned before, Gauss's work initially dealt much with
solving the linear equations themselves, but did not have as
much to do with matrices. In order for matrix algebra to
develop, a proper notation or method of describing the process
was necessary. Also vital to this process was a definition of
matrix multiplication and its facets. "The introduction of
matrix notation and the invention of the word matrix were
motivated by attempts to develop the right algebraic language
for studying determinants. In 1848, J.J. Sylvester introduced the
term 'matrix,' the Latin word for womb, as a name for an array
of numbers. He used womb because he viewed a matrix as a
generator of determinants" (Tucker, 1993). The other part,
matrix multiplication or matrix algebra, came from the work of
Arthur Cayley in 1855. Cayley defined matrix multiplication as
follows: "the matrix of coefficients for the composite
transformation T2T1 is the product of the matrix for T2 times
the matrix of T1" (Tucker, 1993). His work on matrix
multiplication culminated in the Cayley-Hamilton Theorem:
simply stated, a square matrix satisfies its own characteristic
equation. Cayley's efforts were published in two papers, one in
1850 and the other in 1858. His works introduced the idea of
the identity matrix as well as the inverse of a square matrix. He
also did much to further the ongoing shift toward matrices and
symbolic algebra, using a single letter such as "A" to represent
a matrix, something that had been done very little before his
work. His efforts were little recognized outside of England
until the 1880s. At the end of the 19th century, matrices were
heavily connected with problems in physics, while
mathematicians gave more attention to vectors, which proved
to be basic mathematical elements.
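Both contributions can be stated briefly (standard formulations,
added for illustration rather than quoted from Tucker). If T1 has
matrix A and T2 has matrix B, Cayley's definition says the
composite transformation T2T1 has matrix BA. And for a 2 x 2
matrix A with entries a, b, c, d, the characteristic equation is

    λ^2 - (a + d)λ + (ad - bc) = 0,

and the Cayley-Hamilton Theorem asserts that A itself satisfies
it:

    A^2 - (a + d)A + (ad - bc)I = 0,

where I is the identity matrix Cayley introduced.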
For a time, however, interest in much of linear algebra slowed,
until the end of World War II brought on the development of
computers. Instead of having to break down an enormous
n x n matrix by hand, computers could quickly and accurately
solve these systems of linear equations. With the advancement
of technology applying the methods of Cayley, Gauss, Leibniz,
Euler, and others, determinants and linear algebra moved
forward more quickly and more effectively. Regardless of the
technology, though, Gaussian elimination still proves to be the
best known way to solve a system of linear equations (Tucker,
1993). The influence of linear algebra in the mathematical
world is spread wide because it provides an important base to
many principles and practices. Some of the things linear
algebra is used for are solving systems of linear equations,
finding least-squares best-fit lines to predict future outcomes or
find trends, and using the Fourier series expansion as a means
of solving partial differential equations. It is also applied to
broader topics, such as questions of energy in quantum
mechanics, and even to creating simple everyday household
games like Sudoku. It is because of these practical applications
that linear algebra has spread so far and advanced.
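As one concrete instance, the least-squares fit mentioned above
reduces to a linear system (a standard formulation, included
here for illustration): fitting a line y = mx + c to data points
(x_i, y_i) means solving the overdetermined system A v = b,
where each row of A is (x_i, 1), v = (m, c), and b collects the
y_i values. The best fit satisfies the normal equations

    (A^T A) v = A^T b,

which is a square system solvable by the same Gaussian
elimination described earlier.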
The key, however, is to understand that the history of linear
algebra provides the basis for these applications. Although
linear algebra is a fairly new subject when compared to other
mathematical practices, its uses are widespread. With the
efforts of the calculus-savvy Leibniz, the concept of using
systems of linear equations to solve for unknowns was
formalized. Efforts from scholars such as Cayley, Euler, and
Sylvester shifted the representation of linear systems toward
matrices. Gauss contributed his method for solving systems of
equations, which has proved to be the most effective basis for
finding unknowns. Technology continues to push the use
further and further, but the history of linear algebra continues
to provide the foundation. Even though companies update their
textbooks every few years, the fundamentals stay the same.
