
Notes on Algebra I, II & III

Jonathan Richard Lombardy

jonathanrl3951@gmail.com
September 23, 2019

Contents

I Linear Algebra
1 Vector Spaces
1.1 Basics
1.2 Span & Linear Independence

Part I
Linear Algebra
1 Vector Spaces
1.1 Basics
Definition 1 (Vector Space & related definitions). A vector space V over a field F is a set with two
operations, addition and scalar multiplication, such that:
• (V, +) is an abelian group with identity 0
• 1 ∈ F acts as the multiplicative identity, i.e. 1 · v = v
• scalar multiplication is compatible with multiplication in F, i.e. a · (b · v) = (ab) · v
• scalar multiplication is distributive, i.e. a · (u + v) = a · u + a · v and (a + b) · v = a · v + b · v
A subspace of a vector space is a subset which is itself a vector space under the same operations.
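
As a concrete illustration (an added example, not part of the original notes), ℝ² with componentwise operations is a vector space over ℝ, and its x-axis is a subspace:

% Added example: R^2 over R with componentwise operations.
\[
(u_1, u_2) + (v_1, v_2) = (u_1 + v_1,\; u_2 + v_2), \qquad
a \cdot (v_1, v_2) = (a v_1,\; a v_2).
\]
% The x-axis U = \{(t, 0) : t \in \mathbb{R}\} contains (0, 0) and is closed
% under both operations, so it is a subspace of \mathbb{R}^2.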
Definition 2 (Subspace Sum & Direct Sum). The sum of subspaces U1, · · · , Un is defined as

U1 + · · · + Un = {u1 + · · · + un | ui ∈ Ui }.

V is said to be a direct sum of U1, · · · , Un if each v ∈ V can be written uniquely as u1 + · · · + un with
ui ∈ Ui, and we write V = U1 ⊕ U2 ⊕ · · · ⊕ Un.
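
For instance (an added illustration, not in the original notes), the plane is the direct sum of its two coordinate axes:

\[
\mathbb{R}^2 = U_1 \oplus U_2, \qquad
U_1 = \{(t, 0) : t \in \mathbb{R}\}, \quad
U_2 = \{(0, s) : s \in \mathbb{R}\},
\]
% since every (x, y) equals (x, 0) + (0, y), and this is the only way to
% write (x, y) as a sum of an element of U_1 and an element of U_2.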

Proposition 3. V = U1 ⊕ U2 ⊕ · · · ⊕ Un if and only if V = U1 + · · · + Un and 0 can only be written as
0 + · · · + 0.

Proof. If V is a direct sum, both conditions follow immediately from the definition. For the converse, let
v = u1 + · · · + un = w1 + · · · + wn with ui, wi ∈ Ui. Then (u1 − w1) + · · · + (un − wn) = 0 with
ui − wi ∈ Ui, so by the hypothesis on 0 each ui − wi = 0, i.e. ui = wi. □


Proposition 4. V = U ⊕ W if and only if V = U + W and U ∩ W = {0}.

Proof. Suppose V = U ⊕ W. If v ∈ U ∩ W, then 0 = v + (−v) with v ∈ U and −v ∈ W, so by uniqueness
of the representation of 0 we get v = 0. Conversely, suppose V = U + W and U ∩ W = {0}. If u + w = 0
with u ∈ U and w ∈ W, then u = −w ∈ U ∩ W, so u = w = 0; hence by Proposition 3, V = U ⊕ W. □
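
As a worked check of Proposition 4 (an added example, not in the original notes), take the x-axis and the diagonal of ℝ²:

\[
U = \{(t, 0) : t \in \mathbb{R}\}, \qquad W = \{(t, t) : t \in \mathbb{R}\}.
\]
% U + W = \mathbb{R}^2, since (x, y) = (x - y, 0) + (y, y), and
% U \cap W = \{0\}, since (t, 0) = (s, s) forces s = 0 and then t = 0.
% By Proposition 4, \mathbb{R}^2 = U \oplus W.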

1.2 Span & Linear Independence
Definition 5 (Span & Linear Independence). The set of all linear combinations of v1, · · · , vm is called the
span of {v1, · · · , vm}, denoted span(v1, · · · , vm). These vectors are said to span the vector space V if their
span is V. When the vector space is spanned by finitely many of its elements, we call V a finite dimensional
vector space. The vectors v1, · · · , vm are called linearly independent if a1 · v1 + · · · + am · vm = 0 implies
a1 = · · · = am = 0; otherwise they are called linearly dependent.
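
To make the definitions concrete (an added example, not from the original notes), take the first two standard basis vectors of ℝ³:

\[
v_1 = (1, 0, 0), \quad v_2 = (0, 1, 0), \qquad
\operatorname{span}(v_1, v_2) = \{(x, y, 0) : x, y \in \mathbb{R}\}.
\]
% v_1, v_2 are linearly independent: a_1 \cdot v_1 + a_2 \cdot v_2 = (a_1, a_2, 0) = 0
% forces a_1 = a_2 = 0. By contrast, (1, 0, 0) and (2, 0, 0) are linearly
% dependent, since 2 \cdot (1, 0, 0) - 1 \cdot (2, 0, 0) = 0.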

Proposition 6 (Removing elements from a span). Suppose x1, · · · , xm are linearly dependent and x1 ≠ 0.
Then the following are true:
• there exists j ≥ 2 such that xj ∈ span(x1, · · · , xj−1);
• removing this xj does not change the span.

Proof. Take a linear dependence a1 · x1 + · · · + am · xm = 0 with not all ai zero, and let j be the largest
index with aj ≠ 0. If j were 1, then a1 · x1 = 0 would force x1 = 0, so j ≥ 2, and solving for xj expresses
it in terms of x1, · · · , xj−1. For the second claim, take any element of the span and substitute this
expression for xj. □
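
A small worked case of Proposition 6 (added for illustration, not in the original notes): in ℝ² take x1 = (1, 0), x2 = (0, 1), x3 = (1, 1).

\[
x_1 + x_2 - x_3 = 0,
\]
% so the list is linearly dependent. The largest index with a non-zero
% coefficient is j = 3, and indeed x_3 = x_1 + x_2 \in \operatorname{span}(x_1, x_2).
% Removing x_3 leaves \operatorname{span}(x_1, x_2) = \mathbb{R}^2, so the span is unchanged.
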
Theorem 7 (Linearly independent sets are no larger than spanning sets). Let V be a finite dimensional
vector space, let L be a linearly independent set and let S be a spanning set. Then |L| ≤ |S|.

Proof. Let L = {l1, · · · , lm} and S = {s1, · · · , sn}. The list l1, s1, · · · , sn is linearly dependent, since
l1 ∈ span(S), so by Proposition 6 some si can be removed without changing the span. Repeat: at each step
adjoin the next li immediately after the previously adjoined l's and remove an element as in Proposition 6.
The removed element is always one of the remaining sj, because an li lying in the span of the earlier li
would contradict the linear independence of L. If at some step there were no sj left to remove, the adjoined
li would already span V and the next li would lie in their span, again a contradiction. Hence each li
replaces a distinct si, and |L| ≤ |S|. □
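
One round of the replacement argument (added for illustration, not in the original notes): in ℝ² take L = {(1, 1)} and S = {(1, 0), (0, 1)}.

\[
(1, 1),\ (1, 0),\ (0, 1) \quad \text{is linearly dependent, and} \quad (0, 1) = (1, 1) - (1, 0),
\]
% so Proposition 6 lets us remove (0, 1), and \{(1, 1), (1, 0)\} still spans
% \mathbb{R}^2. Each element of L has replaced a distinct element of S,
% giving |L| = 1 \le 2 = |S|.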
