A set of Vectors is linearly dependent if at least one of them can be expressed as a Linear Combination of the others.
Definition
Let $V$ be a Vector Space over a fixed Field $F$. Let the scalars be $a_1, a_2, \dots, a_n \in F$ and the set of vectors be $S = \{v_1, v_2, \dots, v_n\} \subseteq V$.
Linear Dependence
$S$ is linearly dependent if and only if:

$$a_1 v_1 + a_2 v_2 + \dots + a_n v_n = \mathbf{0} \quad \text{for some } a_1, \dots, a_n \text{ not all zero}$$
In other words, if the zero vector can be represented as a Linear Combination of a set of vectors, where at least one scalar is non-zero, then the vectors are linearly dependent.
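As a quick illustration, here is a hypothetical example in $\mathbb{R}^3$ (the vectors are chosen so that $v_3 = v_1 + v_2$; they are not from the text above), where a non-trivial combination reaches the zero vector:

```python
# Hypothetical vectors with v3 = v1 + v2, so {v1, v2, v3} is dependent.
v1, v2, v3 = (1, 0, 1), (0, 1, 1), (1, 1, 2)

# The scalars a1 = 1, a2 = 1, a3 = -1 are not all zero...
a1, a2, a3 = 1, 1, -1
combo = tuple(a1 * x + a2 * y + a3 * z for x, y, z in zip(v1, v2, v3))
# ...yet a1*v1 + a2*v2 + a3*v3 evaluates to the zero vector (0, 0, 0).
```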
Containing the zero vector
A set of vectors containing $\mathbf{0}$ is always linearly dependent, since $1 \cdot \mathbf{0} + 0 \cdot v_2 + \dots + 0 \cdot v_n = \mathbf{0}$ is a non-trivial combination.
Linear Independence
$S$ is linearly independent if it is not linearly dependent. Alternatively, $S$ is linearly independent if and only if:

$$a_1 v_1 + a_2 v_2 + \dots + a_n v_n = \mathbf{0} \implies a_1 = a_2 = \dots = a_n = 0$$
In other words, the only way for a set of linearly independent vectors to sum up to the zero vector is if all of them are multiplied by 0.
Checking for Linear Independence
Let $V = F^m$ be a vector space fixed over field $F$. Let $S = \{v_1, v_2, \dots, v_n\}$ be a set of vectors, each $v_j \in F^m$.
We can create a matrix $A$ of size $m \times n$ containing column matrices $v_1, v_2, \dots, v_n$:

$$A = \begin{bmatrix} v_1 & v_2 & \dots & v_n \end{bmatrix}$$
If we augment a column matrix of size $m \times 1$ representing $\mathbf{0}$, we get:

$$\left[\, A \mid \mathbf{0} \,\right]$$
This is the Matrix Representation Of A Linear System for $A\mathbf{x} = \mathbf{0}$ and can be solved via Gauss-Jordan Elimination to obtain its RREF.
$S$ is a set of linearly independent vectors if and only if the system $A\mathbf{x} = \mathbf{0}$ has only the unique trivial solution:

$$a_1 = a_2 = \dots = a_n = 0$$

Equivalently, every column of the RREF of $A$ contains a pivot, i.e. $\text{rank}(A) = n$.
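A minimal sketch of this check, using exact Gauss-Jordan elimination over the rationals (the helpers `matrix_rank` and `is_independent` are my own names, not from the text):

```python
from fractions import Fraction

def matrix_rank(rows):
    """Rank of a matrix (list of rows) via exact Gauss-Jordan elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    for col in range(len(m[0]) if m else 0):
        # Find a row at or below `pivots` with a non-zero entry in this column.
        pivot_row = next((r for r in range(pivots, len(m)) if m[r][col] != 0), None)
        if pivot_row is None:
            continue  # no pivot in this column
        m[pivots], m[pivot_row] = m[pivot_row], m[pivots]
        # Jordan step: clear this column in every other row.
        for r in range(len(m)):
            if r != pivots and m[r][col] != 0:
                f = m[r][col] / m[pivots][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[pivots])]
        pivots += 1
    return pivots

def is_independent(vectors):
    """Vectors (as same-length tuples) are independent iff rank(A) == n."""
    rows = list(zip(*vectors))  # place the vectors as the columns of A
    return matrix_rank(rows) == len(vectors)
```

For example, `is_independent([(1, 0, 0), (0, 1, 0)])` is `True`, while `is_independent([(1, 0, 1), (0, 1, 1), (1, 1, 2)])` is `False` because the third vector is the sum of the first two.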
If $m = n$ (resulting in $A$ being a square matrix) then:

$$S \text{ is linearly independent} \iff \det(A) \neq 0$$
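For the square case, the determinant test can be sketched as follows (exact elimination; the `det` helper is illustrative, not from the text):

```python
from fractions import Fraction

def det(mat):
    """Determinant of a square matrix via exact Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in mat]
    n = len(m)
    result = Fraction(1)
    for col in range(n):
        # Find a pivot at or below the diagonal in this column.
        pivot = next((r for r in range(col, n) if m[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)  # no pivot: singular, columns are dependent
        if pivot != col:
            m[col], m[pivot] = m[pivot], m[col]
            result = -result    # a row swap flips the sign
        result *= m[col][col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[col])]
    return result

# Columns v1 = (1,0,1), v2 = (0,1,1), v3 = (1,1,2) satisfy v3 = v1 + v2,
# so det(A) is 0 and the columns are linearly dependent.
A = [[1, 0, 1], [0, 1, 1], [1, 1, 2]]
```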
Theorem: Linear Dependence
A set of $n$ vectors in the vector space $F^m$ is linearly dependent if $n > m$.
Since $A$ is an $m \times n$ matrix, the Matrix Rank of $A$ is at most the smaller dimension, i.e. $\text{rank}(A) \leq \min(m, n)$.
If $n > m$ then $\text{rank}(A) \leq m < n$, so the system $A\mathbf{x} = \mathbf{0}$ has at least one free variable and hence a non-trivial solution.
Therefore the vectors must be linearly dependent.
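A minimal numeric check of the theorem, with three hypothetical vectors in $\mathbb{R}^2$ (so $n = 3 > m = 2$):

```python
# Any three vectors in R^2 are dependent; here v3 = 2*v1 + 3*v2.
v1, v2, v3 = (1, 0), (0, 1), (2, 3)
combo = tuple(2 * x + 3 * y - 1 * z for x, y, z in zip(v1, v2, v3))
# combo is the zero vector (0, 0): a non-trivial combination exists.
```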