Linear independence is a central concept in linear algebra. Two or more vectors are said to be linearly independent if none of them can be written as a linear combination of the others. Conversely, if at least one of them can be written as a linear combination of the others, then they are said to be linearly dependent.
In the remainder of this lecture we will give a formal definition of linear independence, explain its meaning, and provide some examples.
Let us start with a formal definition of linear dependence.
Definition
Let $S$ be a linear space. Some vectors $x_1, x_2, \ldots, x_n \in S$ are said to be linearly dependent if and only if there exist $n$ scalars $a_1, a_2, \ldots, a_n$ such that
$$a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = 0$$
and at least one of the $n$ scalars $a_1, a_2, \ldots, a_n$ is different from zero.
The requirement that at least one scalar be different from zero is fundamental.
First of all, without this requirement the definition would be trivial: we could always choose
$$a_1 = a_2 = \cdots = a_n = 0$$
and obtain as a result
$$a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = 0$$
for any set of $n$ vectors.
Secondly, if one of the coefficients of the linear combination is different from zero (suppose, without loss of generality, it is $a_1$), then we can write
$$x_1 = -\frac{a_2}{a_1} x_2 - \cdots - \frac{a_n}{a_1} x_n$$
that is, $x_1$ is a linear combination of the vectors $x_2, \ldots, x_n$ with coefficients $-a_2/a_1, \ldots, -a_n/a_1$.
This fact motivates the informal definition of linear dependence we have given
in the introduction above: two or more vectors are linearly dependent if at
least one of them can be written as a linear combination of the others.
The assumption $a_1 \neq 0$
is without loss of generality because we can always change the order of the
vectors and assign the first position to a vector corresponding to a non-zero
coefficient (by assumption there exists at least one such vector).
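In practice, linear dependence can also be checked numerically: stack the vectors as the columns of a matrix and compare its rank with the number of vectors, since a rank smaller than $n$ is equivalent to dependence. Below is a minimal sketch, assuming NumPy is available; the two vectors are illustrative choices, not part of the definition.

    import numpy as np

    # Stack the vectors as the columns of a matrix A; the columns are
    # linearly dependent exactly when rank(A) < number of columns.
    x1 = np.array([1.0, 2.0])
    x2 = np.array([2.0, 4.0])  # a scalar multiple of x1

    A = np.column_stack([x1, x2])
    n = A.shape[1]
    print("dependent" if np.linalg.matrix_rank(A) < n else "independent")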
Example
Let $x_1$ and $x_2$ be $2 \times 1$ column vectors defined as follows:
$$x_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \qquad x_2 = \begin{bmatrix} 2 \\ 4 \end{bmatrix}$$
The linear combination
$$2 x_1 - x_2$$
gives as a result the zero vector because
$$2 x_1 - x_2 = \begin{bmatrix} 2 \\ 4 \end{bmatrix} - \begin{bmatrix} 2 \\ 4 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
As a consequence, $x_1$ and $x_2$ are linearly dependent.
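As a quick numerical check of this example (a sketch assuming NumPy), the combination $2 x_1 - x_2$ indeed evaluates to the zero vector:

    import numpy as np

    x1 = np.array([1, 2])
    x2 = np.array([2, 4])

    # The combination from the example above.
    print(2 * x1 - x2)  # prints [0 0]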
It is now straightforward to give a definition of linear independence.
Definition
Let $S$ be a linear space. Some vectors $x_1, x_2, \ldots, x_n \in S$ are said to be linearly independent if and only if they are not linearly dependent.
It follows from this definition that, in the case of linear independence,
$$a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = 0$$
implies
$$a_1 = a_2 = \cdots = a_n = 0$$
In other words, when the vectors are linearly independent, their only linear combination that gives the zero vector as a result has all coefficients equal to zero.
Example
Let $x_1$ and $x_2$ be $2 \times 1$ column vectors defined as follows:
$$x_1 = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \qquad x_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$$
Consider a linear combination of these two vectors with coefficients $a_1$ and $a_2$:
$$a_1 x_1 + a_2 x_2$$
This is equal to
$$a_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + a_2 \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} a_1 \\ a_2 \end{bmatrix}$$
Therefore, we have that
$$a_1 x_1 + a_2 x_2 = 0$$
if and only if
$$\begin{bmatrix} a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
that is, if and only if $a_1 = a_2 = 0$.
As a consequence, the two vectors are linearly independent.
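Since these are two vectors in a two-dimensional space, independence can also be confirmed with a determinant: the homogeneous system has only the trivial solution exactly when the matrix whose columns are $x_1$ and $x_2$ has a non-zero determinant. A sketch assuming NumPy:

    import numpy as np

    # Columns are x1 = (1, 0) and x2 = (0, 1).
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0]])

    # Non-zero determinant: a1*x1 + a2*x2 = 0 only for a1 = a2 = 0.
    print(np.linalg.det(A))  # prints 1.0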
Below you can find some exercises with explained solutions.
Exercise 1
Define the following vectors:
$$x_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \qquad x_2 = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$$
Are $x_1$ and $x_2$ linearly independent?
Solution
Consider a linear combination with coefficients $a_1$ and $a_2$:
$$a_1 x_1 + a_2 x_2 = a_1 \begin{bmatrix} 1 \\ 1 \end{bmatrix} + a_2 \begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} a_1 + 2 a_2 \\ a_1 + a_2 \end{bmatrix}$$
Such a linear combination gives as a result the zero vector if and only if
$$\begin{bmatrix} a_1 + 2 a_2 \\ a_1 + a_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
that is, if and only if the two coefficients $a_1$ and $a_2$ solve the system of linear equations
$$\begin{cases} a_1 + 2 a_2 = 0 \\ a_1 + a_2 = 0 \end{cases}$$
This system can be solved as follows. From the second equation, we obtain
$$a_1 = -a_2$$
which, substituted in the first equation, gives
$$-a_2 + 2 a_2 = a_2 = 0$$
Thus, $a_2 = 0$ and $a_1 = -a_2 = 0$.
Therefore, the only linear combination of $x_1$ and $x_2$ giving the zero vector as a result has all coefficients equal to zero. This means that $x_1$ and $x_2$ are linearly independent.
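The same conclusion can be reached numerically (a sketch assuming NumPy): the coefficient matrix of the system is invertible, so the homogeneous system has only the trivial solution.

    import numpy as np

    # Coefficient matrix of  a1 + 2*a2 = 0,  a1 + a2 = 0.
    A = np.array([[1.0, 2.0],
                  [1.0, 1.0]])

    print(np.linalg.det(A))                 # -1.0, non-zero
    print(np.linalg.solve(A, np.zeros(2)))  # [0. 0.], the only solution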
Exercise 2
Let $x_1$, $x_2$ and $x_3$ be $2 \times 1$ vectors defined as follows:
$$x_1 = \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \qquad x_2 = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \qquad x_3 = \begin{bmatrix} 2 \\ 4 \end{bmatrix}$$
Why are these vectors linearly dependent?
Solution
Notice that the vector $x_3$ is a scalar multiple of $x_1$:
$$x_3 = 2 x_1$$
or
$$x_3 - 2 x_1 = 0$$
As a consequence, a linear combination of $x_1$, $x_2$ and $x_3$, with coefficients $a_1 = -2$, $a_2 = 0$ and $a_3 = 1$, gives as a result
$$-2 x_1 + 0 \cdot x_2 + x_3 = 0$$
Thus, there exists a linear combination of the three vectors such that the coefficients of the combination are not all equal to zero, but the result of the combination is equal to the zero vector. This means that the three vectors are linearly dependent.
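Numerically, a non-trivial set of coefficients can be recovered from the null space of the matrix whose columns are the three vectors; here is a sketch assuming NumPy, using the singular value decomposition:

    import numpy as np

    x1 = np.array([1.0, 2.0])
    x2 = np.array([0.0, 1.0])
    x3 = np.array([2.0, 4.0])  # = 2 * x1

    A = np.column_stack([x1, x2, x3])

    # The last right-singular vector spans the null space of A, i.e. it
    # holds non-trivial coefficients of a combination equal to zero.
    _, _, Vt = np.linalg.svd(A)
    coeffs = Vt[-1]
    print(coeffs)      # proportional to (-2, 0, 1)
    print(A @ coeffs)  # approximately [0. 0.]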
Exercise 3
Let $k$ be a real number. Define the following vectors:
$$x_1 = \begin{bmatrix} 2 \\ 1 \end{bmatrix}, \qquad x_2 = \begin{bmatrix} k + 1 \\ 1 \end{bmatrix}$$
Are $x_1$ and $x_2$ linearly independent?
Solution
Take a linear combination with coefficients $a_1$ and $a_2$:
$$a_1 x_1 + a_2 x_2 = \begin{bmatrix} 2 a_1 + (k+1) a_2 \\ a_1 + a_2 \end{bmatrix}$$
This linear combination is equal to the zero vector if and only if
$$\begin{bmatrix} 2 a_1 + (k+1) a_2 \\ a_1 + a_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
that is, if and only if the two coefficients $a_1$ and $a_2$ solve the system of linear equations
$$\begin{cases} 2 a_1 + (k+1) a_2 = 0 \\ a_1 + a_2 = 0 \end{cases}$$
A solution to this system can be found as follows. We subtract the second equation from the first and obtain
$$a_1 + k a_2 = 0$$
or
$$a_1 = -k a_2$$
By substitution into the second equation, we get
$$-k a_2 + a_2 = 0$$
or
$$(1 - k) a_2 = 0$$
Now, there are two possible cases. If $k \neq 1$ (first case), then $a_2 = 0$ and, as a consequence, $a_1 = -k a_2 = 0$.
Thus, in this case the only linear combination of $x_1$ and $x_2$ giving the zero vector as a result has all coefficients equal to zero. This means that $x_1$ and $x_2$ are linearly independent. If instead $k = 1$ (second case), then any value of $a_2$ will satisfy the equation
$$(1 - k) a_2 = 0$$
Choose a number different from zero and denote it by $c$. Then, the system of linear equations will be solved by $a_2 = c$ and $a_1 = -c$.
Thus, in this case there are infinitely many linear combinations with at least one coefficient different from zero that give the zero vector as a result (a different combination for each choice of $c$). This means that $x_1$ and $x_2$ are linearly dependent.
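The case analysis can be verified numerically by computing the rank of the matrix with columns $x_1$ and $x_2$ for a few values of $k$ (a sketch assuming NumPy; the rank drops below $2$ only at $k = 1$):

    import numpy as np

    def rank_for(k):
        # Columns are x1 = (2, 1) and x2 = (k + 1, 1).
        A = np.column_stack([[2.0, 1.0], [k + 1.0, 1.0]])
        return np.linalg.matrix_rank(A)

    for k in [0.0, 1.0, 3.0]:
        print(k, rank_for(k))  # rank 2 except at k = 1, where it is 1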
Please cite as:
Taboga, Marco (2021). "Linear independence", Lectures on matrix algebra. https://www.statlect.com/matrix-algebra/linear-independence.