Linear combinations, bases

Notes

Let $V$ be a vector space and $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p \in V$ be a collection of vectors. A linear combination of the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p$ is a sum of the form:

$$ \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2 + \dots + \alpha_p\mathbf{v}_p = \sum^p_{k=1} \alpha_k\mathbf{v}_k $$
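
For example, in $\mathbb{R}^2$, taking $\mathbf{v}_1 = (1, 0)$, $\mathbf{v}_2 = (0, 1)$ and $\alpha_1 = 2$, $\alpha_2 = -3$ gives the linear combination

$$ 2(1, 0) - 3(0, 1) = (2, -3) $$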

Note

Vector space basis

A system of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n \in V$ is called a basis (for the vector space $V$) if any vector $\mathbf{v} \in V$ admits a unique representation as a linear combination:

$$ \mathbf{v} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2 + \dots + \alpha_n\mathbf{v}_n = \sum^n_{k=1} \alpha_k\mathbf{v}_k $$

The coefficients $\alpha_1, \alpha_2, \dots, \alpha_n$ are called coordinates of the vector $\mathbf{v}$.
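
For example, the vectors $\mathbf{e}_1 = (1, 0, 0)$, $\mathbf{e}_2 = (0, 1, 0)$, $\mathbf{e}_3 = (0, 0, 1)$ form the standard basis of $\mathbb{R}^3$: every $\mathbf{v} = (x, y, z)$ admits the unique representation

$$ (x, y, z) = x\mathbf{e}_1 + y\mathbf{e}_2 + z\mathbf{e}_3 $$

so the coordinates of $\mathbf{v}$ in this basis are exactly its entries $x, y, z$.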

Note

Generating system

A system of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p \in V$ is called a generating system (also: spanning system or complete system) in $V$ if any vector $\mathbf{v} \in V$ admits a representation as a linear combination:

$$ \mathbf{v} = \alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2 + \dots + \alpha_p\mathbf{v}_p = \sum^p_{k=1} \alpha_k\mathbf{v}_k $$

The difference from the definition of a basis is that we do not assume that the representation above is unique.
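
For example, the system $(1, 0), (0, 1), (1, 1)$ is generating in $\mathbb{R}^2$ but is not a basis, because representations are not unique:

$$ (1, 1) = 1 \cdot (1, 0) + 1 \cdot (0, 1) + 0 \cdot (1, 1) = 0 \cdot (1, 0) + 0 \cdot (0, 1) + 1 \cdot (1, 1) $$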

Note

Trivial linear combination

A linear combination $\alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2 + \dots + \alpha_p\mathbf{v}_p$ is called trivial if $\alpha_k = 0$ holds for all $k$.

A trivial linear combination is always equal to $\mathbf{0}$.

Definition. A system of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p \in V$ is called linearly independent if only the trivial linear combination of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p$ equals $\mathbf{0}$.

Definition. A system of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p \in V$ is called linearly dependent if there exist scalars $\alpha_1, \alpha_2, \dots, \alpha_p$, not all equal to zero (that is, $\sum^p_{k=1} |\alpha_k| \neq 0$), such that:

$$ \sum^p_{k=1} \alpha_k\mathbf{v}_k = \mathbf{0} $$
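
For example, the vectors $(1, 2)$ and $(2, 4)$ in $\mathbb{R}^2$ are linearly dependent, since the nontrivial combination with $\alpha_1 = 2$, $\alpha_2 = -1$ equals the zero vector:

$$ 2 \cdot (1, 2) + (-1) \cdot (2, 4) = \mathbf{0} $$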

Proposition 2.6. A system of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p \in V$ is linearly dependent if and only if one of the vectors $\mathbf{v}_k$ can be represented as a linear combination of the other vectors:

$$ \mathbf{v}_k = \sum^p_{j=1, j \not = k} \beta_j\mathbf{v}_j $$
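
Indeed, if the system is linearly dependent, pick a nontrivial relation $\sum^p_{k=1} \alpha_k\mathbf{v}_k = \mathbf{0}$ with $\alpha_k \neq 0$ for some $k$; dividing by $\alpha_k$ and moving the other terms to the right-hand side gives

$$ \mathbf{v}_k = \sum^p_{j=1, j \not = k} \left( -\frac{\alpha_j}{\alpha_k} \right) \mathbf{v}_j $$

Conversely, such a representation rearranges into $\mathbf{v}_k - \sum_{j \not = k} \beta_j\mathbf{v}_j = \mathbf{0}$, a nontrivial relation since the coefficient of $\mathbf{v}_k$ is $1$.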

Proposition 2.7. A system of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_p \in V$ is a basis if and only if it is linearly independent and complete.

Proposition 2.8. Any (finite) generating system contains a basis.

Exercises

2.1

Find a basis in the space of $3 \times 2$ matrices $M_{3 \times 2}$.


$$ \mathbf{e}_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \\ 0 & 0 \end{pmatrix},\quad \mathbf{e}_2 = \begin{pmatrix} 0 & 1 \\ 0 & 0 \\ 0 & 0 \end{pmatrix}, \\ \mathbf{e}_3 = \begin{pmatrix} 0 & 0 \\ 1 & 0 \\ 0 & 0 \end{pmatrix},\quad \mathbf{e}_4 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \\ 0 & 0 \end{pmatrix}, \\ \mathbf{e}_5 = \begin{pmatrix} 0 & 0 \\ 0 & 0 \\ 1 & 0 \end{pmatrix},\quad \mathbf{e}_6 = \begin{pmatrix} 0 & 0 \\ 0 & 0 \\ 0 & 1 \end{pmatrix} $$
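
As a check, an arbitrary matrix in $M_{3 \times 2}$ decomposes in this system with coefficients equal to its entries:

$$ \begin{pmatrix} a & b \\ c & d \\ e & f \end{pmatrix} = a\mathbf{e}_1 + b\mathbf{e}_2 + c\mathbf{e}_3 + d\mathbf{e}_4 + e\mathbf{e}_5 + f\mathbf{e}_6 $$

The system is therefore generating, and the representation is unique because the coefficients are forced to equal the matrix entries.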

2.2

True or false:

a) Any set containing a zero vector is linearly dependent;

b) A basis must contain $\mathbf{0}$;

c) Subsets of linearly dependent sets are linearly dependent;

d) Subsets of linearly independent sets are linearly independent;

e) If $\alpha_1\mathbf{v}_1 + \alpha_2\mathbf{v}_2 + \dots + \alpha_n\mathbf{v}_n = \mathbf{0}$, then all scalars $\alpha_k$ are zero.


a) True. If the set contains $\mathbf{0}$, take the coefficient $1$ on $\mathbf{0}$ and $0$ on every other vector: this is a nontrivial linear combination equal to $\mathbf{0}$, so the set is linearly dependent.

b) False. On the contrary, a basis can never contain $\mathbf{0}$: by a), any system containing the zero vector is linearly dependent, while a basis must be linearly independent.

c) False. Subsets of linearly dependent sets can be linearly independent. For example, $\{\mathbf{v}, 2\mathbf{v}\}$ with $\mathbf{v} \neq \mathbf{0}$ is linearly dependent, but its subset $\{\mathbf{v}\}$ is linearly independent.

d) True. Any subset of a linearly independent set remains linearly independent: a nontrivial relation among vectors of the subset, extended by zero coefficients, would be a nontrivial relation in the whole set.

e) False. This holds only if the system $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n$ is linearly independent. If it is linearly dependent, then by definition there is a combination with at least one nonzero coefficient ($\sum^n_{k=1} |\alpha_k| \neq 0$) that still equals $\mathbf{0}$.

2.3

Recall that a matrix is called symmetric if $A^T = A$. Write down a basis in the space of symmetric $2 \times 2$ matrices (there are many possible answers). How many elements are in the basis?


$$ \mathbf{e}_1 = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \\ \mathbf{e}_2 = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}, \\ \mathbf{e}_3 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} $$

There are 3 vectors in the basis.
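
As a check, a general symmetric $2 \times 2$ matrix decomposes uniquely as

$$ \begin{pmatrix} a & b \\ b & c \end{pmatrix} = a\mathbf{e}_1 + c\mathbf{e}_2 + b\mathbf{e}_3 $$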

2.4

Write down a basis for the space of:

a) $3 \times 3$ symmetric matrices;

b) $n \times n$ symmetric matrices;

c) $n \times n$ antisymmetric ($A^T = -A$) matrices.


a)

$$ \mathbf{e}_1 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \quad \mathbf{e}_2 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \\ \mathbf{e}_3 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad \mathbf{e}_4 = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \\ \mathbf{e}_5 = \begin{pmatrix} 0 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}, \quad \mathbf{e}_6 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 1 & 0 & 0 \end{pmatrix} $$

b)

The basis consists of $n(n+1)/2$ matrices. The first $n$ basis matrices are diagonal: the $k$-th one (for $k$ from $1$ to $n$) has a $1$ in position $(k, k)$ and zeros elsewhere. The remaining $n(n-1)/2$ basis matrices each have a $1$ in an off-diagonal position $(i, j)$ with $i < j$, a matching $1$ in the mirror position $(j, i)$, and zeros elsewhere; an explicit formula is given below.
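
Writing $E_{ij}$ for the $n \times n$ matrix with a $1$ in position $(i, j)$ and zeros elsewhere (notation introduced here for brevity), one such basis is

$$ \{ E_{kk} : 1 \le k \le n \} \cup \{ E_{ij} + E_{ji} : 1 \le i < j \le n \} $$

which indeed contains $n + n(n-1)/2 = n(n+1)/2$ matrices.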

c)

The basis consists of $n(n-1)/2$ matrices and follows the same pattern as in the symmetric case, except that the mirror entry carries a $-1$ instead of a $1$: each basis matrix has a $1$ in position $(i, j)$ with $i < j$ and a $-1$ in position $(j, i)$. There are no diagonal basis matrices, since $A^T = -A$ forces $a_{kk} = -a_{kk}$, i.e. every diagonal entry of an antisymmetric matrix is zero. An explicit formula is given below.
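
In the same $E_{ij}$ notation, one such basis is

$$ \{ E_{ij} - E_{ji} : 1 \le i < j \le n \} $$

which contains $n(n-1)/2$ matrices.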

2.5

Let a system of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r$ be linearly independent but not generating. Show that it is possible to find a vector $\mathbf{v}_{r+1}$ such that the system $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r, \mathbf{v}_{r+1}$ is linearly independent.

HINT: Take for $\mathbf{v}_{r+1}$ any vector that cannot be represented as a linear combination $\sum^r_{k=1} \alpha_k\mathbf{v}_k$ and show that the extended system is linearly independent.


If the system $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r$ is not generating, there exists a vector $\mathbf{v}_{r+1}$ that cannot be expressed as a linear combination of $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r$. Suppose now that $\sum^{r+1}_{k=1} \alpha_k\mathbf{v}_k = \mathbf{0}$. The coefficient $\alpha_{r+1}$ must be zero, since otherwise we could solve the relation for $\mathbf{v}_{r+1}$ (see below). The relation then reduces to $\sum^r_{k=1} \alpha_k\mathbf{v}_k = \mathbf{0}$, and the linear independence of the original system forces $\alpha_1 = \alpha_2 = \dots = \alpha_r = 0$. Only the trivial linear combination of the extended system equals $\mathbf{0}$, so it is linearly independent.
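
Explicitly, if $\alpha_{r+1} \neq 0$, dividing the relation by $\alpha_{r+1}$ would give

$$ \mathbf{v}_{r+1} = -\sum^r_{k=1} \frac{\alpha_k}{\alpha_{r+1}} \mathbf{v}_k $$

contradicting the choice of $\mathbf{v}_{r+1}$.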

2.6

Is it possible that vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly dependent, but the vectors $\mathbf{w}_1 = \mathbf{v}_1 + \mathbf{v}_2$, $\mathbf{w}_2 = \mathbf{v}_2 + \mathbf{v}_3$ and $\mathbf{w}_3 = \mathbf{v}_3 + \mathbf{v}_1$ are linearly independent?


No. Since $\mathbf{w}_1 - \mathbf{w}_2 + \mathbf{w}_3 = 2\mathbf{v}_1$ (and similarly for $\mathbf{v}_2$ and $\mathbf{v}_3$), each $\mathbf{v}_k$ is a linear combination of $\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3$. A nontrivial relation among $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$, which exists because they are linearly dependent, therefore translates into a nontrivial relation among $\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3$, as the computation below shows.
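
To see that the resulting relation is nontrivial, express the $\mathbf{v}_k$ through the $\mathbf{w}_k$:

$$ \mathbf{v}_1 = \tfrac{1}{2}(\mathbf{w}_1 - \mathbf{w}_2 + \mathbf{w}_3), \quad \mathbf{v}_2 = \tfrac{1}{2}(\mathbf{w}_1 + \mathbf{w}_2 - \mathbf{w}_3), \quad \mathbf{v}_3 = \tfrac{1}{2}(-\mathbf{w}_1 + \mathbf{w}_2 + \mathbf{w}_3) $$

so that a dependence $\sum^3_{k=1} \alpha_k\mathbf{v}_k = \mathbf{0}$ becomes

$$ \tfrac{1}{2}(\alpha_1 + \alpha_2 - \alpha_3)\mathbf{w}_1 + \tfrac{1}{2}(-\alpha_1 + \alpha_2 + \alpha_3)\mathbf{w}_2 + \tfrac{1}{2}(\alpha_1 - \alpha_2 + \alpha_3)\mathbf{w}_3 = \mathbf{0} $$

If all three coefficients vanished, adding them pairwise would give $\alpha_1 = \alpha_2 = \alpha_3 = 0$, contradicting the nontriviality of the relation among the $\mathbf{v}_k$. Hence the $\mathbf{w}_k$ satisfy a nontrivial relation and are linearly dependent.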