Linear Algebra/Linear Dependence

In this chapter, we will briefly treat the case of linear dependence of columns. Later, in the section on vectors, we will see what linear dependence means in general. For now, however, we consider its relevance to determinants.

Definition
Consider m columns of numbers, each consisting of n numbers from a field F:

$$C_1= \begin{Vmatrix} a_{11}\\ a_{21}\\ \vdots\\ a_{n1} \end{Vmatrix},\quad
C_2= \begin{Vmatrix} a_{12}\\ a_{22}\\ \vdots\\ a_{n2} \end{Vmatrix},\quad
\dots,\quad
C_m= \begin{Vmatrix} a_{1m}\\ a_{2m}\\ \vdots\\ a_{nm} \end{Vmatrix}$$

We define the addition of two columns to be the column whose entries are the sums of the corresponding entries, and we define scalar multiplication of a column by a number in the field F to be the column with each of its entries multiplied by that number.
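The two operations just defined can be sketched in code. The following is a minimal illustration, representing a column simply as a Python list of field elements (here ordinary numbers); the function names are chosen for this example, not taken from any library.

```python
def add_columns(c1, c2):
    """Entrywise sum of two columns of equal length."""
    return [a + b for a, b in zip(c1, c2)]

def scale_column(alpha, c):
    """Multiply every entry of the column by the scalar alpha."""
    return [alpha * a for a in c]

C1 = [1, 2, 3]
C2 = [4, 5, 6]
print(add_columns(C1, C2))   # → [5, 7, 9]
print(scale_column(2, C1))   # → [2, 4, 6]
```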

We call the sum

$$ \alpha_1\begin{Vmatrix} a_{11}\\ a_{21}\\ \vdots\\ a_{n1} \end{Vmatrix}+ \alpha_2\begin{Vmatrix} a_{12}\\ a_{22}\\ \vdots\\ a_{n2} \end{Vmatrix}+ \dots+ \alpha_m\begin{Vmatrix} a_{1m}\\ a_{2m}\\ \vdots\\ a_{nm} \end{Vmatrix}$$

with any field elements $$\alpha_1,\alpha_2,\dots,\alpha_m$$ a linear combination of the columns $$C_1,C_2,\dots,C_m$$.
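A linear combination is thus built from the two operations defined above: scale each column by its coefficient and add the results entrywise. A minimal sketch (the function name is illustrative, not from any library):

```python
def linear_combination(alphas, columns):
    """Return alpha_1*C_1 + alpha_2*C_2 + ... + alpha_m*C_m as one column."""
    n = len(columns[0])
    result = [0] * n
    for alpha, col in zip(alphas, columns):
        for i in range(n):
            result[i] += alpha * col[i]
    return result

# With the standard unit columns, the combination just collects the coefficients.
C1, C2, C3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]
print(linear_combination([2, -1, 3], [C1, C2, C3]))  # → [2, -1, 3]
```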

Now suppose that m = n, so that these columns form a determinant of order n. Then we have the following result:

Theorem:

If one of the columns is a linear combination of the others, then the determinant is 0.

Proof:

Suppose the column $$C_k$$ is a linear combination of the others, say $$C_k=\sum_{i\neq k}\alpha_i C_i$$. By a theorem proven earlier, the determinant is unchanged when a scalar multiple of one column is added to another. Subtracting $$\alpha_i C_i$$ from $$C_k$$ for each $$i\neq k$$ therefore leaves the determinant unchanged and turns $$C_k$$ into a column of zeros. Since a determinant with a zero column is 0, the determinant is 0.

The converse of this theorem is also true, and we will prove this later on this page.
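The theorem can be checked numerically. The sketch below builds a 3×3 determinant whose third column is a linear combination of the first two, and evaluates it with an explicit order-3 determinant formula (the helper names are chosen for this example):

```python
def det3(M):
    """Determinant of a 3x3 matrix by the rule of Sarrus."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

# Columns C1 and C2, and C3 = 2*C1 - C2, a linear combination of the others.
C1, C2 = [1, 4, 7], [2, 5, 8]
C3 = [2*x - y for x, y in zip(C1, C2)]

# Assemble the matrix whose j-th column is Cj, then evaluate.
M = [[C1[i], C2[i], C3[i]] for i in range(3)]
print(det3(M))  # → 0, as the theorem predicts
```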