Introductory Linear Algebra/Vectors and subspaces

Introduction
Without distinguishing row and column vectors, we can define vectors as follows:

A special type of vector is the zero vector.

However, in linear algebra, we sometimes need to distinguish row and column vectors, which are defined as follows:

The two basic vector operations are addition and scalar multiplication. Using only these two operations, we can combine multiple vectors, as in the following definition.
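As a concrete sketch (the particular vectors are my own example, not from the text), the following represents vectors in $$\mathbb R^n$$ as plain Python lists and builds a linear combination out of nothing but addition and scalar multiplication:

```python
# Vector addition, scalar multiplication, and a linear combination
# built from only these two operations.

def add(u, v):
    """Entrywise sum of two vectors of the same length."""
    return [ui + vi for ui, vi in zip(u, v)]

def scale(c, v):
    """Multiply every entry of v by the scalar c."""
    return [c * vi for vi in v]

def linear_combination(coeffs, vectors):
    """Compute c1*v1 + ... + ck*vk using only add and scale."""
    result = [0] * len(vectors[0])
    for c, v in zip(coeffs, vectors):
        result = add(result, scale(c, v))
    return result

u = [1, 2, 3]
v = [0, 1, -1]
print(add(u, v))                            # [1, 3, 2]
print(scale(2, u))                          # [2, 4, 6]
print(linear_combination([2, -1], [u, v]))  # 2u - v = [2, 3, 7]
```

Note that `linear_combination` never touches the entries directly: everything is expressed through the two basic operations, which is exactly the point of the definition.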

Another concept that is closely related to linear combinations is the span.

Linear independence
Then, we will introduce an intuitive result about linear dependence, in the sense that the result matches the name 'linear dependence': if vectors are linearly dependent, then at least one of them can be written as a linear combination of the others, i.e. it depends linearly on them.

Then, we will introduce a proposition for vectors.

Then, we will discuss two results that relate linear independence to systems of linear equations (SLEs).
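The key connection is that $$v_1,\dots,v_k$$ are linearly independent if and only if the homogeneous SLE $$c_1v_1+\cdots+c_kv_k=\mathbf 0$$ has only the trivial solution, which happens exactly when the matrix formed from the vectors has rank $$k$$. A minimal sketch of this test (the helper names and sample vectors are my own, not from the text), using exact rational arithmetic to avoid floating-point issues:

```python
# Linear independence test: v1, ..., vk are independent iff the matrix
# with the vi as rows has rank k (only the trivial solution to the
# homogeneous system).
from fractions import Fraction

def rank(rows):
    """Row-reduce a matrix (list of rows) and count the pivot rows."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # index of the next pivot row
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """True iff the given vectors in R^n are linearly independent."""
    return rank(vectors) == len(vectors)

print(independent([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # True
print(independent([[1, 2], [2, 4]]))                   # False: (2,4) = 2(1,2)
```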

Subspaces
Then, we will discuss subspaces. Simply speaking, they are certain subsets of $$\mathbb R^n$$ that have some nice closure properties. To be more precise, we have the following definition.
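To make the defining properties concrete, here is an illustrative spot-check (the set $$W=\{(x,y,z):x+y+z=0\}$$ and the sample vectors are my own example, not from the text): a subspace must contain the zero vector and be closed under addition and scalar multiplication. Of course, a few checks on sample vectors do not constitute a proof; they only illustrate what the three conditions say.

```python
# Spot-check the three subspace conditions for
# W = {(x, y, z) in R^3 : x + y + z = 0}.

def in_W(v):
    """Membership test for W: the entries must sum to zero."""
    return sum(v) == 0

zero = (0, 0, 0)
u = (1, -1, 0)
v = (2, 3, -5)

assert in_W(zero)                                # contains the zero vector
assert in_W(tuple(a + b for a, b in zip(u, v)))  # closed under addition
assert in_W(tuple(4 * a for a in u))             # closed under scalar mult.
print("W passes the subspace spot-checks")
```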

We can see from this example that the entries themselves do not really matter. Indeed, we have the following general result:

In particular, we have special names for some of the spans, as follows:

Then, we will introduce some more terminology related to subspaces.

The following theorem highlights the importance of bases.

Then, we will discuss some ways to construct a basis.

Its proof is complicated.

A term that is related to bases is the dimension of a subspace.

Recall that there are infinitely many bases for a subspace. Luckily, all bases have the same number of vectors, and so the dimension of a subspace is unique, as one would expect intuitively. This is assured by the following theorem.

Then, we will discuss the bases of row, column and null spaces, and also their dimensions.

We have special names for the dimensions of row, column and null spaces, as follows:

Indeed, the row rank and the column rank of any matrix are equal.
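We can illustrate this claim numerically (the particular matrix is my own example, not from the text): row-reduce a matrix to count its pivots, then do the same for its transpose, whose rows are the original columns. The two counts agree.

```python
# Row rank vs. column rank: the rank of A (row rank) equals the rank
# of its transpose (column rank).
from fractions import Fraction

def rank(rows):
    """Row-reduce a matrix (list of rows) and count the pivot rows."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for col in range(len(m[0]) if m else 0):
        pivot = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(len(m)):
            if i != r and m[i][col] != 0:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

A = [[1, 2, 3],
     [2, 4, 6],   # = 2 * (first row), so the rows are dependent
     [1, 0, 1]]
At = [list(col) for col in zip(*A)]  # transpose of A

print(rank(A), rank(At))  # both are 2
```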

Because of this proposition, we have the following definition.

Then, we will introduce an important theorem that relates the rank and the nullity of a matrix.
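Assuming the theorem meant here is the rank–nullity theorem (the standard result at this point in such a development), it states that for an $$m \times n$$ matrix $$A$$,

$$\operatorname{rank}(A) + \operatorname{nullity}(A) = n,$$

that is, the dimension of the column space plus the dimension of the null space equals the number of columns of $$A$$.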