Linear Algebra/Matrix Multiplication

After representing addition and scalar multiplication of linear maps in the prior subsection, the natural next map operation to consider is composition.

To see how the representation of the composite arises out of the representations of the two maps being composed, consider an example.

The theorem is an example of a result that supports a definition. We can picture what the definition and theorem together say with this arrow diagram ("wrt" abbreviates "with respect to").



Above the arrows, the maps show that the two ways of going from $$ V $$ to $$ X $$, straight over via the composition or else by way of $$ W $$, have the same effect



$$ \vec{v}\stackrel{g\circ h}{\longmapsto}g(h(\vec{v})) \qquad \vec{v}\stackrel{h}{\longmapsto}h(\vec{v})\stackrel{g}{\longmapsto}g(h(\vec{v})) $$

(this is just the definition of composition). Below the arrows, the matrices indicate that the product does the same thing: multiplying $$GH$$ into the column vector $${\rm Rep}_{B}(\vec{v})$$ has the same effect as multiplying the column first by $$H$$ and then multiplying the result by $$G$$.



$$ {\rm Rep}_{B,D}(g\circ h) =GH= {\rm Rep}_{C,D}(g)\,{\rm Rep}_{B,C}(h) $$

The definition of the matrix-matrix product operation does not restrict us to view it as a representation of a linear map composition. We can get insight into this operation by studying it as a mechanical procedure. The striking thing is the way that rows and columns combine.

One aspect of that combination is that the sizes of the matrices involved are significant. Briefly, $$m \! \times \! r\text{ times }r \! \times \! n\text{ equals }m \! \times \! n$$.

In terms of the underlying maps, the fact that the sizes must match up reflects the fact that matrix multiplication is defined only when a corresponding function composition



$$ \text{dimension } n \text{ space} \;\stackrel{h}{\longrightarrow}\; \text{dimension } r \text{ space} \;\stackrel{g}{\longrightarrow}\; \text{dimension } m \text{ space} $$

is possible.
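The size rule can be sketched in code. This is an illustrative routine (not from the text): `mat_mul` multiplies an $$m \! \times \! r$$ matrix by an $$r \! \times \! n$$ matrix, represented as lists of rows, and rejects any pairing where the inner sizes disagree, just as the corresponding function composition would be undefined.

```python
def mat_mul(G, H):
    """Multiply matrix G (a list of rows) by matrix H, checking sizes."""
    m, r = len(G), len(G[0])
    r2, n = len(H), len(H[0])
    if r != r2:
        # an m x r matrix can only multiply an r x n matrix
        raise ValueError(f"size mismatch: {m}x{r} times {r2}x{n}")
    # entry i,j sums over the columns of G and the rows of H
    return [[sum(G[i][k] * H[k][j] for k in range(r)) for j in range(n)]
            for i in range(m)]

G = [[1, 2, 3],
     [4, 5, 6]]          # 2x3
H = [[1, 0],
     [0, 1],
     [1, 1]]             # 3x2
P = mat_mul(G, H)        # 2x3 times 3x2 gives a 2x2 result
print(P)                 # [[4, 5], [10, 11]]
```

Reversing the factors here, `mat_mul(H, G)`, is also defined (3x2 times 2x3 gives 3x3), but trying a 2x3 times 2x3 product raises the size error.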

Another aspect of the way that rows and columns combine in the matrix product operation is that in the definition of the $$i,j$$ entry



$$ p_{i,j} = g_{i,{\color{red}1}}h_{{\color{red}1},j} +g_{i,{\color{red}2}}h_{{\color{red}2},j} +\dots+g_{i,{\color{red}r}}h_{{\color{red}r},j} $$

the red subscripts on the $$g$$'s are column indicators while those on the $$h$$'s indicate rows. That is, summation takes place over the columns of $$G$$ but over the rows of $$H$$; the left matrix is treated differently than the right, so $$ GH $$ may be unequal to $$ HG $$. Matrix multiplication is not commutative.
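A quick numerical check of non-commutativity, using two illustrative $$2 \! \times \! 2$$ matrices (chosen for this sketch, not taken from the text):

```python
def mul(A, B):
    # entry i,j of the product sums over the columns of A and the rows of B
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

G = [[1, 1],
     [0, 1]]
H = [[1, 0],
     [1, 1]]

print(mul(G, H))   # [[2, 1], [1, 1]]
print(mul(H, G))   # [[1, 1], [1, 2]]  -- a different matrix
```

Here both products are defined and have the same size, yet $$GH \neq HG$$, so even square matrices fail to commute in general.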

Except for the lack of commutativity, matrix multiplication is algebraically well-behaved. Below are some nice properties; more are in Problem 10 and Problem 11.
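For instance, matrix multiplication is associative, $$(FG)H = F(GH)$$, since it represents function composition, which is associative. A quick numerical spot-check on illustrative matrices (a sanity check, not a proof):

```python
def mul(A, B):
    # entry i,j of the product sums over the columns of A and the rows of B
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

F = [[1, 2],
     [3, 4]]
G = [[0, 1],
     [1, 0]]
H = [[2, 0],
     [0, 2]]

# grouping the factors either way gives the same product
print(mul(mul(F, G), H))   # [[4, 2], [8, 6]]
print(mul(F, mul(G, H)))   # [[4, 2], [8, 6]]
```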

We have now seen how the representation of the composition of two linear maps is derived from the representations of the two maps. We have called the combination the product of the two matrices. This operation is extremely important. Before we go on to study how to represent the inverse of a linear map, we will explore this operation some more in the next subsection.

Exercises