
Inverse matrices play a key role in linear algebra, and particularly in computations. However, only square matrices can possibly be invertible. This leads us to introduce the Moore-Penrose inverse of a potentially non-square real- or complex-valued matrix, which satisfies some but not necessarily all of the properties of an inverse matrix.

We will see below that given a matrix $$A$$, there exists a unique matrix $$A^+$$ that satisfies all four of the Moore–Penrose conditions. They generalise the properties of the usual inverse.
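As a numerical sanity check, the four conditions can be verified with NumPy's `np.linalg.pinv` (which computes the Moore-Penrose inverse via the singular value decomposition); the matrix below is an arbitrary example.

```python
import numpy as np

# An arbitrary 3x2 example matrix to illustrate the four Moore-Penrose conditions:
#   (1) A A+ A = A        (2) A+ A A+ = A+
#   (3) (A A+)* = A A+    (4) (A+ A)* = A+ A
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
A_plus = np.linalg.pinv(A)  # numerical Moore-Penrose inverse

cond1 = np.allclose(A @ A_plus @ A, A)
cond2 = np.allclose(A_plus @ A @ A_plus, A_plus)
cond3 = np.allclose((A @ A_plus).conj().T, A @ A_plus)
cond4 = np.allclose((A_plus @ A).conj().T, A_plus @ A)
```

Note that when $$A$$ is square and invertible, all four conditions are satisfied by $$A^{-1}$$, which is why $$A^+$$ generalises the classical inverse.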

Basic properties of the Hermitian conjugate
We assemble some basic properties of the conjugate transpose for later use. In the following lemmas, $$A$$ is a matrix with complex entries and $$n$$ columns, and $$B$$ is a matrix with complex entries and $$n$$ rows.

Existence and uniqueness
We establish existence and uniqueness of the Moore-Penrose inverse for every matrix.

This leads us to the natural definition:

Basic properties
We have already seen above that the Moore-Penrose inverse generalises the classical inverse to potentially non-square matrices. We will now list some basic properties of its interaction with the Hermitian conjugate, leaving most of the proofs as exercises to the reader.

The following identities hold:
 * 1) $$A^+ = A^+ A^{+*} A^*$$
 * 2) $$A^+ = A^* A^{+*} A^+$$
 * 3) $$A = A^{+*} A^* A$$
 * 4) $$A = A A^* A^{+*}$$
 * 5) $$A^* = A^* A A^+$$
 * 6) $$A^* = A^+ A A^*$$

Proof of the first identity: $$A^+ = A^+AA^+$$ and $$AA^+ = \left(AA^+\right)^*$$ imply that $$A^+ = A^+\left(A A^+\right)^* = A^+A^{+*}A^*$$. □

The remaining identities are left as exercises.
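Before attempting the exercises, the identities can be checked numerically; the sketch below uses a random complex example matrix and `np.linalg.pinv`.

```python
import numpy as np

# Random complex 4x3 example matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))

Ap = np.linalg.pinv(A)   # A+
AH = A.conj().T          # A*   (Hermitian conjugate)
ApH = Ap.conj().T        # A+*

checks = [
    np.allclose(Ap, Ap @ ApH @ AH),  # 1) A+ = A+ A+* A*
    np.allclose(Ap, AH @ ApH @ Ap),  # 2) A+ = A* A+* A+
    np.allclose(A,  ApH @ AH @ A),   # 3) A  = A+* A* A
    np.allclose(A,  A @ AH @ ApH),   # 4) A  = A A* A+*
    np.allclose(AH, AH @ A @ Ap),    # 5) A* = A* A A+
    np.allclose(AH, Ap @ A @ AH),    # 6) A* = A+ A A*
]
```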

Reduction to the Hermitian case
The results of this section show that the computation of the pseudoinverse is reducible to its construction in the Hermitian case. It suffices to show that the putative constructions satisfy the defining criteria.

Products
We now turn to calculating the Moore-Penrose inverse for a product of two matrices, $$C = AB.$$

Another important special case which approximates closely that of invertible matrices is when $$A$$ has full column rank and $$B$$ has full row rank.
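In this special case the product formula $$(AB)^+ = B^+ A^+$$ holds. The sketch below checks it on random Gaussian example matrices, which have full rank with probability one.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))  # full column rank (generically)
B = rng.standard_normal((3, 4))  # full row rank (generically)
C = A @ B

# (AB)+ = B+ A+ when A has full column rank and B has full row rank.
lhs = np.linalg.pinv(C)
rhs = np.linalg.pinv(B) @ np.linalg.pinv(A)
ok_product = np.allclose(lhs, rhs)
```

Note that the identity $$(AB)^+ = B^+ A^+$$ fails for general matrices, which is why the rank hypotheses matter.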

We finally derive a formula for calculating the Moore-Penrose inverse of $$A A^*$$.
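The resulting formula is $$(A A^*)^+ = A^{+*} A^+$$, which can be checked numerically on a random complex example matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
Ap = np.linalg.pinv(A)

lhs = np.linalg.pinv(A @ A.conj().T)  # (A A*)+
rhs = Ap.conj().T @ Ap                # A+* A+
ok_gram = np.allclose(lhs, rhs)
```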

Projectors and subspaces
The defining feature of classical inverses is that $$A A^{-1} = A^{-1} A = I.$$ What can we say about $$A A^+$$ and $$A^+ A$$?

We can derive some properties easily from the more basic properties above:

We can conclude that $$P = AA^+$$ and $$Q = A^+ A$$ are orthogonal projections.
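A matrix represents an orthogonal projection precisely when it is idempotent and Hermitian, and both properties can be verified numerically for $$P$$ and $$Q$$ on an arbitrary example matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))
Ap = np.linalg.pinv(A)

P = A @ Ap   # projects onto the column space (image) of A
Q = Ap @ A   # projects onto the row space of A (image of A*)

# Orthogonal projections are idempotent and Hermitian.
ok_P = np.allclose(P @ P, P) and np.allclose(P, P.conj().T)
ok_Q = np.allclose(Q @ Q, Q) and np.allclose(Q, Q.conj().T)
```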

We finish our analysis by determining the image and kernel of the mappings encoded by the Moore-Penrose inverse.

Applications
We present two applications of the Moore-Penrose inverse in solving linear systems of equations.

Least-squares minimization
The Moore-Penrose inverse can be used for least-squares minimization of a system of equations that need not have an exact solution: $$z = A^+b$$ minimizes $$\lVert Ax - b \rVert$$ over all $$x$$.
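As a sketch, the overdetermined example system below has no exact solution, and the solution $$A^+b$$ agrees with NumPy's dedicated least-squares solver:

```python
import numpy as np

# Overdetermined example system with no exact solution.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([0.0, 1.0, 3.0])

x_pinv = np.linalg.pinv(A) @ b                  # least-squares solution via A+
x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]  # reference solver
ok_lsq = np.allclose(x_pinv, x_lstsq)
```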

Minimum-norm solution to a linear system
The proof above also shows that if the system $$Ax = b$$ is satisfiable, i.e. has a solution, then $$z = A^+b$$ is necessarily a solution (though not necessarily the only one). We can say more:

An immediate consequence of this result is that $$z$$ is also the unique smallest solution to the least-squares minimization problem for $$Ax = b$$, even when $$A$$ is neither injective nor surjective. It can be shown that the least-squares approximation $$Az = y \approx b$$ is unique. Hence an $$x$$ solves the least-squares minimization problem if and only if it satisfies $$Ax = y = Az = AA^+b$$. This system always has a solution (not necessarily unique), since $$Az$$ lies in the column space of $$A$$. By the result above, the smallest $$x$$ solving this system is $$A^+(AA^+b) = A^+b = z$$.
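The minimum-norm property can be illustrated on an underdetermined example system, where $$A^+b$$ is an exact solution and adding any nonzero kernel vector of $$A$$ only increases the norm:

```python
import numpy as np

# Underdetermined example system: infinitely many exact solutions.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 1.0])

z = np.linalg.pinv(A) @ b           # minimum-norm solution
solves = np.allclose(A @ z, b)      # z solves the system exactly

# Any other solution z + n, with n a nonzero kernel vector of A, is larger.
n = np.array([1.0, -1.0, 1.0])      # A @ n = 0
other = z + 0.5 * n
ok_min = np.linalg.norm(z) < np.linalg.norm(other)
```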