Linear Algebra/Spectral Theorem

Given a Hermitian matrix $$A$$ (one satisfying $$A = A^H$$), $$A$$ is always diagonalizable. Moreover, all eigenvalues of $$A$$ are real, and its eigenvectors can be chosen to be mutually orthogonal. This is the content of the "Spectral Theorem": there exist a unitary matrix $$U$$ and a real diagonal matrix $$\Lambda$$ such that $$A = U \Lambda U^H$$.

The columns of $$U = \begin{pmatrix} \vec{u}_1 & \vec{u}_2 & \dots & \vec{u}_n \end{pmatrix}$$ are the orthonormal eigenvectors of $$A$$, and the diagonal entries of $$\Lambda = \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}$$ are the corresponding eigenvalues.
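The decomposition can be checked numerically. The sketch below (using NumPy; the example matrix is chosen arbitrarily for illustration) builds a small Hermitian matrix, computes $$U$$ and $$\Lambda$$ with `np.linalg.eigh`, and verifies that $$U$$ is unitary and that $$A = U \Lambda U^H$$:

```python
import numpy as np

# An arbitrary small Hermitian matrix (equal to its conjugate transpose).
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
assert np.allclose(A, A.conj().T)

# eigh is specialized for Hermitian matrices: it returns real
# eigenvalues (in ascending order) and orthonormal eigenvectors.
eigvals, U = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# U is unitary: U^H U = I, so the eigenvectors are orthonormal.
assert np.allclose(U.conj().T @ U, np.eye(2))

# The spectral theorem: A = U Λ U^H.
assert np.allclose(A, U @ Lam @ U.conj().T)
```

Note that `eigh` guarantees real eigenvalues by construction, in contrast to the general-purpose `np.linalg.eig`, which may return them with spurious imaginary parts due to floating-point error.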

In essence, $$A$$ can be decomposed into a "spectrum" of rank-1 projections: $$A = \sum_{i=1}^n \lambda_i(\vec{u}_i\vec{u}_i^H)$$, where each $$\vec{u}_i\vec{u}_i^H$$ is the orthogonal projection onto the span of $$\vec{u}_i$$.
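This rank-1 form can also be verified directly. The sketch below (NumPy, with the same kind of arbitrary example matrix) rebuilds $$A$$ as the weighted sum $$\sum_i \lambda_i \vec{u}_i\vec{u}_i^H$$:

```python
import numpy as np

# An arbitrary Hermitian example matrix.
A = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])
eigvals, U = np.linalg.eigh(A)

# Rebuild A as a weighted sum of rank-1 projections u_i u_i^H.
A_rebuilt = sum(
    lam * np.outer(u, u.conj())      # outer(u, conj(u)) = u u^H
    for lam, u in zip(eigvals, U.T)  # columns of U are the eigenvectors
)
assert np.allclose(A, A_rebuilt)
```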

The spectral theorem can in fact be proven without appealing to the characteristic polynomial of $$A$$, or to any of the theorems derived from it.