User:Espen180/Quantum Mechanics/Preliminary Mathematics

The reader is expected to be familiar with the contents of the Linear Algebra wikibook.

Hilbert Space
A Hilbert space is a generalized complex vector space $$V$$, which may have an (uncountably) infinite number of dimensions, and on which the following inner product is defined: For any $$u,v,w\in V$$ and $$\alpha,\beta\in\mathbb{C}$$, we define their inner product $$\langle u,v\rangle$$ such that


 * i) $$\langle u,v\rangle = \langle v,u\rangle ^*$$,


 * ii) $$\langle u,\alpha v+\beta w\rangle = \alpha \langle u,v\rangle + \beta\langle u,w\rangle$$, and


 * iii) $$\langle u,u\rangle \geq 0$$, and $$\langle u,u\rangle = 0$$ if and only if $$u=0$$.
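As a finite-dimensional sketch (not part of the formal development), the three axioms can be checked numerically on $$\mathbb{C}^3$$. NumPy's `vdot` computes $$\overline{u}\cdot v$$, which is conjugate-linear in the first argument and linear in the second, matching the convention used above.

```python
import numpy as np

# Numerical check of the inner-product axioms on C^3 using np.vdot,
# which is conjugate-linear in its first argument, linear in its second.
rng = np.random.default_rng(0)
u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)
a, b = 2.0 - 1.0j, 0.5 + 3.0j

# (i) conjugate symmetry
assert np.isclose(np.vdot(u, v), np.conj(np.vdot(v, u)))
# (ii) linearity in the second argument
assert np.isclose(np.vdot(u, a * v + b * w),
                  a * np.vdot(u, v) + b * np.vdot(u, w))
# (iii) positive-definiteness
assert np.vdot(u, u).real >= 0 and np.isclose(np.vdot(u, u).imag, 0.0)
```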

Furthermore, we require that $$V$$ is complete with respect to the norm $$||u||=\sqrt{\langle u,u\rangle}$$. Let $$\{ u_i \}_{i=1}^\infty$$ be a sequence such that for every real number $$\epsilon > 0$$, there exists an integer $$N>0$$ such that for all integers $$n,m>N$$,


 * $$||u_n-u_m||<\epsilon$$.

Then the sequence is called Cauchy, and the completeness axiom states that every Cauchy sequence of vectors $$\{u_i\}_{i=1}^\infty$$ in $$V$$ converges to a vector $$u_\infty$$ in $$V$$.

Two vectors $$u,v\in V$$ are called orthogonal if $$\langle u,v\rangle =0$$. A set of nonzero vectors orthogonal to one another is necessarily linearly independent. The proof is left to the reader as an exercise.

A linearly independent subset $$B$$ of $$V$$ is called a basis set if all vectors in $$V$$ have a unique linear expansion in terms of the basis vectors.

Example 1: Let $$L_2[0,\pi)$$ be the set of all square-integrable functions on the real line segment $$[0,\pi)$$, and let $$f(x)$$ be any such function. Then, since $$f(x)$$ has a unique Fourier expansion, the set $$B=\{\sin(cx),\cos(cx)\,|\,c\in\mathbb{N}\}$$ is a basis set for $$L_2[0,\pi)$$.
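Example 1 can be illustrated numerically. The sketch below (an informal check, not part of the text) expands $$f(x)=x$$ on $$[0,\pi)$$ in the cosine part of the basis, computing the coefficients $$a_c=\frac{2}{\pi}\int_0^\pi f(x)\cos(cx)\,\mathrm{d}x$$ by a Riemann sum; the partial sum converges to $$f$$.

```python
import numpy as np

# Expand f(x) = x on [0, pi) in the cosine basis functions cos(c x),
# computing the Fourier cosine coefficients by numerical integration.
x = np.linspace(0.0, np.pi, 2001)
dx = x[1] - x[0]
f = x

a0 = np.sum(f) * dx * 2.0 / np.pi  # constant term (c = 0)
coeffs = [np.sum(f * np.cos(c * x)) * dx * 2.0 / np.pi
          for c in range(1, 80)]

approx = a0 / 2.0 + sum(a * np.cos(c * x)
                        for c, a in enumerate(coeffs, start=1))

# With ~80 terms the partial sum tracks f closely in the interior.
interior = slice(100, -100)
assert np.max(np.abs(approx[interior] - f[interior])) < 0.05
```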

Hilbert spaces are frequently taken to be function spaces, that is, spaces whose elements are functions of some kind. The kind of Hilbert space we will be using is called a rigged Hilbert space, in which we generalize to spaces of distributions. In effect, this allows the use of the Dirac delta function $$\delta(x)$$, defined by


 * $$\delta(x)=0$$ for $$x\neq 0$$, and $$\int_{-\epsilon}^{\epsilon} \delta(x)\mathrm{d}x=1$$ for any real $$\epsilon>0$$.

Since a function is given uniquely by specifying its value at all elements of its domain, the set $$B=\{\delta(x-c)\,|\,c\in\mathbb{R}\}$$ is a basis for any function space on the real line.
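The statement above rests on the "sifting" property $$\int f(x)\delta(x-c)\,\mathrm{d}x=f(c)$$. As an informal numerical sketch, $$\delta(x-c)$$ can be modelled by Gaussians of shrinking width; the integral against $$f$$ then approaches $$f(c)$$.

```python
import numpy as np

# Model delta(x - c) by a narrow normalized Gaussian; integrating it
# against f(x) picks out f(c) in the limit of zero width ("sifting").
def delta_approx(x, c, sigma):
    return np.exp(-((x - c) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
f = np.cos(x)
c = 1.0

for sigma in (0.5, 0.1, 0.02):
    val = np.sum(f * delta_approx(x, c, sigma)) * dx

# For the narrowest Gaussian the integral is essentially f(c) = cos(1).
assert abs(val - np.cos(1.0)) < 1e-3
```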

The Dual Space
Associated with every Hilbert space $$V$$ is the corresponding dual space $$V^*$$, consisting of linear functionals on $$V$$. A linear functional on $$V$$ is a linear function $$f\,:\, V\rightarrow \mathbb{C}$$ such that $$f(\alpha u + \beta v)=\alpha f(u)+\beta f(v)$$.

Let $$B$$ be an orthogonal basis set in $$V$$. We then construct the set $$B^*$$ in $$V^*$$ by sending $$b\in B$$ to $$b^*\in B^*$$, where $$b^*$$ is the functional given by $$b^*(v)=\langle b,v\rangle$$ for all $$v\in V$$.
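In finite dimensions the map $$b\mapsto b^*$$ is easy to write down concretely. The sketch below (an illustration, not part of the text) uses the standard orthonormal basis of $$\mathbb{C}^3$$: the functional dual to $$e_i$$ simply extracts the $$i$$-th component of its argument.

```python
import numpy as np

# The B -> B* map on C^3: each basis vector b is sent to the linear
# functional b*(v) = <b, v>, represented here as a plain Python function.
def dual(b):
    return lambda v: np.vdot(b, v)

e0, e1, e2 = np.eye(3, dtype=complex)
v = np.array([2.0 + 1.0j, -1.0j, 3.0])

# Acting on v, the dual of e_i extracts the i-th component of v.
assert np.isclose(dual(e0)(v), v[0])
assert np.isclose(dual(e1)(v), v[1])
assert np.isclose(dual(e2)(v), v[2])
```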

Operators
An operator on a Hilbert space $$V$$ is a linear transformation $$A\,:\, V\rightarrow V$$.

Given two operators $$A$$ and $$B$$ on $$V$$, we can define their composition $$AB$$ by $$(AB)v=A(Bv)$$ for all $$v\in V$$.

The identity function $$\mathrm{id}\,:\,V\rightarrow V$$ defined by $$\mathrm{id}(v)=v$$ for all $$v\in V$$ is an operator on $$V$$.

Given an operator $$A$$ on $$V$$, the inverse operator, if it exists, is the operator $$A^{-1}$$ on $$V$$ such that $$AA^{-1}=A^{-1}A=\mathrm{id}$$.

Given an operator $$A$$ on $$V$$, we define its Hermitian adjoint, or simply adjoint, as the unique operator $$A^\dagger$$ such that for any $$u,v\in V$$, we have


 * $$\langle u,Av\rangle = \langle A^\dagger u,v\rangle$$.

An operator is called Hermitian if $$A^\dagger = A$$. It is called unitary if $$A^\dagger = A^{-1}$$.
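In finite dimensions with the standard inner product, the adjoint of a matrix is its conjugate transpose, so the defining relation $$\langle u,Av\rangle=\langle A^\dagger u,v\rangle$$ holds exactly, and the Hermitian and unitary conditions become simple matrix checks. A sketch (illustration only):

```python
import numpy as np

# For a matrix A on C^3, the adjoint is the conjugate transpose;
# verify <u, A v> = <A^dagger u, v>, then build Hermitian and unitary examples.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A_dag = A.conj().T

u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
v = rng.standard_normal(3) + 1j * rng.standard_normal(3)
assert np.isclose(np.vdot(u, A @ v), np.vdot(A_dag @ u, v))

H = A + A_dag                      # Hermitian: H^dagger = H
assert np.allclose(H.conj().T, H)

U = np.linalg.qr(A)[0]             # unitary: U^dagger = U^{-1}
assert np.allclose(U.conj().T @ U, np.eye(3))
```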

Given two operators $$A$$ and $$B$$ on $$V$$, define their commutator $$[A,B]=AB-BA$$.
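A standard worked example of the commutator (added here as an illustration) uses the Pauli matrices, for which $$[\sigma_x,\sigma_y]=2i\sigma_z$$; since $$\sigma_x$$ is both Hermitian and unitary, it also commutes with its own adjoint.

```python
import numpy as np

# The Pauli matrices do not commute: [sigma_x, sigma_y] = 2i sigma_z.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def commutator(A, B):
    return A @ B - B @ A

assert np.allclose(commutator(sx, sy), 2j * sz)

# sigma_x is Hermitian and unitary, hence normal: [A, A^dagger] = 0.
assert np.allclose(commutator(sx, sx.conj().T), 0)
```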

An operator $$A$$ is called normal if $$[A,A^\dagger]=0$$.

It is trivial to show that if an operator is either Hermitian or unitary, then it must necessarily be normal.

Eigenvalues and Eigenvectors
Let $$A$$ be an operator on a Hilbert space $$V$$ and consider the equation


 * $$Av=\lambda v$$.

This is called an eigenvalue equation. $$\lambda$$ is called an eigenvalue of $$A$$, and $$v$$ an eigenvector. We assume the reader to be familiar with the eigenvalue problem in the finite-dimensional case. We will now prove a very useful theorem: if $$A$$ is Hermitian, then the eigenvectors of $$A$$ constitute a basis for $$V$$.
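Before the proof, the finite-dimensional case of the theorem can be seen numerically (a sketch, not a proof): for a Hermitian matrix, `np.linalg.eigh` returns real eigenvalues and an orthonormal set of eigenvectors that spans the whole space.

```python
import numpy as np

# For a Hermitian matrix H, np.linalg.eigh returns real eigenvalues and
# an orthonormal eigenbasis (columns of evecs) -- the theorem in dim 4.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2           # make a Hermitian operator

evals, evecs = np.linalg.eigh(H)

assert np.allclose(H @ evecs, evecs * evals)           # A v = lambda v
assert np.allclose(evecs.conj().T @ evecs, np.eye(4))  # orthonormal basis
assert np.allclose(np.imag(evals), 0)                  # real eigenvalues
```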