Functional Analysis/Hilbert spaces

A vector space is called a pre-Hilbert space if to each pair $$(x, y)$$ of elements in the space there is assigned a complex (or real) number, called the inner product of $$x$$ and $$y$$ and denoted by $$\langle x, y \rangle$$, subject to the following conditions:
 * (i) For each fixed $$y$$, the functional $$f(x) = \langle x, y \rangle$$ is linear.
 * (ii) $$\langle x, y \rangle = \overline {\langle y, x \rangle}$$.
 * (iii) $$\langle x, x \rangle > 0$$ for every nonzero $$x$$.

The inner product is not linear but antilinear in its second variable: i.e., if $$g(y) = \langle x, y \rangle$$, then $$g(\alpha y) = \bar{\alpha} g(y)$$ for scalars $$\alpha$$. We define $$\| x \| = \langle x, x \rangle^{1/2}$$, and this becomes a norm. Indeed, it is clear that $$\| \alpha x \| = | \alpha | \| x \|$$, and (iii) is the reason that $$\| x \| = 0$$ implies $$x = 0$$. Finally, the triangle inequality follows from the next lemma.

3.1 Lemma (Schwarz's inequality) $$|\langle x, y \rangle| \le \|x\|\|y\|$$ where the equality holds if and only if $$x$$ and $$y$$ are linearly dependent.

If we assume the lemma for a moment, the triangle inequality follows: since $$\operatorname{Re}(\alpha) \le | \alpha |$$ for any complex number $$\alpha$$,
 * $$\| x + y \|^2 = \| x \|^2 + 2 \operatorname{Re} \langle x, y \rangle + \| y \|^2 \le \| x \|^2 + 2 | \langle x, y \rangle | + \| y \|^2 \le (\| x \| + \| y \|)^2$$.

Proof of Lemma: First suppose $$\|x\| = 1$$. If $$\alpha = \overline {\langle x, y \rangle}$$, it then follows:
 * $$0 \le \| \alpha x - y\|^2 = | \alpha |^2 - 2 \operatorname{Re} (\alpha \langle x, y \rangle) + \|y\|^2 = -| \alpha |^2 + \|y\|^2$$

where equality holds if and only if $$y = \alpha x$$. The general case follows by applying this to $$x \|x\|^{-1}$$ when $$x \ne 0$$; the case $$x = 0$$ is trivial. $$\square$$
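Schwarz's inequality is easy to sanity-check numerically in finite dimensions. The following sketch is our own illustration (the helper `inner`, the random vectors, and the NumPy setup are not part of the text); it uses the convention above, linear in the first variable and conjugate-linear in the second:

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(x, y):
    # <x, y>: linear in the first variable, conjugate-linear in the second
    return np.sum(x * np.conj(y))

def norm(v):
    return np.sqrt(inner(v, v).real)

x = rng.normal(size=4) + 1j * rng.normal(size=4)
y = rng.normal(size=4) + 1j * rng.normal(size=4)

# Schwarz's inequality
assert abs(inner(x, y)) <= norm(x) * norm(y)

# equality when x is a scalar multiple of y
lam = 2 - 3j
assert np.isclose(abs(inner(lam * y, y)), norm(lam * y) * norm(y))
```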

3.2 Theorem A normed linear space is a pre-Hilbert space if and only if $$\|x - y\|^2 = 2\|x\|^2 + 2\|y\|^2 - \|x+y\|^2$$ for every pair $$x, y$$ (the parallelogram law).

Proof: The direct part is clear. To show the converse, we define
 * $$\langle x, y \rangle = 4^{-1} (\|x+y\|^2 - \|x-y\|^2 + i\|x+iy\|^2 - i\|x-iy\|^2)$$.

It is then immediate that $$\langle x, y \rangle = \overline{\langle y, x \rangle}$$, $$\langle -x, y \rangle = -\langle x, y \rangle$$ and $$\langle ix, y \rangle = i \langle x, y \rangle$$. Moreover, by the hypothesis (the parallelogram law),
 * $$\|x_1 + x_2 + y\|^2 - \|x_1 + x_2 - y\|^2 = 2\|x_1+y\|^2 - 2\|x_1-y\|^2 - \|x_1 - x_2 + y\|^2 + \|x_1 - x_2 - y\|^2$$,

and averaging this with the identity obtained by interchanging $$x_1$$ and $$x_2$$ (note $$\|x_1 - x_2 \pm y\| = \|x_2 - x_1 \mp y\|$$) gives
 * $$\|x_1 + x_2 + y\|^2 - \|x_1 + x_2 - y\|^2 = \sum_{j=1}^2 (\|x_j+y\|^2 - \|x_j-y\|^2)$$.

Applying this also with $$iy$$ in place of $$y$$, we have: $$\langle x_1 + x_2, y \rangle = \langle x_1, y \rangle + \langle x_2, y \rangle$$. If $$\alpha$$ is a real scalar and $$\alpha_j$$ is a sequence of rational numbers converging to $$\alpha$$, then by continuity and the above, we get: $$\langle \alpha x, y \rangle = \lim_{j \to \infty} \langle \alpha_j x, y \rangle = \alpha \langle x, y \rangle$$. Together with $$\langle ix, y \rangle = i \langle x, y \rangle$$, this establishes linearity in the first variable. $$\square$$
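The polarization formula used in the proof really does recover the inner product from the norm; here is a quick NumPy check (our own finite-dimensional illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def inner(x, y):
    # <x, y>: linear in the first variable, conjugate-linear in the second
    return np.sum(x * np.conj(y))

def norm(v):
    return np.sqrt(inner(v, v).real)

def polarization(x, y):
    # 4^{-1} (||x+y||^2 - ||x-y||^2 + i||x+iy||^2 - i||x-iy||^2)
    return 0.25 * (norm(x + y)**2 - norm(x - y)**2
                   + 1j * norm(x + 1j * y)**2 - 1j * norm(x - 1j * y)**2)

x = rng.normal(size=5) + 1j * rng.normal(size=5)
y = rng.normal(size=5) + 1j * rng.normal(size=5)

assert np.isclose(polarization(x, y), inner(x, y))
```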

3.3 Lemma ''Let $$\mathfrak{H}$$ be a pre-Hilbert space. Then $$x_j \to x$$ in norm if and only if $$\|x_j\| \to \|x\|$$ and $$\langle x_j - x, y \rangle \to 0$$ as $$j \to \infty$$ for every $$y \in \mathfrak{H}$$.''

Proof: The direct part holds since:
 * $$| \|x_j\| - \|x\| | + |\langle x_j - x, y \rangle| \le \|x_j - x\|(1 + \|y\|) \to 0$$ as $$j \to \infty$$.

Conversely, we have:
 * $$\| x_j - x \|^2 = \|x_j\|^2 - 2\operatorname{Re} \langle x_j, x \rangle + \|x\|^2 \to \|x\|^2 - 2\|x\|^2 + \|x\|^2 = 0$$ as $$j \to \infty$$,

since $$\langle x_j, x \rangle = \langle x_j - x, x \rangle + \|x\|^2 \to \|x\|^2$$ by the hypothesis with $$y = x$$.

$$\square$$

3.4 Lemma ''Let $$D$$ be a non-empty convex closed subset of a Hilbert space. Then $$D$$ admits a unique element $$z$$ such that
 * $$\| z \| = \inf \{ \|x\|; x \in D \}$$.''

Proof: Denote the right-hand side by $$\delta$$. Since $$D$$ is nonempty, $$\delta < \infty$$. For each $$n = 1, 2, ...$$, there is some $$x_n \in D$$ such that $$0 \le \|x_n\| - \delta \le n^{-1}$$. That is, $$\delta = \lim_{n \to \infty} \| x_n \|$$. Since $$D$$ is convex,
 * $${ x_n + x_m \over 2 } \in D$$ and so $$\delta \le {1 \over 2} \|x_n + x_m\|$$.

It follows:
 * $$\|x_n - x_m\|^2 = 2\|x_n\|^2 + 2\|x_m\|^2 - \|x_n + x_m\|^2 \le 2\|x_n\|^2 + 2\|x_m\|^2 - 4\delta^2 \to 0$$ as $$n, m \to \infty$$.

This is to say, $$x_n$$ is Cauchy. Since $$D$$ is a closed subset of a complete metric space, and is therefore complete, there is a limit $$z \in D$$ with $$\|z\| = \delta$$. The uniqueness follows since if $$\|w\| = \delta$$ we have
 * $$\|z - w\|^2 = 2\|z\|^2 + 2\|w\|^2 - \|z+w\|^2$$

where the right-hand side is $$4\delta^2 - \|z+w\|^2 \le 0$$ by convexity, as before; hence $$z = w$$. $$\square$$
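Lemma 3.4 can be illustrated in finite dimensions: an affine subspace $$\{x : Ax = b\}$$ is a nonempty closed convex set, and its minimum-norm element is computable via the pseudoinverse. The matrices below are our own example, not from the text:

```python
import numpy as np

rng = np.random.default_rng(2)

# D = { x : A x = b } is a nonempty closed convex (affine) subset of R^5.
A = rng.normal(size=(2, 5))
b = rng.normal(size=2)

# The minimum-norm element of D is given by the pseudoinverse.
z = np.linalg.pinv(A) @ b
assert np.allclose(A @ z, b)

# Other elements of D: z plus anything in the null space of A.
P_null = np.eye(5) - np.linalg.pinv(A) @ A   # projection onto ker A
for _ in range(100):
    w = z + P_null @ rng.normal(size=5)
    assert np.allclose(A @ w, b)             # w is in D
    assert np.linalg.norm(z) <= np.linalg.norm(w) + 1e-9
```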

The lemma may hold in some Banach spaces that are not Hilbert spaces; this question will be investigated in the next chapter.

For a nonempty subset $$E \subset \mathfrak{H}$$, define $$E^\bot$$ to be the intersection of the kernels of the linear functionals $$u \mapsto \langle u, v \rangle$$ taken over all $$v \in E$$. (In other words, $$E^\bot$$ is the set of all $$x \in \mathfrak{H}$$ that are orthogonal to every $$y \in E$$.) Since the kernel of a continuous function is closed and the intersection of linear spaces is again a linear space, $$E^\bot$$ is a closed (linear) subspace of $$\mathfrak{H}$$. Finally, if $$x \in E \cap E^\bot$$, then $$0 = \langle x, x \rangle = \|x\|^2$$ and $$x = 0$$.

3.5 Lemma ''Let $$\mathcal{M}$$ be a linear subspace of a pre-Hilbert space. Then $$z \in \mathcal{M}^\bot$$ if and only if $$\| z \| = \inf \{ \| z + w \| ; w \in \mathcal{M}\}$$.''

Proof: (<=) Let $$w \in \mathcal M$$. By hypothesis, $$\lVert z \rVert \leq \lVert z + w\rVert$$. Squaring both sides and expanding in inner products gives $$2\Re \langle z, w \rangle \geq -\lVert w \rVert^2$$. The same argument applied to $$-w \in \mathcal M$$ gives $$-2\Re \langle z, w \rangle \geq -\lVert w \rVert^2$$; altogether, $$2|\Re \langle z, w \rangle| \leq \lVert w \rVert^2$$. Now let $$\lambda > 0$$ be real; applying this to $$\lambda w \in \mathcal M$$ gives $$2\lambda|\Re\langle z, w \rangle| \leq \lambda^2 \lVert w\rVert^2$$, i.e., $$2|\Re\langle z, w \rangle| \leq \lambda \lVert w\rVert^2$$. Letting $$\lambda \to 0$$, we get $$\Re \langle z, w \rangle = 0$$. Since furthermore $$i w \in \mathcal M$$, we have $$0 = \Re \langle z, iw \rangle = \Im\langle z, w \rangle$$. We conclude that $$\langle z, w \rangle = 0$$.

(=>) Let $$w \in \mathcal M$$. We have that $$\lVert z + w \rVert^2 = \lVert z \rVert^2 + 2\Re\langle z, w \rangle + \lVert w \rVert^2 = \lVert z \rVert^2 + \lVert w \rVert^2 \geq \lVert z \rVert^2$$. Taking the square roots of the first and last terms in this chain gives $$\lVert z + w \rVert \geq \lVert z \rVert$$. Finally, the infimum is attained at $$w = 0 \in \mathcal M$$, since then $$\lVert z + w \rVert = \lVert z \rVert$$. $$\square$$

3.6 Theorem (orthogonal decomposition) ''Let $$\mathfrak{H}$$ be a Hilbert space and $$\mathcal{M} \subset \mathfrak{H}$$ be a closed subspace. For every $$x \in \mathfrak{H}$$ we can write
 * $$x = y + z$$

where $$y \in \mathcal{M}$$ and $$z \in \mathcal{M}^\bot$$, and $$y$$ and $$z$$ are uniquely determined by $$x$$.''

Proof: Clearly $$x - \mathcal{M}$$ is convex, and it is also closed since a translation of a closed set is again closed. Lemma 3.4 now gives a unique element $$y \in \mathcal{M}$$ such that $$\|x - y \| = \inf \{ \|x - w\|; w \in \mathcal{M}\}$$. Let $$z = x - y$$. By Lemma 3.5, $$z \in \mathcal{M}^\bot$$. For the uniqueness, suppose we have written:
 * $$x = y' + z'$$

where $$y' \in \mathcal{M}$$ and $$z' \in \mathcal{M}^\bot$$. By Lemma 3.5, $$\|x - y' \| = \inf \{ \|x - w\|; w \in \mathcal{M}\}$$. But, as noted earlier, such $$y'$$ must be unique; i.e., $$y' = y$$. $$\square$$
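In finite dimensions the decomposition is computable: with an orthonormal basis of the subspace (from a QR factorization), $$y$$ is the orthogonal projection of $$x$$ and $$z$$ is the residual. The matrices below are our own illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# M = column space of B, a closed subspace of C^6.
B = rng.normal(size=(6, 3)) + 1j * rng.normal(size=(6, 3))
Q, _ = np.linalg.qr(B)          # orthonormal basis of M
P = Q @ Q.conj().T              # orthogonal projection onto M

x = rng.normal(size=6) + 1j * rng.normal(size=6)
y = P @ x                        # component in M
z = x - y                        # component in M^perp

assert np.allclose(x, y + z)
# z is orthogonal to every column of B, hence to all of M
assert np.allclose(B.conj().T @ z, 0)
```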

3.7 Corollary ''Let $$\mathcal{M}$$ be a subspace of a Hilbert space $$\mathfrak{H}$$. Then
 * (i) $$\mathcal{M}^\bot = \{0\}$$ if and only if $$\mathcal{M}$$ is dense in $$\mathfrak{H}$$.
 * (ii) $$\mathcal{M}^{\bot\bot} = \overline{\mathcal{M}}$$.''

Proof: By continuity, $$\langle x, \overline{\mathcal{M}} \rangle \subset \overline{\langle x, \mathcal{M} \rangle}$$. (Here, $$\langle x, E \rangle$$ denotes the image of the set $$E$$ under the map $$y \mapsto \langle x, y \rangle$$.) This gives:
 * $$\mathcal{M}^\bot = \overline{\mathcal{M}}^\bot$$ and so $$\mathfrak{H} = \overline{\mathcal{M}} \oplus \mathcal{M}^\bot$$

by the orthogonal decomposition. (i) follows. Similarly, we have:
 * $$\mathfrak{H} = \mathcal{M}^\bot \oplus \mathcal{M}^{\bot\bot} = \mathcal{M}^\bot \oplus \overline{\mathcal{M}}$$.

Hence, (ii). $$\square$$

3.8 Theorem (representation theorem) ''Every continuous linear functional $$f$$ on a Hilbert space $$\mathfrak{H}$$ has the form:
 * $$f(x) = \langle x, y \rangle$$ with a unique $$y \in \mathfrak{H}$$, and $$\|f\| = \|y\|_\mathfrak{H}$$.''

Proof: Let $$\mathcal{M} = f^{-1} ( \{0\} )$$. Since $$f$$ is continuous, $$\mathcal{M}$$ is closed. If $$\mathcal{M} = \mathfrak{H}$$, then take $$y = 0$$. If not, by Corollary 3.7(i), there is a nonzero $$z \in \mathfrak{H}$$ orthogonal to $$\mathcal{M}$$. By replacing $$z$$ with $$z \|z\|^{-1}$$ we may suppose that $$\|z\| = 1$$. For any $$x \in \mathfrak{H}$$, since $$zf(x) - f(z)x$$ is in the kernel of $$f$$ and thus is orthogonal to $$z$$, we have:
 * $$0 = \langle z f(x) - f(z) x, z \rangle = \langle z, z \rangle f(x) - \langle f(z)x, z \rangle$$

and so:
 * $$f(x) = \langle x, \overline{f(z)} z \rangle$$

The uniqueness follows since $$\langle x, y_1 \rangle = \langle x, y_2 \rangle$$ for all $$x \in \mathfrak{H}$$ means that $$y_1 - y_2 \in \mathfrak{H}^\bot = \{0\}$$. Finally, we have the identity:
 * $$\|y\| = |\langle {y \over \|y\|}, y \rangle | \le \|f\| \le \|y\|$$

where the last inequality is Schwarz's inequality (the case $$y = 0$$ being trivial). $$\square$$
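In $$\mathbb{C}^n$$ the theorem is concrete: every linear functional is given by a coefficient vector, and the representing vector is its conjugate. The following sketch is our own illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def inner(x, y):
    # <x, y>: linear in the first variable, conjugate-linear in the second
    return np.sum(x * np.conj(y))

# A continuous linear functional on C^4 is given by a vector c: f(x) = c @ x.
c = rng.normal(size=4) + 1j * rng.normal(size=4)
f = lambda x: c @ x

# f(x) = <x, y> = sum x conj(y) forces y = conj(c).
y = np.conj(c)

x = rng.normal(size=4) + 1j * rng.normal(size=4)
assert np.isclose(f(x), inner(x, y))

# ||f|| = ||y||: Schwarz gives |f(x)| <= ||x|| ||y||, with equality at x = y.
assert np.isclose(abs(f(y)), np.linalg.norm(y)**2)
```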

3.9 Exercise Using Lemma 1.6 give an alternative proof of the preceding theorem.

In view of Theorem 3.6, for each $$x \in \mathfrak{H}$$, we can write: $$x = y + z$$ where $$y \in \mathcal{M}$$, a closed subspace of $$\mathfrak{H}$$, and $$z \in \mathcal{M}^\bot$$. Denote each $$y$$, which is uniquely determined by $$x$$, by $$\pi(x)$$. The function $$\pi$$ then turns out to be a linear operator. Indeed, for given $$x_1, x_2 \in \mathfrak{H}$$, we write:
 * $$x_1 = y_1 + z_1, x_2 = y_2 + z_2$$ and $$x_1 + x_2 = y_3 + z_3$$

where $$y_j \in \mathcal{M}$$ and $$z_j \in \mathcal{M}^\bot$$ for $$j = 1, 2, 3$$. By the uniqueness of decomposition
 * $$\pi(x_1) + \pi(x_2) = y_1 + y_2 = y_3 = \pi(x_1 + x_2)$$.

Similar reasoning shows that $$\pi$$ commutes with scalars. Now, for $$x = y + z \in \mathfrak{H}$$ (where $$y \in \mathcal{M}$$ and $$z \in \mathcal{M}^\bot$$), we have:
 * $$\|x\|^2 = \|\pi(x)\|^2 + \|z\|^2 \ge \|\pi(x)\|^2$$

That is, $$\pi$$ is continuous with $$\|\pi\| \le 1$$. In particular, when $$\mathcal{M}$$ is a nonzero space, there is $$x_0 \in \mathcal{M}$$ with $$\pi(x_0) = x_0$$ and $$\|x_0\| = 1$$ and consequently $$\|\pi\| = 1$$. Such $$\pi$$ is called an orthogonal projection (onto $$\mathcal{M}$$).

The next theorem gives a Hilbert space version of the Hahn-Banach theorem.

3 Theorem ''Let $$\mathcal{M}$$ be a linear (not necessarily closed) subspace of a Hilbert space. Every continuous linear functional on $$\mathcal{M}$$ can be extended to a unique continuous linear functional on $$\mathfrak{H}$$ that has the same norm and vanishes on $$\mathcal{M}^\bot$$.''

Proof: Let $$f$$ be a continuous linear functional on $$\mathcal{M}$$. Since $$\mathcal{M}$$ is a dense subset of the Banach space $$\overline{\mathcal{M}}$$, by Theorem 2.something, we can uniquely extend $$f$$ so that it is continuous on $$\overline{\mathcal{M}}$$. Define $$g = f \circ \pi_{\overline{\mathcal{M}}}$$. By the same argument used in the proof of Theorem 2.something (Hahn-Banach) and the fact that $$\|\pi_{\overline{\mathcal{M}}}\| = 1$$, we obtain $$\|f\| = \|g\|$$. Since $$g = 0$$ on $$\mathcal{M}^\bot$$, it remains to show the uniqueness. For this, let $$h$$ be another extension with the desired properties. Since the kernel of $$f - h$$ is closed and thus contains $$\overline{\mathcal{M}}$$, $$f = h$$ on $$\overline{\mathcal{M}}$$. Hence, for any $$x \in \mathfrak{H}$$,
 * $$h(x) = (h \circ \pi_{\overline{\mathcal{M}}})x = (f \circ \pi_{\overline{\mathcal{M}}})x = g(x)$$.

The extension $$g$$ is thus unique. $$\square$$

3 Theorem ''Let $$\mathcal{M}_n$$ be an increasing sequence of closed subspaces, and $$\mathcal{M}$$ be the closure of $$\mathcal{M}_1 \cup \mathcal{M}_2 \cup ...$$. If $$\pi_\mathcal{M}$$ is the orthogonal projection onto $$\mathcal{M}$$, then for every $$x \in \mathcal{M}$$, $$\pi_{\mathcal{M}_n}(x) \to x$$ as $$n \to \infty$$.''

Proof: Let $$\mathcal{N} = \{ x \in \mathcal{M}; \pi_{\mathcal{M}_n}(x) \to x (n \to \infty) \}$$. Then $$\mathcal{N}$$ is closed. Indeed, if $$x_j \in \mathcal{N}$$ and $$x_j \to x$$, then
 * $$\|\pi_{\mathcal{M}_n}(x) - x \| \le 2\|x - x_j\| + \|\pi_{\mathcal{M}_n}(x_j) - x_j \|$$

and so $$x \in \mathcal{N}$$. Since $$\bigcup_n \mathcal{M}_n \subset \mathcal{N}$$ and $$\mathcal{N}$$ is closed, $$\mathcal{M} \subset \mathcal{N}$$, completing the proof. $$\square$$

Let $$(\mathfrak{H}_j, \langle \cdot, \cdot \rangle_j)$$, $$j = 1, 2$$, be Hilbert spaces with norms $$\|\cdot\|_j$$. The direct sum $$\mathfrak{H}_1 \oplus \mathfrak{H}_2$$ is defined as follows: let $$\mathfrak{H}_1 \oplus \mathfrak{H}_2 = \{ (x_1, x_2); x_1 \in \mathfrak{H}_1, x_2 \in \mathfrak{H}_2 \} $$ and define
 * $$\langle x_1 \oplus x_2, y_1 \oplus y_2 \rangle = \langle x_1, y_1 \rangle_1 + \langle x_2, y_2 \rangle_2$$.

It is then easy to verify that $$(\mathfrak{H}_1 \oplus \mathfrak{H}_2, \langle \cdot, \cdot \rangle)$$ is a Hilbert space. It is also clear that this definition generalizes to a finite direct sum of Hilbert spaces. (For an infinite direct sum of Hilbert spaces, see Chapter 5.)

Recall from the previous chapter that an isometric surjection between Banach spaces is called "unitary".

3 Lemma (Hilbert adjoint) ''Let $$T: \mathfrak{H}_1 \to \mathfrak{H}_2$$ be a linear operator. Define $$V: \mathfrak{H}_1 \oplus \mathfrak{H}_2 \to \mathfrak{H}_2 \oplus \mathfrak{H}_1$$ by $$V(x_1 \oplus x_2) = -x_2 \oplus x_1$$. (Clearly, $$V$$ is a unitary operator.) Then $$(V\operatorname{gra}T)^\bot$$ is a graph (of some linear operator) if and only if $$T$$ is densely defined.'' Proof: Set $$\mathcal{M} = (V\operatorname{gra}T)^\bot$$. Let $$u \in (\operatorname{dom}T)^\bot$$. Then
 * $$0 = \langle 0, -Tv \rangle_2 + \langle u, v \rangle_1 = \langle 0 \oplus u, -Tv \oplus v \rangle$$ for every $$v \in \operatorname{dom}T$$.

That is to say, $$0 \oplus u \in \mathcal{M}$$, which is a graph of a linear operator by assumption. Thus, $$u = 0$$. For the converse, suppose $$f \oplus u_1, f \oplus u_2 \in \mathcal{M}$$. Then
 * $$0 = \langle f \oplus u_j, -Tv \oplus v \rangle = \langle f, -Tv \rangle_2 + \langle u_j, v \rangle_1 \qquad $$ $$(j = 1, 2)$$

and so $$\langle u_1 - u_2, v \rangle_1 = 0$$ for every $$v$$ in the domain of $$T$$, which is dense. Thus, $$u_1 = u_2$$, and $$\mathcal{M}$$ is the graph of a function, say, $$S$$. The linearity of $$S$$ can be checked in a similar manner. $$\square$$

Remark: In the proof of the lemma, the linearity of $$T$$ was never used.

For a densely defined $$T$$, we have thus obtained a linear operator, which we call $$T^*$$. It is characterized uniquely by:
 * $$0 = \langle f, -Tu \rangle_2 + \langle T^*f, u \rangle_1 = \langle f \oplus T^*f, V(u \oplus Tu) \rangle$$ for every $$u$$,

or, more commonly,
 * $$\langle Tu, f \rangle_2 = \langle u, T^*f \rangle_1$$ for every $$u$$.
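In finite dimensions the adjoint is the conjugate transpose, and the defining relation can be checked directly; the NumPy sketch below is our own illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

def inner(x, y):
    # <x, y>: linear in the first variable, conjugate-linear in the second
    return np.sum(x * np.conj(y))

T = rng.normal(size=(3, 5)) + 1j * rng.normal(size=(3, 5))   # T : C^5 -> C^3
T_star = T.conj().T                                          # adjoint : C^3 -> C^5

u = rng.normal(size=5) + 1j * rng.normal(size=5)
f = rng.normal(size=3) + 1j * rng.normal(size=3)

# <Tu, f>_2 = <u, T*f>_1
assert np.isclose(inner(T @ u, f), inner(u, T_star @ f))
```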

Furthermore, $$T^*f$$ is defined if and only if the functional
 * $$u \mapsto \langle Tu, f \rangle_2$$

is continuous on $$\operatorname{dom}T$$. The operator $$T^*$$ is called the Hilbert adjoint (or just adjoint) of $$T$$. If $$T$$ is closed in addition to having dense domain, then
 * $$(V'\operatorname{gra}T^*)^\bot = (V' (V\operatorname{gra}T)^\bot)^\bot = (\operatorname{gra}T)^{\bot\bot} = \operatorname{gra}T$$

Here, $$V'(x_2 \oplus x_1) = -x_1 \oplus x_2$$. By the above lemma, $$T^*$$ is densely defined. More generally, if a densely defined operator $$T$$ has a closed extension $$S$$ (i.e., $$\operatorname{gra}T \subset \operatorname{gra}S = \overline{\operatorname{gra}S}$$), then $$S$$ and $$S^*$$ are both densely defined, and it follows that $$\operatorname{gra}S^* \subset \operatorname{gra}T^*$$. That is, $$T^*$$ is densely defined and $$T^{**}$$ exists. That $$\operatorname{gra}T^{**} = \overline{\operatorname{gra}T}$$ follows from the next theorem.

3 Theorem ''Let $$T: \mathfrak{H}_1 \to \mathfrak{H}_2$$ be a densely defined operator. If $$T^*$$ is also densely defined, then
 * $$\overline{\operatorname{gra}T} = \operatorname{gra}T^{**} \subset \operatorname{gra}S$$

for any closed extension $$S$$ of $$T$$; that is, $$T^{**}$$ is the smallest closed extension of $$T$$.''

Proof: As above,
 * $$(V'\operatorname{gra}T^*)^\bot = (\operatorname{gra}T)^{\bot\bot} = \overline{\operatorname{gra}T}$$

and the left-hand side is the graph of $$T^{**}$$. The inclusion holds since $$\operatorname{gra}S$$ is closed and contains $$\operatorname{gra}T$$, hence contains $$\overline{\operatorname{gra}T}$$. $$\square$$

The next corollary is obvious but is important in applications.

3 Corollary ''Let $$\mathfrak{H}_1, \mathfrak{H}_2$$ be Hilbert spaces, and $$T: \mathfrak{H}_1 \to \mathfrak{H}_2$$ a closed densely defined linear operator. Then $$u \in \operatorname{dom}T$$ if and only if there is some $$K > 0$$ such that:
 * $$|\langle T^*f, u \rangle| \le K\|f\|$$ for every $$f \in \operatorname{dom}T^*$$.''

3 Lemma ''Let $$T: \mathfrak{H}_1 \to \mathfrak{H}_2$$ be a densely defined linear operator. Then $$\operatorname{ker}T^* = (\operatorname{ran}T)^\bot.$$''

Proof: $$f$$ belongs to either side if and only if:
 * $$0 = \langle T^*f, u \rangle = \langle f, Tu \rangle$$ for every $$u$$.

(Note that $$\langle f, Tu \rangle = 0$$ for every $$u$$ implies $$f \in \operatorname{dom}T^*$$.) $$\square$$
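The identity $$\operatorname{ker}T^* = (\operatorname{ran}T)^\bot$$ can be seen numerically for a rank-deficient matrix; in the sketch below (our own example, kept real for simplicity) a basis of $$\operatorname{ker}T^*$$ is read off the SVD:

```python
import numpy as np

rng = np.random.default_rng(6)

# A rank-deficient T (rank <= 2 map from R^3 to R^4), so ker T* is nontrivial.
T = rng.normal(size=(4, 2)) @ rng.normal(size=(2, 3))

# Left singular vectors with (numerically) zero singular value span ker T*.
U, s, Vt = np.linalg.svd(T)
null_mask = np.concatenate([s, np.zeros(T.shape[0] - len(s))]) < 1e-10
K = U[:, null_mask]             # columns span ker T*
assert K.shape[1] >= 1

# Every vector of ker T* is orthogonal to the range of T.
assert np.allclose(K.T @ T, 0)
```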

In particular, a closed densely defined operator has closed kernel. As an application we shall prove the next theorem.

3 Theorem ''Let $$T: \mathfrak{H}_1 \to \mathfrak{H}_2$$ be a closed densely defined linear operator. Then $$T$$ is surjective if and only if there is a $$K > 0$$ such that
 * $$\|f\|_2 \le K\|T^*f\|_1$$ for every $$f \in \operatorname{dom}T^*$$.''

Proof: Suppose $$T$$ is surjective. Since $$T$$ has closed range, it suffices to show the estimate for $$f \in (\operatorname{ker}T^*)^\bot = \operatorname{ran}T$$. Let $$u \in (\operatorname{ker}T)^\bot$$ with $$Tu = f$$. Denoting by $$G$$ the inverse of $$T$$ restricted to $$(\operatorname{ker}T)^\bot$$, we have:
 * $$\|f\|^2_2 \le \|T^*f\|_1 \|Gf\|_1 \le \|T^*f\| \|G\| \|f\|_2$$

The last inequality holds since $$G$$ is continuous by the closed graph theorem. To show the converse, let $$g \in \mathfrak{H}_2$$ be given. The hypothesis shows that $$T^*$$ is injective, so we can define a linear functional $$L$$ on $$\operatorname{ran}T^*$$ by $$L(T^*f) = \langle f, g \rangle_2$$ for $$f \in \operatorname{dom}T^*$$. By hypothesis,
 * $$|L(T^*f)| = |\langle f, g \rangle_2| \le K\|T^*f\|$$ for every $$f \in \operatorname{dom}T^*$$.

Thus, $$L$$ is continuous on the range of $$T^*$$. It follows from the Hahn-Banach theorem that we may assume that $$L$$ is defined and continuous on $$\mathfrak{H}_1$$. Thus, by Theorem 3.8, we can write $$L(\cdot) = \langle \cdot, u \rangle_1$$ on $$\mathfrak{H}_1$$ with some $$u$$. Since $$|\langle T^*f, u \rangle_1| = |\langle f, g \rangle_2| \le \|g\|_2 \|f\|_2$$, the preceding corollary (applied to $$T^*$$) gives $$u \in \operatorname{dom}T^{**}$$, and
 * $$L(T^*f) = \langle f, g \rangle_2 = \langle T^*f, u \rangle_1 = \langle f, T^{**}u \rangle_2$$ for every $$f \in \operatorname{dom}T^*$$.

Hence, $$Tu = T^{**}u = g$$. $$\square$$
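In finite dimensions every operator is bounded and closed, and for an invertible (hence surjective) $$T$$ the estimate holds with $$K = 1/\sigma_{\min}(T)$$; the constant and matrices below are our own numerical illustration:

```python
import numpy as np

rng = np.random.default_rng(10)

# An invertible, hence surjective, T on C^4.
T = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
sigma_min = np.linalg.svd(T, compute_uv=False)[-1]
K = 1.0 / sigma_min        # = ||(T*)^{-1}||, since T* has the same singular values

for _ in range(100):
    f = rng.normal(size=4) + 1j * rng.normal(size=4)
    # ||f|| <= K ||T* f||
    assert np.linalg.norm(f) <= K * np.linalg.norm(T.conj().T @ f) + 1e-9
```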

3 Corollary ''Let $$T, \mathfrak{H}_1, \mathfrak{H}_2$$ be as given in the preceding theorem. Then $$\operatorname{ran}T$$ is closed if and only if $$\operatorname{ran}T^*$$ is closed.''

Proof: Define $$S: \mathfrak{H}_1 \to \operatorname{ran}T$$ by $$S = T$$; note $$\operatorname{ran}S^* = \operatorname{ran}T^*$$. It thus suffices to show that $$S^*$$ has closed range when $$T$$ has closed range (or, equivalently, $$S$$ is surjective). Suppose $$S^*f_j$$ is convergent. The preceding theorem gives:
 * $$\|f_j - f_k\|_2 \le K\|S^*(f_j - f_k)\|_1 \to 0$$ as $$j, k \to \infty$$.

Thus, $$f_j \oplus S^*f_j$$ is Cauchy in the graph of $$S^*$$, which is closed. Hence, $$S^*f_j$$ converges within the range of $$S^*$$. The converse holds since $$T^{**} = T$$. $$\square$$

We shall now consider some concrete examples of densely defined linear operators.

3 Theorem ''$$T:\mathfrak{H}_1 \to \mathfrak{H}_2$$ is continuous if and only if $$T^*$$ is continuous. Moreover, when $$T$$ is continuous,
 * $$\|T\|^2 = \|T^*T\| = \|TT^*\| = \|T^*\|^2$$.''

Proof: It is clear that $$T^*$$ is defined everywhere, and its continuity is a consequence of the closed graph theorem. Conversely, if $$T^*$$ is continuous, then $$T^{**}$$ is continuous and $$T = T^{**}$$. For the second part,
 * $$\| T^*f \|^2_1 = | \langle T T^*f, f \rangle_2 | \le \|T\| \|T^*f\|_1 \|f\|_2$$ for every $$f$$.

Thus, $$T^*$$ is continuous with $$\|T^*\| \le \|T\|$$. In particular, $$T^*T$$ is continuous, and so:
 * $$\| T^*f \|^2_1 \le \|T T^*\| \|f\|_2^2$$ for every $$f$$.

That is to say, $$\|T^*\|^2 \le \|T T^*\| \le \|T\| \|T^*\| \le \|T\|^2$$. Applying this result to $$T^*$$ in place of $$T$$ completes the proof. $$\square$$

The identity in the theorem shows that $$B(\mathcal{H})$$ is a $$C^*$$-algebra, which is a topic in Chapter 6.
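The $$C^*$$-identity is easy to verify for matrices, where the operator norm is the largest singular value; the check below is our own illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

T = rng.normal(size=(4, 3)) + 1j * rng.normal(size=(4, 3))

def op_norm(A):
    # operator (spectral) norm = largest singular value
    return np.linalg.norm(A, 2)

assert np.isclose(op_norm(T)**2, op_norm(T.conj().T @ T))   # ||T||^2 = ||T*T||
assert np.isclose(op_norm(T)**2, op_norm(T @ T.conj().T))   # ||T||^2 = ||TT*||
assert np.isclose(op_norm(T), op_norm(T.conj().T))          # ||T|| = ||T*||
```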

3 Lemma ''Let $$S, T \in B(\mathfrak H)$$. If $$\langle Tx, x \rangle = \langle Sx, x \rangle$$ for every $$x \in \mathfrak{H}$$, then $$S = T$$.''

Proof: Let $$R = T - S$$. We have $$0 = \langle R(x + y), x + y \rangle = \langle Rx, y \rangle + \langle Ry, x \rangle$$ and $$0 = i\langle R(x + iy), x + iy \rangle = \langle Rx, y \rangle + i^2 \langle Ry, x \rangle$$. Summing the two we get: $$0 = 2\langle Rx, y \rangle$$ for $$x, y \in \mathfrak{H}$$. Taking $$y = Rx$$ gives $$0 = \|Rx\|^2$$ for all $$x \in \mathfrak{H}$$ or $$R = 0$$. $$\square$$

Remark: the above lemma is false if the underlying field is $$\mathbf{R}$$.
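A standard counterexample over $$\mathbf{R}$$ is rotation by 90 degrees: $$\langle Rx, x \rangle = 0$$ for every real $$x$$, yet $$R \ne 0$$. A quick numerical check (our own illustration):

```python
import numpy as np

# Rotation by 90 degrees in R^2: <Rx, x> = 0 for every real x, yet R != 0.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

rng = np.random.default_rng(8)
for _ in range(100):
    x = rng.normal(size=2)
    assert abs((R @ x) @ x) < 1e-12   # <Rx, x> = -x2*x1 + x1*x2 = 0

assert np.linalg.norm(R) != 0
```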

Recall that an isometric surjection is called unitary.

3 Corollary ''A linear operator $$U: \mathfrak{H}_1 \to \mathfrak{H}_2$$ is unitary if and only if $$U^*U$$ and $$UU^*$$ are identities.''

Proof: Suppose $$U$$ is unitary. Since $$\langle U^*Ux, x \rangle_1 = \|Ux\|^2_2 = \langle x, x \rangle_1$$, we see by the lemma above that $$U^*U$$ is the identity. Since $$UU^*U = U$$, $$UU^*$$ is the identity on the range of $$U$$, which is $$\mathfrak{H}_2$$ by surjectivity. Conversely, since $$\|Ux\|^2_2 = \langle U^*Ux, x \rangle_1 = \|x\|^2_1$$, $$U$$ is an isometry, and it is surjective since $$UU^*$$ is the identity. $$\square$$
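For matrices, a random unitary can be produced from a QR factorization, and both identities checked directly (our own illustration):

```python
import numpy as np

rng = np.random.default_rng(11)

# A random unitary U from the QR factorization of a complex matrix.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)

I = np.eye(4)
assert np.allclose(U.conj().T @ U, I)   # U*U = 1
assert np.allclose(U @ U.conj().T, I)   # UU* = 1

x = rng.normal(size=4) + 1j * rng.normal(size=4)
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))   # isometry
```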

Curiously, the hypothesis on linearity can be omitted:

3 Theorem ''If $$U: \mathfrak{H}_1 \to \mathfrak{H}_2$$ is a function such that
 * $$\| U(x) - U(y) \|_2 = \|x - y\|_1$$

for every $$x$$ and $$y$$ and $$U(0) = 0$$, then $$U$$ is a real-linear operator (and so unitary when it is surjective and the scalars are real).''

Proof: Note that U is continuous. Since $$\|U(x)\| = \|U(x) - U(0)\| = \|x\|$$, we have:
 * $$\|x - y\|_1^2 = \|U(x) - U(y)\|_2^2 = \|x\|_1^2 - 2\operatorname{Re} \langle U(x), U(y) \rangle_2 + \|y\|_1^2$$.

Thus,
 * $$\operatorname{Re}\langle x, y \rangle_1 = \operatorname{Re}\langle U(x), U(y) \rangle_2$$.

It now follows:
 * $$\|U(\alpha x + y) - \alpha U(x) - U(y)\|_2^2 = \|(\alpha x + y) - \alpha x - y\|_1^2 = 0$$

for any $$x, y \in \mathfrak{H}_1$$ and real scalar $$\alpha$$: upon expanding the left-hand side, every term $$\operatorname{Re} \langle U(a), U(b) \rangle_2$$ may be replaced by $$\operatorname{Re} \langle a, b \rangle_1$$ and every $$\|U(a)\|_2$$ by $$\|a\|_1$$. $$\square$$

There is an analog of this result for Banach spaces (the Mazur-Ulam theorem). See, for example, http://www.helsinki.fi/~jvaisala/mazurulam.pdf

3 Exercise ''Construct an example so as to show that an isometric operator (i.e., a linear operator that preserves norm) need not be unitary. (Hint: a shift operator.)''

A densely defined linear operator $$T$$ is called "symmetric" if $$\operatorname{gra}T \subset \operatorname{gra}T^*$$. If equality holds, then $$T$$ is called "self-adjoint". In light of Theorem 3.something, every self-adjoint operator is closed and densely defined. If $$T$$ is symmetric, then since $$T^{**}$$ is an extension of $$T$$,
 * $$\operatorname{gra}T \subset \operatorname{gra}T^* \cap \operatorname{gra}T^{**}$$.

3 Theorem ''Let $$T_j: \mathfrak{H}_j \to \mathfrak{H}_{j+1}$$ be densely defined linear operators for $$j = 1, 2$$. Then $$\operatorname{gra}T_1^* \circ T_2^* \subset \operatorname{gra}(T_2 \circ T_1)^*$$ where the equality holds if $$T_j^{**} = T_j$$ $$(j = 1, 2)$$ and $$T_1^* \circ T_2^*$$ is closed and densely defined.''

Proof: Let $$u \in \operatorname{dom}(T_1^* \circ T_2^*)$$. Then
 * $$\langle T_2 \circ T_1 v, u \rangle = \langle T_1 v, T_2^* u \rangle = \langle v, T_1^* \circ T_2^* u \rangle$$ for every $$v \in \operatorname{dom}(T_2 \circ T_1)$$.

But, by definition, $$(T_2 \circ T_1)^*u$$ denotes $$T_1^* \circ T_2^* u $$. Hence, $$(T_2 \circ T_1)^*$$ is an extension of $$T_1^* \circ T_2^*$$. For the second part, the fact we have just proved gives:
 * $$\operatorname{gra}T_1^* \circ T_2^* \subset \operatorname{gra}(T_2 \circ T_1)^* = \operatorname{gra}(T_2^{**} \circ T_1^{**})^* \subset \operatorname{gra}(T_1^* \circ T_2^*)^{**}$$. $$\square$$

3 Theorem ''Let $$\mathfrak{H}_1, \mathfrak{H}_2$$ be Hilbert spaces. If $$T:\mathfrak{H}_1 \to \mathfrak{H}_2$$ is a closed densely defined operator, then $$T^*T$$ is a self-adjoint operator (in particular, densely defined and closed).''

Proof: In light of the preceding theorem, it suffices to show that $$T^*T$$ is closed. Let $$u_j \in \operatorname{dom}T^*T$$ be a sequence such that $$(u_j, T^*T u_j)$$ converges to a limit $$(u, v)$$. Since
 * $$\|T u_j - T u_k \|_2 \le 2 (\|T^*T (u_j - u_k) \|_1 + \|u_j - u_k\|_1)$$,

there is some $$f \in \mathfrak{H}_2$$ such that: $$\|Tu_j - f\|_2 \to 0$$. It follows from the closedness of $$T^*$$ that $$T^*f = v$$. Since $$\|u_j - u\|_1 + \|Tu_j - f\|_2 \to 0$$ and $$T$$ is closed, $$T^*Tu = T^*f = v$$. $$\square$$

3 Theorem ''Let $$T$$ be a symmetric densely defined operator. If $$T$$ is surjective, then $$T$$ is self-adjoint and injective and $$T^{-1}$$ is self-adjoint and bounded.''

Proof: If $$Tu = 0$$, then for every $$v \in \operatorname{dom}T$$,
 * $$0 = \langle Tu, v \rangle = \langle u, Tv \rangle$$,

and hence $$u = 0$$, since $$T$$ has a dense range (it is surjective). Thus, $$T$$ is injective. Since $$T^{-1}$$ is closed (by Lemma 2.something) and $$\operatorname{ran}T = \mathfrak{H}$$, $$T^{-1}: \mathfrak{H} \to \operatorname{dom}T$$ is a continuous linear operator. Finally, we have:
 * $$\operatorname{gra}T^{-1} = V \operatorname{gra}T \subset V \operatorname{gra}T^* = \operatorname{gra}(T^*)^{-1} = \operatorname{gra}(T^{-1})^*$$.

Here, $$V(x_1 \oplus x_2) = x_2 \oplus x_1$$. Thus, $$T^{-1}$$ is symmetric, and being defined everywhere on $$\mathfrak{H}$$, it is self-adjoint (the domains of $$T^{-1}$$ and $$(T^{-1})^*$$ coincide). Since the same graph identity shows that the inverse of an injective self-adjoint operator is self-adjoint, $$T = (T^{-1})^{-1}$$ is self-adjoint. $$\square$$

3 Theorem ''Let $$\mathcal{M}$$ be a closed linear subspace of a Hilbert space $$\mathfrak{H}$$. Then $$\pi$$ is an orthogonal projection onto $$\mathcal{M}$$ if and only if $$\pi = \pi^* = \pi^2$$ and the range of $$\pi$$ is $$\mathcal{M}$$.''

Proof: The direct part is clear except for $$\pi = \pi^*$$. But we have:
 * $$\langle \pi(x), x \rangle = \|\pi(x)\|^2$$

since $$\pi(x)$$ and $$x - \pi(x)$$ are orthogonal. Thus, $$\langle \pi(x), x \rangle$$ is real for every $$x$$; hence $$\langle \pi(x), x \rangle = \overline{\langle \pi(x), x \rangle} = \langle \pi^*(x), x \rangle$$, and $$\pi = \pi^*$$ by the lemma above. For the converse, we only have to verify $$x - \pi(x) \in \mathcal{M}^\bot$$ for every $$x$$. But we have: $$\pi (x - \pi(x)) = 0$$ and $$\operatorname{ker} (\pi) = \operatorname{ker} (\pi^*) = (\operatorname{ran} (\pi))^\bot = \mathcal{M}^\bot$$. $$\square$$
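The characterization $$\pi = \pi^* = \pi^2$$ is immediate to verify for a concrete projection matrix (our own example, projecting onto the column space of a random matrix):

```python
import numpy as np

rng = np.random.default_rng(9)

# Orthogonal projection onto M = column space of B.
B = rng.normal(size=(5, 2)) + 1j * rng.normal(size=(5, 2))
Q, _ = np.linalg.qr(B)
P = Q @ Q.conj().T

assert np.allclose(P, P.conj().T)   # P = P*
assert np.allclose(P, P @ P)        # P = P^2
assert np.allclose(P @ B, B)        # P fixes M, so ran P = M
```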

We shall now turn our attention to the spectral decomposition of a compact self-adjoint operator. Let $$T:\mathfrak{H} \to \mathfrak{H}$$ be a compact operator.