The Spectral Theorem

3 Orthogonal projections

  • Destination: Mathlib.Analysis.InnerProductSpace.Projection

  • Principal reference: Chapter 12 of Walter Rudin, Functional Analysis [Rud87].

Let \(H\) be a complex Hilbert space and \(K\) a closed subspace of \(H\). We denote by \(K^\perp \) the orthogonal complement of \(K\) in \(H\). Any vector \(x \in H\) can be written uniquely as \(x = x_K + x_{K^\perp }\), where \(x_K \in K\) and \(x_{K^\perp } \in K^\perp \). The map \(p(K) : x \mapsto x_K\) is called the orthogonal projection onto \(K\).
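
For the formalization, the destination file's `orthogonalProjection` is the intended counterpart of \(p(K)\). The fragment below is only a statement-level sketch of the decomposition above: the `sorry` marks a deferred proof, `Kᗮ` is Mathlib's notation for `Submodule.orthogonal K`, and the precise typeclass assumptions in Mathlib may differ from those shown.

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

-- Sketch only: every vector of a complex Hilbert space splits uniquely into a
-- component in a closed subspace `K` and a component in `Kᗮ`.
variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H] [CompleteSpace H]

example (K : Submodule ℂ H) (hK : IsClosed (K : Set H)) (x : H) :
    ∃! q : H × H, q.1 ∈ K ∧ q.2 ∈ Kᗮ ∧ x = q.1 + q.2 := by
  sorry  -- deferred; `p(K) x` is then the first component `q.1`
```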

Lemma 12

It holds that \(p(K) = p(K)^2 = p(K)^*\).

Proof

The first equality follows from the uniqueness of the orthogonal decomposition: since \(x_K = x_K + 0\) with \(x_K \in K\) and \(0 \in K^\perp \), we get \(p(K)(x_K) = x_K\), that is, \(p(K)^2 x = p(K) x\).

The second equality follows because \(\langle y, p(K)x \rangle = \langle y, x_K\rangle = \langle y_K, x_K\rangle = \langle y_K, x \rangle = \langle p(K)y, x\rangle \), where the middle steps use that \(x_{K^\perp }\) and \(y_{K^\perp }\) are orthogonal to \(K\).
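
In operator language, Lemma 12 says that \(p(K)\), viewed as a bounded operator on \(H\), is idempotent and self-adjoint. The Lean sketch below records this for an arbitrary bounded operator \(P\) hypothesized to act as \(p(K)\) does; the hypotheses and their names are illustrative rather than Mathlib's actual API, and the symmetry half is deferred.

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H] [CompleteSpace H]

local notation "⟪" x ", " y "⟫" => @inner ℂ _ _ x y

-- Sketch of Lemma 12 for an operator `P` that acts the way `p(K)` does:
-- identity on `K`, zero on `Kᗮ`, range inside `K`.  Such a `P` is idempotent
-- and symmetric (hence self-adjoint, being bounded).
example (K : Submodule ℂ H) (P : H →L[ℂ] H)
    (hmem : ∀ x, P x ∈ K)            -- the range of `P` lies in `K`
    (hid : ∀ x ∈ K, P x = x)         -- `P` restricts to the identity on `K`
    (hperp : ∀ x ∈ Kᗮ, P x = 0) :    -- `P` vanishes on `K^⊥`
    (∀ x, P (P x) = P x) ∧ (∀ x y : H, ⟪P x, y⟫ = ⟪x, P y⟫) := by
  constructor
  · intro x
    exact hid (P x) (hmem x)         -- idempotence: `P x` already lies in `K`
  · sorry                            -- symmetry: the computation in the proof above
```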

Lemma 13

For \(p \in \mathcal{B}(H)\) such that \(p = p^2 = p^*\), there is a closed subspace \(K\) such that \(p = p(K)\).

Proof

By \(p = p^2\), \(p\) is a projection. Let \(K\) be the range of \(p\). Then \(K = \ker (1-p)\): if \(x = py\), then \((1-p)x = py - p^2y = 0\), and conversely \((1-p)x = 0\) means \(x = px\); in particular \(K\) is closed. Now \(x = (p + (1-p))x = px + (1-p)x\) for any \(x \in H\), with \(px \in K\), and for every \(y \in H\) we have \(\langle py, (1-p)x\rangle = \langle y, p^*(1-p)x\rangle = \langle y, (p - p^2)x\rangle = 0\), so \((1-p)x \in K^\perp \). This gives the orthogonal decomposition of \(x\), hence \(p = p(K)\).
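
The essential computation in this proof is the orthogonality of \(px\) and \((1-p)x\), which uses only \(p^* = p\) and \(p^2 = p\). Here is a minimal Lean sketch of exactly that step, with the two hypotheses stated pointwise for simplicity; this packaging is illustrative and not Mathlib's.

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H]

local notation "⟪" x ", " y "⟫" => @inner ℂ _ _ x y

-- The key step in Lemma 13: if `p` is idempotent and symmetric, then
-- `p x` is orthogonal to `(1 - p) x = x - p x`.
example (p : H →L[ℂ] H) (hidem : ∀ x, p (p x) = p x)
    (hsym : ∀ x y : H, ⟪p x, y⟫ = ⟪x, p y⟫) (x : H) :
    ⟪p x, x - p x⟫ = 0 := by
  -- move `p` to the right slot, expand `p (x - p x)`, and use idempotence
  rw [hsym, map_sub, hidem, sub_self]
  simp
```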

Lemma 14 (Rudin 12.6, part 1)

Let \(\{ x_n\} \) be a sequence of pairwise orthogonal vectors in \(H\). Then the following are equivalent.

  • \(\sum _{n=1}^\infty x_n\) converges in the norm topology of \(H\).

  • \(\sum _{n=1}^\infty \| x_n\| ^2 < \infty \).

Proof

Note that, by orthogonality, \(\| \sum _{j=m}^n x_j\| ^2 = \sum _{j=m}^n \| x_j\| ^2\). Hence the second condition implies that the partial sums \(\sum _{j=1}^n x_j\) form a Cauchy sequence, which converges since \(H\) is complete.

Conversely, if the partial sums \(\sum _{j=1}^n x_j\) converge in norm to some \(s\), then \(\sum _{j=1}^n \| x_j\| ^2 = \| \sum _{j=1}^n x_j\| ^2 \to \| s\| ^2\), so the second condition holds.
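
For a concrete instance of the lemma, take an orthonormal sequence \((e_n)\) in \(H\): the series \(\sum _n e_n/n\) converges because \(\sum _n 1/n^2 < \infty \), while \(\sum _n e_n/\sqrt{n}\) does not, since \(\sum _n 1/n = \infty \). A statement-level Lean sketch follows; it is phrased with Mathlib's `Summable`, which is unconditional summability, so the formalization would also need to relate this to norm convergence of the ordered partial sums (for pairwise orthogonal vectors the two agree). Mathlib's `OrthogonalFamily` API may already contain a close relative of this statement.

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H] [CompleteSpace H]

local notation "⟪" x ", " y "⟫" => @inner ℂ _ _ x y

-- Statement sketch of Lemma 14: for pairwise orthogonal vectors, summability
-- of the vectors is equivalent to summability of the squared norms.
example (x : ℕ → H) (hortho : Pairwise fun m n => ⟪x m, x n⟫ = 0) :
    Summable x ↔ Summable (fun n => ‖x n‖ ^ 2) := by
  sorry  -- deferred: ‖∑ x_j‖² = ∑ ‖x_j‖² plus completeness of `H`
```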

Lemma 15 (Rudin 12.6, part 2)

Let \(\{ x_n\} \) be a sequence of pairwise orthogonal vectors in \(H\). Then the following are equivalent.

  • \(\sum _{n=1}^\infty x_n\) converges in the norm topology of \(H\).

  • \(\sum _{n=1}^\infty \langle x_n, y\rangle \) converges for all \(y \in H\).

Proof

The first condition implies the second: if the partial sums \(\sum _{j=1}^n x_j\) converge in norm to some \(s\), then \(\sum _{j=1}^n \langle x_j, y\rangle = \langle \sum _{j=1}^n x_j, y\rangle \to \langle s, y\rangle \) for every \(y\), by the Cauchy–Schwarz inequality.

Assume that \(\sum _{n=1}^\infty \langle x_n, y\rangle \) converges for all \(y \in H\). Define \(\Lambda _n y = \sum _{j=1}^n \langle y, x_j\rangle \); this is the complex conjugate of \(\sum _{j=1}^n \langle x_j, y\rangle \), so it converges as \(n \to \infty \) for each \(y\). In particular the sequence \((\Lambda _n y)_n\) is bounded for each \(y\), so by the Banach–Steinhaus theorem \(\{ \| \Lambda _n\| \} \) is bounded. Since \(\Lambda _n y = \langle y, \sum _{j=1}^n x_j\rangle \), we have \(\| \Lambda _n\| = \| \sum _{j=1}^n x_j\| = \sqrt{\sum _{j=1}^n \| x_j\| ^2}\). Hence \(\sum _{n=1}^\infty \| x_n\| ^2 < \infty \), and the first condition follows from Lemma 14.
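
As with Lemma 14, here is a hedged, statement-level Lean sketch, again phrased with unconditional summability; the Banach–Steinhaus step corresponds to Mathlib's uniform boundedness principle, and the whole proof is deferred.

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H] [CompleteSpace H]

local notation "⟪" x ", " y "⟫" => @inner ℂ _ _ x y

-- Statement sketch of Lemma 15: the vector series is summable iff every
-- scalar series `∑ ⟪x n, y⟫` is.
example (x : ℕ → H) (hortho : Pairwise fun m n => ⟪x m, x n⟫ = 0) :
    Summable x ↔ ∀ y : H, Summable (fun n => ⟪x n, y⟫) := by
  sorry  -- forward: Cauchy–Schwarz; backward: Banach–Steinhaus plus Lemma 14
```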