3 Orthogonal projections
Destination: Mathlib.Analysis.InnerProductSpace.Projection
Principal reference: Chapter 12 of Walter Rudin, Functional Analysis [Rud87].
Let \(H\) be a complex Hilbert space and \(K\) a closed subspace of \(H\). We denote by \(K^\perp \) the orthogonal complement of \(K\) in \(H\). Every vector \(x \in H\) can be written uniquely as \(x = x_K + x_{K^\perp }\), where \(x_K \in K\) and \(x_{K^\perp } \in K^\perp \). The map \(p(K) : x \mapsto x_K\) is called the orthogonal projection onto \(K\).
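In Lean, this decomposition is expressed through Mathlib's `orthogonalProjection`. The following is a statement sketch only (the proof is left as `sorry`, and the exact lemma names and instance requirements should be checked against the destination file):

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H]
  [CompleteSpace H] (K : Submodule ℂ H) [CompleteSpace K]

-- Sketch: every `x : H` splits as `x_K + x_{K⊥}`, the summands being the
-- orthogonal projections of `x` onto `K` and `Kᗮ`.
example (x : H) :
    ↑(orthogonalProjection K x) + ↑(orthogonalProjection Kᗮ x) = x := by
  sorry
```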
It holds that \(p(K) = p(K)^2 = p(K)^*\).
The first equality follows from the uniqueness of the orthogonal decomposition: since \(p(K)x = x_K \in K\), its decomposition is \(x_K = x_K + 0\), so \(p(K)^2 x = p(K)x_K = x_K = p(K)x\).
The second equality follows because, by orthogonality, \(\langle y, p(K)x \rangle = \langle y_K + y_{K^\perp }, x_K\rangle = \langle y_K, x_K\rangle = \langle y_K, x_K + x_{K^\perp }\rangle = \langle y_K, x \rangle = \langle p(K)y, x\rangle \).
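A Lean sketch of these two identities, packaging \(p(K)\) as an operator on \(H\) (the local definition `p` below is my own convenience, not a Mathlib name, and the proofs are left as `sorry`):

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H]
  [CompleteSpace H] (K : Submodule ℂ H) [CompleteSpace K]

-- `p K : H →L[ℂ] H` projects onto `K` and includes the result back into `H`.
noncomputable def p : H →L[ℂ] H := K.subtypeL.comp (orthogonalProjection K)

-- Sketch of `p = p²` (idempotence).
example : (p K).comp (p K) = p K := by sorry

-- Sketch of `p = p*` (self-adjointness).
example : IsSelfAdjoint (p K) := by sorry
```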
For \(p \in \mathcal{B}(H)\) such that \(p = p^2 = p^*\), there is a closed subspace \(K\) such that \(p = p(K)\).
By \(p = p^2\), the operator \(p\) is idempotent, so its image \(K = \operatorname{ran} p = \ker (1-p)\) is a closed subspace. Note that \(x = (p + (1-p))x = px + (1-p)x\) for any \(x \in H\), and \(\langle px, (1-p)x\rangle = \langle x, p^*(1-p)x\rangle = \langle x, (p-p^2)x\rangle = 0\). So this gives the orthogonal decomposition of \(x\) along \(K\) and \(K^\perp \), and \(p = p(K)\).
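The key computation in this converse can be sketched in Lean as follows; the scoped notation `⟪·, ·⟫_ℂ` is Mathlib's complex inner product notation, and the proof is left as `sorry`:

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

open scoped ComplexInnerProductSpace

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H]
  [CompleteSpace H]

-- Sketch: for an idempotent self-adjoint `q`, the pieces `q x` and
-- `(1 - q) x = x - q x` are orthogonal, so `x = q x + (x - q x)` is the
-- orthogonal decomposition along `K = ran q`.
example (q : H →L[ℂ] H) (h1 : q.comp q = q) (h2 : IsSelfAdjoint q) (x : H) :
    ⟪q x, x - q x⟫_ℂ = 0 := by
  sorry
```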
Let \(\{ x_n\} \) be a sequence of pairwise orthogonal vectors in \(H\). Then the following are equivalent.
\(\sum _{n=1}^\infty x_n\) converges in the norm topology of \(H\).
\(\sum _{n=1}^\infty \| x_n\| ^2 < \infty \).
Note that, by orthogonality, \(\| \sum _{j=m}^n x_j\| ^2 = \sum _{j=m}^n \| x_j\| ^2\). Therefore, the second condition implies that the sequence of partial sums \(\sum _{j=1}^n x_j\) is Cauchy, hence convergent by completeness of \(H\).
Conversely, if \(\sum _{j=1}^n x_j\) converges in norm, it is Cauchy, so by the same identity the partial sums \(\sum _{j=1}^n \| x_j\| ^2\) are Cauchy in \(\mathbb{R}\) and hence converge.
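A Lean statement sketch of this lemma (the phrasing via `Summable` is my assumption; for pairwise orthogonal vectors unconditional and norm convergence of the series coincide, but Mathlib's actual formulation, e.g. via orthogonal families, may differ):

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

open scoped ComplexInnerProductSpace

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H]
  [CompleteSpace H]

-- Sketch: for pairwise orthogonal vectors, norm convergence of the series
-- is equivalent to summability of the squared norms.
example (x : ℕ → H) (hx : ∀ m n, m ≠ n → ⟪x m, x n⟫_ℂ = 0) :
    Summable x ↔ Summable fun n => ‖x n‖ ^ 2 := by
  sorry
```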
Let \(\{ x_n\} \) be a sequence of pairwise orthogonal vectors in \(H\). Then the following are equivalent.
\(\sum _{n=1}^\infty x_n\) converges in the norm topology of \(H\).
\(\sum _{n=1}^\infty \langle x_n, y\rangle \) converges for all \(y \in H\).
The first condition implies the second by the Cauchy–Schwarz inequality: the partial sums of \(\sum _{n=1}^\infty \langle x_n, y\rangle \) are \(\langle \sum _{j=1}^n x_j, y\rangle \), which converge by continuity of the inner product.
Assume that \(\sum _{n=1}^\infty \langle x_n, y\rangle \) converges for all \(y \in H\). Define \(\Lambda _n y = \sum _{j=1}^n \langle y, x_j\rangle \). As this converges for each \(y\), by the Banach–Steinhaus theorem, \(\{ \| \Lambda _n\| \} \) is bounded. As \(\Lambda _n y = \langle y, \sum _{j=1}^n x_j\rangle \), we have \(\| \Lambda _n\| = \| \sum _{j=1}^n x_j\| = \sqrt{\sum _{j=1}^n \| x_j\| ^2}\) by orthogonality, so \(\sum _{n=1}^\infty \| x_n\| ^2 {%lt}\) is finite, and the first condition follows by the previous lemma.
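A Lean statement sketch of this direction (again, the `Summable` phrasing and hypotheses are my assumptions, not checked Mathlib API; the functional \(\Lambda _n\) from the proof would appear in the formalized argument, not in the statement):

```lean
import Mathlib.Analysis.InnerProductSpace.Projection

open scoped ComplexInnerProductSpace

variable {H : Type*} [NormedAddCommGroup H] [InnerProductSpace ℂ H]
  [CompleteSpace H]

-- Sketch: convergence of the series tested against every `y : H` already
-- forces norm convergence, via Banach–Steinhaus.
example (x : ℕ → H) (hx : ∀ m n, m ≠ n → ⟪x m, x n⟫_ℂ = 0)
    (hweak : ∀ y : H, Summable fun n => ⟪x n, y⟫_ℂ) :
    Summable x := by
  sorry
```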