2025-04-27

1096: Separable Hilbert Space Has Orthonormal Schauder Basis

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of that separable Hilbert space has orthonormal Schauder basis

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that any separable Hilbert space has an orthonormal Schauder basis.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\mathbb{R}, \mathbb{C}\}\), with the canonical field structure
\((V, dist)\): \(\in \{\text{ the Hilbert spaces over } F\}\), with any inner product, \(\langle \bullet, \bullet \rangle\), with \(dist\) the metric induced by the norm induced by \(\langle \bullet, \bullet \rangle\), and with the topology induced by \(dist\)
//

Statements:
\(V \in \{\text{ the separable topological spaces }\}\) with any countable dense subset, \(S\)
\(\implies\)
\(\widetilde{S} := \text{ the Gram-Schmidt orthonormalization of } S = \{b_1, b_2, ...\} \in \{\text{ the orthonormal Schauder bases for } V\}\)
//
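As a finite-dimensional, real illustration of the statement (a sketch, not part of the proof; NumPy and the particular spanning set are assumptions of this illustration, and the complex case would additionally need conjugation), Gram-Schmidt orthonormalization turns a countable spanning set, with dependent elements dropped, into an orthonormal set with the same span:

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Orthonormalize a sequence of real vectors, dropping (near-)dependent ones."""
    basis = []
    for v in vectors:
        # subtract the projections onto the already-built orthonormal vectors
        w = v - sum(np.dot(v, b) * b for b in basis)
        n = np.linalg.norm(w)
        if n > tol:  # w == 0 means v was linearly dependent on earlier elements
            basis.append(w / n)
    return basis

# a spanning set of R^3 with one redundant element (hypothetical data)
S = [np.array([1.0, 1.0, 0.0]),
     np.array([2.0, 2.0, 0.0]),   # dependent on the first element, gets dropped
     np.array([1.0, 0.0, 1.0]),
     np.array([0.0, 1.0, 1.0])]
B = gram_schmidt(S)

# the result is orthonormal: <b_i, b_j> = delta_{i,j}
G = np.array([[np.dot(bi, bj) for bj in B] for bi in B])
```

Here `G` is the Gram matrix of the output, which equals the identity exactly when the output is orthonormal.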


2: Proof


Whole Strategy: Step 1: let \(v \in V\) be any, see that \(v' := \sum_j \langle v, b_j \rangle b_j\) converges, and let \(v'' := v - v'\); Step 2: for each \(w \in (\widetilde{S})\), the subspace generated by \(\widetilde{S}\), see that \(\Vert v'' \Vert^2 \le \Vert v - w \Vert^2\); Step 3: for each \(\epsilon \in \mathbb{R}\) such that \(0 \lt \epsilon\), take any \(w \in (\widetilde{S})\) such that \(\Vert v - w \Vert^2 \lt \epsilon^2\), and see that \(\Vert v'' \Vert = 0\); Step 4: conclude the proposition.

Step 1:

Let \(v \in V\) be any.

By the proposition that for any Hilbert space, any countable orthonormal subset, and any element of the Hilbert space, the linear combination of the subset with the the-element-and-subset-element-inner-product coefficients converges, \(v' := \sum_j \langle v, b_j \rangle b_j\) converges.

Let \(v'' := v - v'\).

The inner product with any 1 argument fixed is continuous, by the proposition that for any real or complex vectors space with the topology induced by the metric induced by the norm induced by any inner product, the inner product with any 1 argument fixed is a continuous map.

So, any limit of any argument of the inner product can be taken outside the inner product, by the proposition that for any continuous map and any net with directed index set that converges to any point on the domain, the image of the net converges to the image of the point, and if the codomain is Hausdorff, the limit of the image of the net is the image of the point, which will be used hereafter without any further explanation.

Let us see that for each \(b_l \in \widetilde{S}\), \(\langle v'', b_l \rangle = 0\), which will be used later.

\(\langle v'', b_l \rangle = \langle v - v', b_l \rangle = \langle v - lim_n \sum^n_{j = 1} \langle v, b_j \rangle b_j, b_l \rangle = \langle lim_n (v - \sum^n_{j = 1} \langle v, b_j \rangle b_j), b_l \rangle = lim_n \langle v - \sum^n_{j = 1} \langle v, b_j \rangle b_j, b_l \rangle = lim_n (\langle v, b_l \rangle - \sum^n_{j = 1} \langle v, b_j \rangle \langle b_j, b_l \rangle) = lim_n (\langle v, b_l \rangle - \sum^n_{j = 1} \langle v, b_j \rangle \delta_{j, l}) = lim_n (\langle v, b_l \rangle - \langle v, b_l \rangle) = lim_n 0 = 0\).
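The computation above can be spot-checked numerically in a finite-dimensional special case, where the limit is a finite sum (a sketch assuming NumPy; the orthonormal set and the vector are hypothetical choices): the residual \(v''\) is orthogonal to each \(b_l\).

```python
import numpy as np

# orthonormal b_1, b_2 spanning a proper subspace of R^4 (hypothetical data)
b = [np.array([1.0, 0.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0, 0.0])]
v = np.array([0.5, -1.5, 2.0, 3.0])

v_prime = sum(np.dot(v, bj) * bj for bj in b)       # v' = sum_j <v, b_j> b_j
v_double_prime = v - v_prime                        # v'' = v - v'
inners = [np.dot(v_double_prime, bl) for bl in b]   # <v'', b_l> for each l
# every entry of inners is 0: the residual is orthogonal to each b_l
```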

Step 2:

Let the subspace generated by \(\widetilde{S}\) be denoted as \((\widetilde{S})\).

Let \(w \in (\widetilde{S})\) be any.

Let us see that \(\Vert v'' \Vert^2 \le \Vert v - w \Vert^2\).

\(w = \sum^n_{j = 1} w^j b_j\) for some \(n \in \mathbb{N}\) and some \(w^j \in F\): refer to Note for the definition of sub-'vectors space' generated by subset of vectors space.

\(\langle v - w, v - w \rangle = \langle v'' + v' - \sum^n_{j = 1} w^j b_j, v'' + v' - \sum^n_{l = 1} w^l b_l \rangle = \langle v'', v'' \rangle + \langle v'', v' - \sum^n_{l = 1} w^l b_l \rangle + \langle v' - \sum^n_{j = 1} w^j b_j, v'' \rangle + \langle v' - \sum^n_{j = 1} w^j b_j, v' - \sum^n_{l = 1} w^l b_l \rangle = \langle v'', v'' \rangle + \langle v'', lim_m \sum^m_{j = 1} \langle v, b_j \rangle b_j - \sum^n_{l = 1} w^l b_l \rangle + \langle lim_m \sum^m_{j = 1} \langle v, b_j \rangle b_j - \sum^n_{j = 1} w^j b_j, v'' \rangle + \langle v' - \sum^n_{j = 1} w^j b_j, v' - \sum^n_{l = 1} w^l b_l \rangle = \langle v'', v'' \rangle + \langle v'', lim_m (\sum^m_{j = 1} \langle v, b_j \rangle b_j - \sum^n_{l = 1} w^l b_l) \rangle + \langle lim_m (\sum^m_{j = 1} \langle v, b_j \rangle b_j - \sum^n_{j = 1} w^j b_j), v'' \rangle + \langle v' - \sum^n_{j = 1} w^j b_j, v' - \sum^n_{l = 1} w^l b_l \rangle = \langle v'', v'' \rangle + lim_m \langle v'', \sum^m_{j = 1} \langle v, b_j \rangle b_j - \sum^n_{l = 1} w^l b_l \rangle + lim_m \langle \sum^m_{j = 1} \langle v, b_j \rangle b_j - \sum^n_{j = 1} w^j b_j, v'' \rangle + \langle v' - \sum^n_{j = 1} w^j b_j, v' - \sum^n_{l = 1} w^l b_l \rangle = \langle v'', v'' \rangle + lim_m 0 + lim_m 0 + \langle v' - \sum^n_{j = 1} w^j b_j, v' - \sum^n_{l = 1} w^l b_l \rangle = \langle v'', v'' \rangle + 0 + 0 + \langle v' - \sum^n_{j = 1} w^j b_j, v' - \sum^n_{l = 1} w^l b_l \rangle\), where the inner products inside the limits are \(0\) because \(\langle v'', b_j \rangle = 0\) for each \(j\) by Step 1, and as the last term is nonnegative, that means that \(\langle v'', v'' \rangle \le \langle v - w, v - w \rangle\).
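Step 2 says that \(v'\) is at least as close to \(v\) as any element of \((\widetilde{S})\). A numerical spot-check in \(\mathbb{R}^3\) (a sketch assuming NumPy; the orthonormal set, the vector, and the random coefficients are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(0)
b = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0])]   # orthonormal; the span plays the role of (S~)
v = np.array([0.3, -0.7, 2.0])

v_prime = sum(np.dot(v, bj) * bj for bj in b)   # v' = sum_j <v, b_j> b_j
v_dd = v - v_prime                              # v''
lhs = np.dot(v_dd, v_dd)                        # ||v''||^2

for _ in range(100):
    coeffs = rng.standard_normal(len(b))
    w = sum(c * bj for c, bj in zip(coeffs, b))  # arbitrary element of the span
    assert lhs <= np.dot(v - w, v - w) + 1e-12   # ||v''||^2 <= ||v - w||^2
```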

Step 3:

As \(S\) is dense in \(V\), \((\widetilde{S})\) is dense in \(V\), because each element of \(S\) is a linear combination of elements of \(\widetilde{S}\), by the Gram-Schmidt construction.

Let \(\epsilon \in \mathbb{R}\) be any such that \(0 \lt \epsilon\).

There is a \(w \in (\widetilde{S})\) such that \(\Vert v - w \Vert^2 \lt \epsilon^2\), because \((\widetilde{S})\) is dense in \(V\).

But by Step 2, \(\langle v'', v'' \rangle \le \langle v - w, v - w \rangle \lt \epsilon^2\); as \(\epsilon\) is arbitrary, \(\langle v'', v'' \rangle = 0\), which implies that \(v'' = 0\).

That means that \(v = v'\).

Step 4:

So, for each \(v \in V\), \(v = \sum_{j} \langle v, b_j \rangle b_j\).
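As a numerical sketch of this conclusion in a truncated model of \(l^2\) (assuming NumPy; the sequence \(v_j = 1/j\) and the standard unit sequences as the \(b_j\) are hypothetical choices, and the truncation is an artifact of the illustration), the partial-sum error \(\Vert v - \sum^n_{j = 1} \langle v, b_j \rangle b_j \Vert^2\) shrinks as \(n\) grows:

```python
import numpy as np

# truncated model of l^2: work in R^N with the standard orthonormal e_j
N = 1000
v = 1.0 / np.arange(1, N + 1)  # v_j = 1/j, square-summable

def squared_tail(n):
    # ||v - sum_{j<=n} <v, e_j> e_j||^2 = sum_{j>n} v_j^2 for the standard e_j
    return float(np.sum(v[n:] ** 2))

errors = [squared_tail(n) for n in (10, 100, 900)]
# errors decrease toward 0 as n grows
```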

Let us see that the decomposition is unique.

1st, let us see that for each \(u = \sum_j u^j b_j\) and \(u' = \sum_j u'^j b_j\), \(u' - u = \sum_j (u'^j - u^j) b_j\): \(u\) and \(u'\) are used here instead of \(v\) and \(v'\), because \(v\) and \(v'\) have been fixed in Step 1.

\(\Vert u' - u - \sum^n_{j = 1} (u'^j - u^j) b_j \Vert = \Vert (u' - \sum^n_{j = 1} u'^j b_j) - (u - \sum^n_{j = 1} u^j b_j) \Vert \le \Vert u' - \sum^n_{j = 1} u'^j b_j \Vert + \Vert u - \sum^n_{j = 1} u^j b_j \Vert\), but for each \(\epsilon \in \mathbb{R}\) such that \(0 \lt \epsilon\), there is an \(N' \in \mathbb{N}\) such that for each \(n \in \mathbb{N}\) such that \(N' \lt n\), \(\Vert u' - \sum^n_{j = 1} u'^j b_j \Vert \lt \epsilon / 2\) and there is an \(N \in \mathbb{N}\) such that for each \(n \in \mathbb{N}\) such that \(N \lt n\), \(\Vert u - \sum^n_{j = 1} u^j b_j \Vert \lt \epsilon / 2\), and so, we can take \(max (N', N)\), and for each \(n \in \mathbb{N}\) such that \(max (N', N) \lt n\), \(\Vert u' - \sum^n_{j = 1} u'^j b_j \Vert + \Vert u - \sum^n_{j = 1} u^j b_j \Vert \lt \epsilon / 2 + \epsilon / 2 = \epsilon\), so, \(\Vert u' - u - \sum^n_{j = 1} (u'^j - u^j) b_j \Vert \lt \epsilon\), which means that \(u' - u = \sum_j (u'^j - u^j) b_j\).

2nd, let us see that \(v = \sum_j v^j b_j = 0\) implies that \(v^j = 0\) for each \(j\).

If \(v^l \neq 0\), \(\langle v, b_l \rangle = \langle \sum_j v^j b_j, b_l \rangle = \langle lim_n \sum^n_{j = 1} v^j b_j, b_l \rangle = lim_n \langle \sum^n_{j = 1} v^j b_j, b_l \rangle = lim_n \sum^n_{j = 1} v^j \langle b_j, b_l \rangle = lim_n \sum^n_{j = 1} v^j \delta_{j, l} = lim_n v^l = v^l \neq 0\), a contradiction against \(v = 0\): in fact, \(\langle 0, b_l \rangle = 0\), because letting \(c := \langle 0, b_l \rangle\), \(c = \langle 0 + 0, b_l \rangle = \langle 0, b_l \rangle + \langle 0, b_l \rangle = c + c = 2 c\), so, \(c = 0\).

Now, for each decomposition \(v = \sum_j v^j b_j\), as also \(v = \sum_j \langle v, b_j \rangle b_j\) by Step 3, \(0 = v - v = \sum_j (v^j - \langle v, b_j \rangle) b_j\) by the 1st fact, which implies that \(v^j = \langle v, b_j \rangle\) by the 2nd fact.

So, \(\widetilde{S}\) is an orthonormal Schauder basis.


References


<The previous article in this series | The table of contents of this series | The next article in this series>