definition of orthogonal projection from vectors space with inner product into vectors subspace
Topics
About: vectors space
The table of contents of this article
Starting Context
- The reader knows a definition of %field name% vectors space.
- The reader knows a definition of inner product on real or complex vectors space.
- The reader knows a definition of projection from vectors space into vectors subspace.
Target Context
- The reader will have a definition of orthogonal projection from vectors space with inner product into vectors subspace.
Orientation
There is a list of definitions discussed so far in this site.
There is a list of propositions discussed so far in this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\( F\): \(\in \{\mathbb{R}, \mathbb{C}\}\), with the canonical field structure
\( V'\): \(\in \{\text{ the } F \text{ vectors spaces }\}\), with any inner product, \(\langle \bullet, \bullet \rangle: V' \times V' \to F\)
\( V\): \(\in \{\text{ the vectors subspaces of } V'\}\)
\(*f\): \(: V' \to V\), \(\in \{\text{ the projections }\}\)
//
Conditions:
\(\forall v' \in V' (\forall v \in V (\langle v' - f (v'), v \rangle = 0))\)
//
2: Note
This definition is not saying that for any \(V'\) and \(V\), there is such a map, but is saying that if there is such a map, it is called "orthogonal projection".
If there is such an \(f (v')\) for a \(v'\), it is unique, by the proposition that for any vectors space with the norm induced by any inner product, any subspace, and any vector on the superspace, if there is a vector on the subspace whose distance to the vector is the minimum, it is unique and the difference of the vectors is perpendicular to the subspace, and if there is a vector on the subspace such that the difference of the vectors is perpendicular to the subspace, it is unique and the distance is the minimum.
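As a concrete illustration of that minimizing characterization (a minimal numerical sketch, not part of the definition; it assumes NumPy and picks \(V' = \mathbb{R}^3\) with the dot product and \(V\) the \(x\)-\(y\) plane, where zeroing the \(z\) coordinate satisfies the condition):

```python
import numpy as np

rng = np.random.default_rng(0)

v_prime = np.array([1.0, -2.0, 3.0])
f_v = np.array([v_prime[0], v_prime[1], 0.0])  # zero the z coordinate

best = np.linalg.norm(v_prime - f_v)
for _ in range(100):
    v = rng.normal(size=3)
    v[2] = 0.0  # force v into the subspace V (the x-y plane)
    # the residual v' - f(v') is perpendicular to every vector of V
    assert abs(np.dot(v_prime - f_v, v)) < 1e-12
    # and no point of V is closer to v' than f(v')
    assert np.linalg.norm(v_prime - v) >= best - 1e-12
```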
There are some typical cases in which \(f\) exists.
A typical case is that \(V\) is finite-dimensional: \(V\) has an orthonormal basis, \(B = \{b_1, ..., b_d\}\); then, \(f (v') = \sum_{j \in \{1, ..., d\}} \langle v', b_j \rangle b_j\) is the one: for each \(v \in V\), \(v = \sum_{j \in \{1, ..., d\}} v^j b_j\) and \(\langle v' - f (v'), v \rangle = \langle v' - f (v'), \sum_{j \in \{1, ..., d\}} v^j b_j \rangle = \sum_{j \in \{1, ..., d\}} \overline{v^j} \langle v' - f (v'), b_j \rangle = \sum_{j \in \{1, ..., d\}} \overline{v^j} (\langle v', b_j \rangle - \langle \sum_{l \in \{1, ..., d\}} \langle v', b_l \rangle b_l, b_j \rangle) = \sum_{j \in \{1, ..., d\}} \overline{v^j} (\langle v', b_j \rangle - \sum_{l \in \{1, ..., d\}} \langle v', b_l \rangle \langle b_l, b_j \rangle) = \sum_{j \in \{1, ..., d\}} \overline{v^j} (\langle v', b_j \rangle - \sum_{l \in \{1, ..., d\}} \langle v', b_l \rangle \delta_{l, j}) = \sum_{j \in \{1, ..., d\}} \overline{v^j} (\langle v', b_j \rangle - \langle v', b_j \rangle) = 0\).
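The formula above can be sketched numerically; the following is a minimal sketch assuming NumPy, with the inner product taken to be linear in the 1st argument and conjugate-linear in the 2nd as in the computation above, and with an orthonormal basis of a 2-dimensional subspace of \(\mathbb{C}^4\) obtained by a QR factorization (the example vectors are arbitrary choices, not from the definition):

```python
import numpy as np

def inner(u, v):
    # inner product linear in the 1st argument and conjugate-linear in the
    # 2nd, matching the convention used in the computation above
    return np.dot(u, np.conj(v))

def orthogonal_projection(v_prime, basis):
    # basis: orthonormal vectors b_1, ..., b_d spanning V
    return sum(inner(v_prime, b) * b for b in basis)

rng = np.random.default_rng(1)
# an orthonormal basis of a 2-dimensional subspace of C^4, via QR
raw = rng.normal(size=(4, 2)) + 1j * rng.normal(size=(4, 2))
q, _ = np.linalg.qr(raw)
basis = [q[:, 0], q[:, 1]]

v_prime = rng.normal(size=4) + 1j * rng.normal(size=4)
f_v = orthogonal_projection(v_prime, basis)

# the defining condition: <v' - f(v'), v> = 0 for every v in V
for b in basis:
    assert abs(inner(v_prime - f_v, b)) < 1e-12
```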
Although the construction has used a basis, \(f\) does not really depend on the choice of basis, because \(f\) is uniquely determined by the proposition that for any vectors space with the norm induced by any inner product, any subspace, and any vector on the superspace, if there is a vector on the subspace whose distance to the vector is the minimum, it is unique and the difference of the vectors is perpendicular to the subspace, and if there is a vector on the subspace such that the difference of the vectors is perpendicular to the subspace, it is unique and the distance is the minimum.
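That basis-independence can be observed numerically; in the following sketch (assuming NumPy; the bases and the test vector are arbitrary choices), 2 different orthonormal bases of the same plane in \(\mathbb{R}^3\) yield the same projection:

```python
import numpy as np

def project(v, basis):
    return sum(np.dot(v, b) * b for b in basis)

# two different orthonormal bases of the same plane in R^3
# (the second is the first rotated within the plane by 30 degrees)
b1 = np.array([1.0, 0.0, 0.0])
b2 = np.array([0.0, 1.0, 0.0])
t = np.pi / 6
c1 = np.cos(t) * b1 + np.sin(t) * b2
c2 = -np.sin(t) * b1 + np.cos(t) * b2

v_prime = np.array([2.0, -1.0, 5.0])
assert np.allclose(project(v_prime, [b1, b2]), project(v_prime, [c1, c2]))
```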
Another typical case is that \(V'\) is any Hilbert space and \(V\) is any closed vectors subspace: \(f\) exists by the proposition that for any Hilbert space, any nonempty closed convex subset, and any point on the Hilbert space, there is the unique point on the subset whose distance to the point is the minimum and the proposition that for any vectors space with the norm induced by any inner product, any subspace, and any vector on the superspace, if there is a vector on the subspace whose distance to the vector is the minimum, it is unique and the difference of the vectors is perpendicular to the subspace, and if there is a vector on the subspace such that the difference of the vectors is perpendicular to the subspace, it is unique and the distance is the minimum.
\(V\) is inevitably a Hilbert space, by the proposition that for any complete metric space, any closed subspace is complete.
When \(V\) is separable (when \(V'\) is separable, \(V\) is inevitably so), \(V\) has an orthonormal Schauder basis, \(B = \{b_1, b_2, ...\}\), by the proposition that any separable Hilbert space has an orthonormal Schauder basis, and \(f (v') = \sum_j \langle v', b_j \rangle b_j\) is the one: the sum converges by the proposition that for any Hilbert space, any countable orthonormal subset, and any element of the Hilbert space, the linear combination of the subset with the the-element-and-subset-element-inner-product coefficients converges, and the limit is on \(V\) because \(V\) is closed; for each \(v \in V\), \(v = \sum_j v^j b_j\) and, by the continuity of the inner product, \(\langle v' - f (v'), v \rangle = \langle v' - f (v'), \sum_j v^j b_j \rangle = \sum_j \overline{v^j} \langle v' - f (v'), b_j \rangle = \sum_j \overline{v^j} \langle v' - \sum_l \langle v', b_l \rangle b_l, b_j \rangle = \sum_j \overline{v^j} (\langle v', b_j \rangle - \sum_l \langle v', b_l \rangle \langle b_l, b_j \rangle) = \sum_j \overline{v^j} (\langle v', b_j \rangle - \sum_l \langle v', b_l \rangle \delta_{l, j}) = \sum_j \overline{v^j} (\langle v', b_j \rangle - \langle v', b_j \rangle) = 0\).
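The countable case can be sketched by truncating \(\ell^2\) to finitely many coordinates (an approximation for computation only; the choices of \(V\) and \(v'\) below are arbitrary illustrations, assuming NumPy): \(V\) is taken to be the closed span of the even-indexed standard basis vectors, and the partial sums of \(\sum_j \langle v', b_j \rangle b_j\) approach \(f (v')\) while the distances to \(v'\) decrease:

```python
import numpy as np

N = 200  # truncation of l^2 to finitely many coordinates, for computation only

def e(k):
    # the k-th standard basis vector (0-indexed) of the truncated space
    v = np.zeros(N)
    v[k] = 1.0
    return v

v_prime = 1.0 / np.arange(1, N + 1)      # the vector (1, 1/2, 1/3, ...)
basis = [e(k) for k in range(1, N, 2)]   # e_2, e_4, ...: an orthonormal basis of V

# partial sums of the series f(v') = sum_j <v', b_j> b_j
partial = np.zeros(N)
distances = []
for b in basis:
    partial += np.dot(v_prime, b) * b
    distances.append(np.linalg.norm(v_prime - partial))

# the distances ||v' - partial sum|| decrease toward ||v' - f(v')||
assert all(distances[i] >= distances[i + 1] for i in range(len(distances) - 1))
# the residual of the full (truncated) sum is perpendicular to every b_j
assert all(abs(np.dot(v_prime - partial, b)) < 1e-12 for b in basis)
```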
Let us see that \(f\) is indeed a projection from vectors space into vectors subspace, by the definition of projection from vectors space into vectors subspace.
\(f\) is linear, because \(\langle (r^1 v'_1 + r^2 v'_2) - (r^1 f (v'_1) + r^2 f (v'_2)), v \rangle = \langle r^1 (v'_1 - f (v'_1)) + r^2 (v'_2 - f (v'_2)), v \rangle = r^1 \langle v'_1 - f (v'_1), v \rangle + r^2 \langle v'_2 - f (v'_2), v \rangle = r^1 0 + r^2 0 = 0\), which means that \(f (r^1 v'_1 + r^2 v'_2) = r^1 f (v'_1) + r^2 f (v'_2)\), because \(r^1 f (v'_1) + r^2 f (v'_2) \in V\) satisfies the defining condition for \(r^1 v'_1 + r^2 v'_2\) and such a vector is unique.
For each \(v \in V\), for each \(w \in V\), \(\langle v - v, w \rangle = \langle 0, w \rangle = 0\), which means that \(f (v) = v\), because \(v \in V\) satisfies the defining condition for \(v\) and such a vector is unique.
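Both facts can be checked numerically in the finite-dimensional setting; the following is a minimal sketch assuming NumPy, reusing the orthonormal-basis formula from this Note (the dimensions and coefficients are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
q, _ = np.linalg.qr(rng.normal(size=(5, 3)))
basis = [q[:, j] for j in range(3)]  # orthonormal basis of a V inside R^5

def f(v):
    return sum(np.dot(v, b) * b for b in basis)

u, w = rng.normal(size=5), rng.normal(size=5)
r1, r2 = 0.7, -1.3

# linearity: f(r^1 u + r^2 w) = r^1 f(u) + r^2 f(w)
assert np.allclose(f(r1 * u + r2 * w), r1 * f(u) + r2 * f(w))

# identity on V: f(v) = v for each v in V
v = 2.0 * basis[0] - 3.0 * basis[2]
assert np.allclose(f(v), v)
```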