2024-08-25

747: For Motion Between Same-Finite-Dimensional Real Vectors Spaces with Norms Induced by Inner Products That Fixes 0, Motion Is Orthogonal Linear Map

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of that for motion between same-finite-dimensional real vectors spaces with norms induced by inner products that fixes 0, motion is orthogonal linear map

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any motion between any same-finite-dimensional real vectors spaces with the norms induced by any inner products that (the motion) fixes 0, the motion is an orthogonal linear map.

Orientation


There is a list of definitions discussed so far on this site.

There is a list of propositions discussed so far on this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\( V_1\): \(\in \{\text{ the } d \text{ -dimensional normed real vectors spaces }\}\) with the norm, \(\Vert \bullet \Vert_1\), induced by any inner product, \(\langle \bullet, \bullet \rangle_1\)
\( V_2\): \(\in \{\text{ the } d \text{ -dimensional normed real vectors spaces }\}\) with the norm, \(\Vert \bullet \Vert_2\), induced by any inner product, \(\langle \bullet, \bullet \rangle_2\)
\(f\): \(: V_1 \to V_2\), \(\in \{\text{ the motions }\}\)
//

Statements:
\(f (0) = 0\)
\(\implies\)
\(f \in \{\text{ the orthogonal linear maps }\}\)
//


2: Natural Language Description


For any \(d\)-dimensional normed real vectors space, \(V_1\), with the norm, \(\Vert \bullet \Vert_1\), induced by any inner product, \(\langle \bullet, \bullet \rangle_1\), any \(d\)-dimensional normed real vectors space, \(V_2\), with the norm, \(\Vert \bullet \Vert_2\), induced by any inner product, \(\langle \bullet, \bullet \rangle_2\), and any motion, \(f: V_1 \to V_2\), such that \(f (0) = 0\), \(f\) is an orthogonal linear map.
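As a concrete illustration (not part of the proof), here is a minimal numeric sketch in Python, assuming \(V_1 = V_2 = \mathbb{R}^2\) with the standard inner products and taking a rotation as the motion; the rotation is just one example of a motion that fixes 0.

```python
import numpy as np

# A rotation of R^2 is a motion (a distance-preserving map) that fixes 0;
# the proposition says that it must then be an orthogonal linear map.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
f = lambda v: Q @ v

rng = np.random.default_rng(0)
v, w = rng.standard_normal(2), rng.standard_normal(2)

# f is a motion: it preserves distances ...
assert np.isclose(np.linalg.norm(v - w), np.linalg.norm(f(v) - f(w)))
# ... and it fixes 0.
assert np.allclose(f(np.zeros(2)), np.zeros(2))

# The conclusion holds: f is linear ...
assert np.allclose(f(2.0 * v + 3.0 * w), 2.0 * f(v) + 3.0 * f(w))
# ... and orthogonal: Q^T Q = I, equivalently f preserves inner products.
assert np.allclose(Q.T @ Q, np.eye(2))
```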


3: Proof


Whole Strategy: Step 1: take any orthonormal basis of \(V_1\), \(\{e_1, ..., e_d\}\); Step 2: see that \(\{f (e_1), ..., f (e_d)\}\) is an orthonormal basis of \(V_2\); Step 3: let \(c^j e_j \in V_1\) be any and see that \(f (c^j e_j) = c^j f (e_j)\); Step 4: see that \(f\) is linear; Step 5: conclude the proposition.

Step 1:

Let us take any orthonormal basis of \(V_1\), \(\{e_1, ..., e_d\}\), which is possible by the Gram-Schmidt orthonormalization.
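For readers who want to see the orthonormalization concretely, here is a minimal Gram-Schmidt sketch in Python, assuming the standard inner product on \(\mathbb{R}^d\); the input vectors are just an example of a basis.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: turn a basis into an orthonormal basis."""
    basis = []
    for v in vectors:
        # Subtract the components along the already-built orthonormal vectors.
        for e in basis:
            v = v - np.dot(v, e) * e
        basis.append(v / np.linalg.norm(v))
    return basis

# Example: orthonormalize a basis of R^2.
e1, e2 = gram_schmidt([np.array([3.0, 1.0]), np.array([1.0, 2.0])])
assert np.isclose(np.dot(e1, e1), 1.0) and np.isclose(np.dot(e2, e2), 1.0)
assert np.isclose(np.dot(e1, e2), 0.0)
```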

Step 2:

\(\{f (e_1), ..., f (e_d)\}\) is orthonormal, by the proposition that for any motion between any real vectors spaces with the norms induced by any inner products that (the motion) fixes 0, any orthonormal subset of the domain is mapped to an orthonormal subset. \(\{f (e_1), ..., f (e_d)\}\) is linearly independent, by the proposition that for any vectors space with any inner product, any set of nonzero orthogonal elements is linearly independent. \(\{f (e_1), ..., f (e_d)\}\) is a basis of \(V_2\), by the proposition that for any finite-dimensional vectors space, any linearly independent subset with dimension number of elements is a basis.
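Here is a minimal numeric check of this step in Python, assuming \(V_1 = V_2 = \mathbb{R}^3\) with the standard inner products; the motion is a randomly generated orthogonal matrix, which is just one example of a motion that fixes 0, used here only to illustrate the claim.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # an orthogonal matrix: a motion fixing 0
f = lambda v: Q @ v

E = np.eye(3)                                     # the standard orthonormal basis e_1, e_2, e_3
F = np.stack([f(E[:, j]) for j in range(3)], axis=1)

# {f(e_1), f(e_2), f(e_3)} is again orthonormal: F^T F = I ...
assert np.allclose(F.T @ F, np.eye(3))
# ... and, being d = 3 linearly independent vectors, a basis of R^3.
assert np.linalg.matrix_rank(F) == 3
```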

Step 3:

Let \(v = c^j e_j \in V_1\) be any.

\(f (v)\) is a linear combination of \(\{f (e_1), ..., f (e_d)\}\), which is a basis of \(V_2\). Let \(f (v) = c'^j f (e_j)\).

\(\Vert v \Vert_1 = \Vert v - 0 \Vert_1 = \Vert f (v) - f (0) \Vert_2 = \Vert f (v) - 0 \Vert_2 = \Vert f (v) \Vert_2\).

\(\Vert v \Vert_1^2 = \Vert c^j e_j \Vert_1^2 = \sum_{j \in \{1, ..., d\}} {c^j}^2\); \(\Vert f (v) \Vert_2^2 = \Vert c'^j f (e_j) \Vert_2^2 = \sum_{j \in \{1, ..., d\}} {c'^j}^2\). So, \(\sum_{j \in \{1, ..., d\}} {c^j}^2 = \sum_{j \in \{1, ..., d\}} {c'^j}^2\).

Besides, for each \(k \in \{1, ..., d\}\), \(\Vert v - e_k \Vert_1 = \Vert f (v) - f (e_k) \Vert_2\).

\(\Vert v - e_k \Vert_1^2 = \Vert c^j e_j - e_k \Vert_1^2 = \Vert \sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} c^j e_j + (c^k - 1) e_k \Vert_1^2 = \sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} {c^j}^2 + (c^k - 1)^2\); \(\Vert f (v) - f (e_k) \Vert_2^2 = \Vert c'^j f (e_j) - f (e_k) \Vert_2^2 = \Vert \sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} c'^j f (e_j) + (c'^k - 1) f (e_k) \Vert_2^2 = \sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} {c'^j}^2 + (c'^k - 1)^2\). So, \(\sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} {c^j}^2 + (c^k - 1)^2 = \sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} {c'^j}^2 + (c'^k - 1)^2\).

\(\sum_{j \in \{1, ..., d\}} {c^j}^2 - (\sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} {c^j}^2 + (c^k - 1)^2) = \sum_{j \in \{1, ..., d\}} {c'^j}^2 - (\sum_{j \in \{1, ..., k - 1, \hat{k}, k + 1, ..., d\}} {c'^j}^2 + (c'^k - 1)^2)\), which implies that \({c^k}^2 - (c^k - 1)^2 = {c'^k}^2 - (c'^k - 1)^2\). As \({c^k}^2 - (c^k - 1)^2 = 2 c^k - 1\) and likewise for \(c'^k\), \(2 c^k - 1 = 2 c'^k - 1\), which implies that \(c^k = c'^k\).

So, \(f (c^j e_j) = c^j f (e_j)\).
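The coefficient-recovery argument above can be checked numerically; here is a minimal sketch in Python, assuming the standard inner product on \(\mathbb{R}^3\): the identity \(2 c^k - 1 = \Vert v \Vert^2 - \Vert v - e_k \Vert^2\) recovers each coefficient from norms alone, which is why the norm-preserving \(f\) cannot change the coefficients.

```python
import numpy as np

rng = np.random.default_rng(2)
c = rng.standard_normal(3)     # the coefficients c^1, c^2, c^3
E = np.eye(3)
v = E @ c                      # v = c^j e_j

# Recover each c^k from ||v||^2 and ||v - e_k||^2,
# via c^k = (||v||^2 - ||v - e_k||^2 + 1) / 2.
recovered = np.array([(v @ v - (v - E[:, k]) @ (v - E[:, k]) + 1.0) / 2.0
                      for k in range(3)])
assert np.allclose(recovered, c)
```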

Step 4:

Let \(v = c^j e_j, v' = c'^j e_j \in V_1\) and \(r, r' \in \mathbb{R}\) be any.

\(f (r v + r' v') = f (r c^j e_j + r' c'^j e_j) = f ((r c^j + r' c'^j) e_j) = (r c^j + r' c'^j) f (e_j) = r c^j f (e_j) + r' c'^j f (e_j) = r f (v) + r' f (v')\).

So, \(f\) is linear.
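Here is a minimal numeric check of this step in Python, assuming \(V_1 = V_2 = \mathbb{R}^3\) with the standard inner products and treating the motion as a black box: the matrix whose columns are \(f (e_1), ..., f (e_3)\) reproduces \(f\) on arbitrary vectors, which is the linearity just proved.

```python
import numpy as np

rng = np.random.default_rng(4)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # some motion fixing 0
f = lambda v: Q @ v                                # treated as a black box below

E = np.eye(3)
F = np.stack([f(E[:, j]) for j in range(3)], axis=1)

# f agrees everywhere with the linear map v = c^j e_j |-> c^j f(e_j).
for _ in range(5):
    v = rng.standard_normal(3)
    assert np.allclose(f(v), F @ v)
```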

Step 5:

For each \(v \in V_1\), \(\Vert v \Vert_1 = \Vert v - 0 \Vert_1 = \Vert f (v) - f (0) \Vert_2 = \Vert f (v) - 0 \Vert_2 = \Vert f (v) \Vert_2\).

As \(f\) is linear and norm-preserving, \(f\) preserves the inner products, by the polarization identity: \(\langle f (v), f (v') \rangle_2 = (\Vert f (v) + f (v') \Vert_2^2 - \Vert f (v) \Vert_2^2 - \Vert f (v') \Vert_2^2) / 2 = (\Vert f (v + v') \Vert_2^2 - \Vert f (v) \Vert_2^2 - \Vert f (v') \Vert_2^2) / 2 = (\Vert v + v' \Vert_1^2 - \Vert v \Vert_1^2 - \Vert v' \Vert_1^2) / 2 = \langle v, v' \rangle_1\).

So, \(f\) is an orthogonal linear map.
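The norm-to-inner-product passage can also be checked numerically; here is a minimal sketch in Python, assuming \(V_1 = V_2 = \mathbb{R}^3\) with the standard inner products: a linear norm-preserving map preserves inner products, by the polarization identity.

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # linear and norm-preserving
f = lambda v: Q @ v

v, w = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose(np.linalg.norm(f(v)), np.linalg.norm(v))   # norm-preserving
# By polarization, <f(v), f(w)> = (||f(v) + f(w)||^2 - ||f(v)||^2 - ||f(w)||^2) / 2 = <v, w>.
assert np.isclose(np.dot(f(v), f(w)), np.dot(v, w))
```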


References


<The previous article in this series | The table of contents of this series | The next article in this series>