description/proof of that for Euclidean vectors space with Euclidean norm and \(2\) vectors with same norm, there is orthogonal linear map that maps one of vectors to other whose canonical representative matrix is orthogonal, and when dimension is equal to or larger than \(2\), determinant can be made \(1\)
Topics
About: vectors space
The table of contents of this article
Starting Context
- The reader knows a definition of Euclidean-normed Euclidean vectors space.
- The reader knows a definition of orthogonal linear map.
- The reader knows a definition of canonical representative matrix of linear map between finite-product-of-copies-of-field vectors spaces.
- The reader knows a definition of orthogonal matrix.
- The reader knows a definition of Gram-Schmidt orthonormalization of countable subset of vectors space with inner product.
- The reader admits the proposition that for any finite-dimensional vectors space, any linearly independent subset can be expanded to be a basis by adding a finite number of elements.
- The reader admits the proposition that between any vectors spaces, any map that maps any basis onto any basis bijectively and expands the mapping linearly is a 'vectors spaces - linear morphisms' isomorphism.
- The reader admits the proposition that for any linear map from any Euclidean-normed Euclidean vectors space into itself, the map is orthogonal if and only if the canonical representative matrix is orthogonal.
Target Context
- The reader will have a description and a proof of the proposition that for any Euclidean vectors space with the Euclidean norm and any \(2\) vectors with any same norm, there is an orthogonal linear map that maps any one of the vectors to the other whose canonical representative matrix is orthogonal, and when the dimension is equal to or larger than \(2\), the determinant of the matrix can be made \(1\).
Orientation
There is a list of definitions discussed so far in this site.
There is a list of propositions discussed so far in this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\(d\): \(\in \mathbb{N} \setminus \{0\}\)
\(\mathbb{R}^d\): \(= \text{ the Euclidean vectors space with the Euclidean norm }\)
\(\{v, v'\}\): \(\subseteq \mathbb{R}^d\), such that \(\Vert v \Vert = \Vert v' \Vert\)
//
Statements:
\(\exists M \in \{\text{ the } d \times d \text{ orthogonal matrices } \} (v'^t = M v^t)\)
\(\land\)
(
\(2 \le d\)
\(\implies\)
\(\exists M \in \{\text{ the } d \times d \text{ orthogonal matrices with determinant } 1 \} (v'^t = M v^t)\)
)
//
2: Note
Also the proposition that for any complex Euclidean vectors space with the complex Euclidean norm and any \(2\) vectors with any same norm, there is an orthogonal linear map that maps any one of the vectors to the other whose canonical representative matrix is unitary, and when the dimension is equal to or larger than \(2\), the determinant of the matrix can be made \(1\) holds.
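Here is a minimal numpy sketch of that complex analogue (an addition for illustration, not a proof: `unitary_frame` is a hypothetical helper name; the determinant is made \(1\) by scaling the last column by the conjugate of the determinant, which is a unit scalar, the complex counterpart of Step 6 in the proof below, and requires \(2 \le d\)):
```python
import numpy as np

rng = np.random.default_rng(1)
d = 3

def unitary_frame(v):
    # unitary matrix whose 1st column is v / ||v||, with determinant made 1
    # (complex Gram-Schmidt via QR factorization)
    A = np.column_stack([v / np.linalg.norm(v),
                         rng.standard_normal((d, d - 1))
                         + 1j * rng.standard_normal((d, d - 1))])
    Q, R = np.linalg.qr(A)
    Q = Q * (np.diag(R) / np.abs(np.diag(R)))  # make diag(R) positive real: Q[:, 0] = v / ||v||
    Q[:, -1] *= np.conj(np.linalg.det(Q))      # |det Q| = 1, so this is a unit scalar
    return Q

v = rng.standard_normal(d) + 1j * rng.standard_normal(d)
w = rng.standard_normal(d) + 1j * rng.standard_normal(d)
w *= np.linalg.norm(v) / np.linalg.norm(w)     # force the same norm

U = unitary_frame(w) @ unitary_frame(v).conj().T
assert np.allclose(U @ v, w)                   # U maps v to w
assert np.allclose(U.conj().T @ U, np.eye(d))  # U is unitary
assert np.isclose(np.linalg.det(U), 1.0)       # with determinant 1
```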
3: Proof
Whole Strategy: Step 1: conclude the proposition for when \(\Vert v \Vert = \Vert v' \Vert = 0\); Step 2: suppose that \(0 \lt \Vert v \Vert = \Vert v' \Vert\); Step 3: take any orthonormal bases, \((b_1 := v / \Vert v \Vert, b_2, ..., b_d)\) and \((b'_1 := v' / \Vert v' \Vert, b'_2, ..., b'_d)\), and define the linear map, \(f: \mathbb{R}^d \to \mathbb{R}^d, v = v^j b_j \mapsto v^j b'_j\); Step 4: see that \(f\) is orthogonal and that the canonical representative matrix is orthogonal; Step 5: suppose that \(2 \le d\); Step 6: take the vector, \(v'' := (\Vert v \Vert, 0, ..., 0)\), and an orthogonal matrix, \(N\), such that \(v^t = N v''^t\), and multiply the last column of \(N\) by \(det N\) with the result orthogonal matrix, \(\widetilde{N}\), and take an orthogonal matrix, \(N'\), such that \(v'^t = N' v''^t\), and multiply the last column of \(N'\) by \(det N'\) with the result orthogonal matrix, \(\widetilde{N'}\); Step 7: see that \(v'^t = \widetilde{N'} {\widetilde{N}}^{-1} v^t\) and \(M := \widetilde{N'} {\widetilde{N}}^{-1}\) will do.
Step 1:
Let us suppose that \(\Vert v \Vert = \Vert v' \Vert = 0\).
That means that \(v = v' = 0\).
Then, any orthogonal matrix, for example, \(I\), will do: \(v'^t = I v^t\); \(I^t = I^{-1}\); \(det I = 1\) (even when \(d \lt 2\)).
Step 2:
Let us suppose that \(0 \lt \Vert v \Vert = \Vert v' \Vert\).
Step 3:
Let \(\mathbb{R}^d\) have the Euclidean inner product, by which the Euclidean norm is induced.
Let us take any orthonormal basis, \((b_1 := v / \Vert v \Vert, b_2, ..., b_d)\) for \(\mathbb{R}^d\), which is possible, by the proposition that for any finite-dimensional vectors space, any linearly independent subset can be expanded to be a basis by adding a finite number of elements and the definition of Gram-Schmidt orthonormalization of countable subset of vectors space with inner product.
Let us take any orthonormal basis, \((b'_1 := v' / \Vert v' \Vert, b'_2, ..., b'_d)\) for \(\mathbb{R}^d\), which is possible, likewise.
Let us define the map, \(f: \mathbb{R}^d \to \mathbb{R}^d, v = v^j b_j \mapsto v^j b'_j\), which is linear (in fact, a 'vectors spaces - linear morphisms' isomorphism), by the proposition that between any vectors spaces, any map that maps any basis onto any basis bijectively and expands the mapping linearly is a 'vectors spaces - linear morphisms' isomorphism: \(f\) maps \(b_j\) to \(b'_j\) and expands the mapping between the bases linearly.
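Here is a minimal numpy sketch of this basis extension (an addition for illustration: `orthonormal_basis_starting_with` is a hypothetical name, and QR factorization is assumed as one implementation of the Gram-Schmidt orthonormalization):
```python
import numpy as np

def orthonormal_basis_starting_with(v):
    """Extend b_1 := v / ||v|| to an orthonormal basis (b_1, ..., b_d) of R^d."""
    d = v.shape[0]
    b1 = v / np.linalg.norm(v)
    # complete {b_1} to a linearly independent set with standard basis vectors,
    # dropping the standard basis vector with the largest overlap with b_1
    k = np.argmax(np.abs(b1))
    A = np.column_stack([b1, np.delete(np.eye(d), k, axis=1)])
    Q, R = np.linalg.qr(A)            # QR performs the Gram-Schmidt orthonormalization
    return Q * np.sign(np.diag(R))    # make diag(R) > 0, so the 1st column is exactly b_1

v = np.array([3.0, 0.0, 4.0])
B = orthonormal_basis_starting_with(v)
assert np.allclose(B.T @ B, np.eye(3))               # the columns are orthonormal
assert np.allclose(B[:, 0], v / np.linalg.norm(v))   # b_1 = v / ||v||
```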
Step 4:
\(f\) is orthogonal, because for each \(v = v^j b_j \in \mathbb{R}^d\), \(\Vert f (v) \Vert = \Vert f (v^j b_j) \Vert = \Vert v^j b'_j \Vert = \sqrt{\langle v^j b'_j, v^l b'_l \rangle} = \sqrt{v^j v^l \langle b'_j, b'_l \rangle} = \sqrt{v^j v^l \delta_{j, l}} = \sqrt{v^j v^l \langle b_j, b_l \rangle} = \sqrt{\langle v^j b_j, v^l b_l \rangle} = \Vert v^j b_j \Vert = \Vert v \Vert\).
So, its canonical representative matrix, \(M\), is orthogonal, by the proposition that for any linear map from any Euclidean-normed Euclidean vectors space into itself, the map is orthogonal if and only if the canonical representative matrix is orthogonal.
As \(b'_1 = f (b_1)\), \(v' / \Vert v' \Vert = f (v / \Vert v \Vert) = 1 / \Vert v \Vert f (v) = 1 / \Vert v' \Vert f (v)\), which means that \(v' = f (v)\).
So, \(v'^t = M v^t\).
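As a supplementary explicit form (an addition for illustration, not one of the cited propositions): let \(B\) be the matrix whose \(j\)-th column is \(b_j^t\) and \(B'\) be the matrix whose \(j\)-th column is \(b'^t_j\), which are orthogonal matrices because the columns are orthonormal; then, for each \(x\) with \(x = x^j b_j\) and \(c := (x^1, ..., x^d)\), \(f (x)^t = B' c^t = B' B^{-1} x^t = B' B^t x^t\), so, \(M = B' B^t\); and directly, \(M v^t = B' B^t v^t = B' (\Vert v \Vert e_1)^t = \Vert v \Vert b'^t_1 = v'^t\), where \(e_1 := (1, 0, ..., 0)\), because \(\langle b_1, v \rangle = \Vert v \Vert\) while \(\langle b_j, v \rangle = 0\) for each \(j \neq 1\).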
Step 5:
Let us suppose that \(2 \le d\).
Step 6:
Note that for any orthogonal matrix, \(N\), \(det N det N = 1\), because as \(N^t N = I\), \(det (N^t N) = det I = 1\), but \(det (N^t N) = det N^t det N = det N det N\); so, \(det N = 1\) or \(det N = -1\).
Let us take the vector, \(v'' := (\Vert v \Vert, 0, ..., 0)\).
\(\Vert v'' \Vert = \Vert v \Vert\), so, there is an orthogonal matrix, \(N\), such that \(v^t = N v''^t\), by Step 4.
Let us multiply the last column of \(N\) by \(det N\) with the result matrix, \(\widetilde{N}\).
\(\widetilde{N}\) is orthogonal, because for \(N = \begin{pmatrix} N^1_1 & ... & N^1_d \\ ... \\ N^d_1 & ... & N^d_d \end{pmatrix}\), \(\widetilde{N} = \begin{pmatrix} N^1_1 & ... & N^1_{d - 1} & det N N^1_d \\ ... \\ N^d_1 & ... & N^d_{d - 1} & det N N^d_d \end{pmatrix}\), and \(\widetilde{N}^t \widetilde{N} = \begin{pmatrix} N^1_1 & ... & N^d_1 \\ ... \\ N^1_{d - 1} & ... & N^d_{d - 1} \\ det N N^1_d & ... & det N N^d_d \end{pmatrix} \begin{pmatrix} N^1_1 & ... & N^1_{d - 1} & det N N^1_d \\ ... \\ N^d_1 & ... & N^d_{d - 1} & det N N^d_d \end{pmatrix}\), whose \((j, l)\) component is \(\sum_{m \in \{1, ..., d\}} N^m_j N^m_l\) for \(j, l \lt d\); is \(\sum_{m \in \{1, ..., d\}} det N N^m_j N^m_l\) for \(j = d\) and \(l \lt d\); is \(\sum_{m \in \{1, ..., d\}} N^m_j det N N^m_l\) for \(j \lt d\) and \(l = d\); and is \(\sum_{m \in \{1, ..., d\}} det N N^m_j det N N^m_l = det N det N \sum_{m \in \{1, ..., d\}} N^m_j N^m_l = \sum_{m \in \{1, ..., d\}} N^m_j N^m_l\) for \(j = l = d\), by the note at the beginning of this step; so, \(det N\) remains only when \(j \neq l\), but then, \(\sum_{m \in \{1, ..., d\}} N^m_j N^m_l = 0\), because the columns of \(N\) are orthonormal, so, the component is \(0\); and when \(j = l\), \(\sum_{m \in \{1, ..., d\}} N^m_j N^m_l = 1\), likewise, so, the component is \(1\); so, \(\widetilde{N}^t \widetilde{N} = I\).
\(det \widetilde{N} = det N det N = 1\), because multiplying any single column of any matrix by any scalar multiplies the determinant by the scalar.
\(v^t = \widetilde{N} v''^t\), because as \(v'' = (\Vert v \Vert, 0, ..., 0)\), the last column of \(\widetilde{N}\) does not influence the result at all (which is true only because \(2 \le d\), so, the last column is not the 1st column).
\(\Vert v'' \Vert = \Vert v' \Vert\), so, there is an orthogonal matrix, \(N'\), such that \(v'^t = N' v''^t\), by Step 4.
Let us multiply the last column of \(N'\) by \(det N'\) with the result matrix, \(\widetilde{N'}\).
\(\widetilde{N'}\) is orthogonal, as before.
\(det \widetilde{N'} = 1\), as before.
\(v'^t = \widetilde{N'} v''^t\), as before.
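Here is a minimal numpy sketch of this column trick (an addition for illustration: `make_det_one` is a hypothetical name), checking also the point of this step that the last column never acts on \(v'' = (\Vert v \Vert, 0, ..., 0)\):
```python
import numpy as np

def make_det_one(N):
    """Multiply the last column of the orthogonal matrix N by det N (= +1 or -1).

    Scaling a column by a unit scalar keeps the columns orthonormal, and the
    determinant gets multiplied by det N, so it becomes det N det N = 1.
    """
    N_tilde = N.copy()
    N_tilde[:, -1] *= np.sign(np.linalg.det(N))  # det N, up to floating-point rounding
    return N_tilde

N = np.diag([1.0, 1.0, -1.0])                    # a reflection: orthogonal, determinant -1
N_tilde = make_det_one(N)
assert np.allclose(N_tilde.T @ N_tilde, np.eye(3))   # still orthogonal
assert np.isclose(np.linalg.det(N_tilde), 1.0)       # now determinant 1
v2 = np.array([5.0, 0.0, 0.0])                       # v'' = (||v||, 0, ..., 0)
assert np.allclose(N_tilde @ v2, N @ v2)             # the last column never acts on v''
```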
Step 7:
So, as \(v''^t = {\widetilde{N}}^{-1} v^t\), \(v'^t = \widetilde{N'} v''^t = \widetilde{N'} {\widetilde{N}}^{-1} v^t\).
Let \(M := \widetilde{N'} {\widetilde{N}}^{-1}\).
\(M\) is orthogonal, because \(M^t M = (\widetilde{N'} {\widetilde{N}}^{-1})^t \widetilde{N'} {\widetilde{N}}^{-1} = ({\widetilde{N}}^{-1})^t \widetilde{N'}^t \widetilde{N'} {\widetilde{N}}^{-1} = ({\widetilde{N}}^{-1})^t I {\widetilde{N}}^{-1} = ({\widetilde{N}}^{-1})^t {\widetilde{N}}^{-1} = ({\widetilde{N}}^t)^t {\widetilde{N}}^t = \widetilde{N} {\widetilde{N}}^t = I\).
\(det M = det (\widetilde{N'} {\widetilde{N}}^{-1}) = det \widetilde{N'} det {\widetilde{N}}^{-1} = 1\).
So, \(M\) satisfies the conditions for the proposition.
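As a closing illustration, here is a minimal end-to-end numpy sketch of Steps 3 through 7 (an addition for illustration: `frame` is a hypothetical helper name, and QR factorization is assumed as one implementation of the Gram-Schmidt orthonormalization):
```python
import numpy as np

rng = np.random.default_rng(7)
d = 4

def frame(v):
    # orthogonal matrix whose 1st column is v / ||v|| (Step 3, via QR)
    A = np.column_stack([v / np.linalg.norm(v), rng.standard_normal((d, d - 1))])
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))       # make diag(R) > 0: Q[:, 0] = v / ||v||

v = rng.standard_normal(d)
v_prime = rng.standard_normal(d)
v_prime *= np.linalg.norm(v) / np.linalg.norm(v_prime)  # force ||v'|| = ||v||

# Step 6: N maps v''^t to v^t and N' maps v''^t to v'^t; fix both determinants
N, N_prime = frame(v), frame(v_prime)
for T in (N, N_prime):
    T[:, -1] *= np.sign(np.linalg.det(T))
# Step 7: M := N'~ N~^{-1} = N'~ N~^t
M = N_prime @ N.T
assert np.allclose(M @ v, v_prime)       # v'^t = M v^t
assert np.allclose(M.T @ M, np.eye(d))   # M is orthogonal
assert np.isclose(np.linalg.det(M), 1.0) # det M = 1
```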