description/proof of that for finite-dimensional vectors space with \(2\) bases with \(1\) shared element and symmetric or antisymmetric \((0, 2)\)-tensor whose interior multiplication by shared element is \(0\), if restriction of tensor on subspace spanned by 1 of the bases minus shared element is non-degenerate, restriction on other such subspace is non-degenerate
Topics
About: vectors space
The table of contents of this article
Starting Context
- The reader knows a definition of non-degenerate \((0, 2)\)-tensor of vectors space.
- The reader knows a definition of interior multiplication of tensor by vector.
- The reader knows a definition of \((j, l)\)-minor of matrix.
- The reader admits the proposition that for any vectors space over any field and any square matrix over the field with dimension equal to or smaller than the dimension of the vectors space, the matrix is invertible if it maps a linearly-independent set of vectors to a linearly-independent set of vectors, and if the matrix is invertible, it maps any linearly-independent set of vectors to a linearly-independent set of vectors.
- The reader admits the proposition that for any module with any basis, the components set of any element with respect to the basis is unique.
- The reader admits the proposition that for any square matrix over any field with any \(2\) square diagonal blocks with \(1\) of the rest blocks \(0\), the determinant is nonzero if and only if the determinants of the diagonal blocks are nonzero.
- The reader admits the proposition that for any 'vectors spaces - linear morphisms' isomorphism, the image of any linearly independent subset or any basis of the domain is linearly independent or a basis on the codomain.
- The reader admits the proposition that for any linear surjection from any finite-dimensional vectors space, if the dimension of the codomain is equal to or larger than that of the domain, the surjection is a bijection.
Target Context
- The reader will have a description and a proof of the proposition that for any finite-dimensional vectors space with any \(2\) bases with any \(1\) shared element, any symmetric or antisymmetric \((0, 2)\)-tensor whose interior multiplication by the shared element is \(0\), and the \(2\) vectors subspaces spanned by the bases minus the shared element, if the restriction of the tensor on one of the subspaces is non-degenerate, the restriction on the other subspace is non-degenerate.
Orientation
There is a list of definitions discussed so far in this site.
There is a list of propositions discussed so far in this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(d'\): \(\in \mathbb{N} \setminus \{0\}\)
\(V'\): \(\in \{\text{ the } d' \text{ -dimensional } F \text{ vectors spaces }\}\)
\(B_1\): \(= (b, b_{1, 1}, ..., b_{1, d' - 1})\), \(\in \{\text{ the bases for } V'\}\)
\(B_2\): \(= (b, b_{2, 1}, ..., b_{2, d' - 1})\), \(\in \{\text{ the bases for } V'\}\)
\(V_1\): \(= Span (\{b_{1, 1}, ..., b_{1, d' - 1}\})\), \(\in \{\text{ the vectors subspaces of } V'\}\)
\(V_2\): \(= Span (\{b_{2, 1}, ..., b_{2, d' - 1}\})\), \(\in \{\text{ the vectors subspaces of } V'\}\)
\(t\): \(\in T^0_2 (V')\), such that \(t\) is symmetric or antisymmetric and \(i_b (t) = 0\)
\(t_1\): \(= t \vert_{V_1 \times V_1}: V_1 \times V_1 \to F\)
\(t_2\): \(= t \vert_{V_2 \times V_2}: V_2 \times V_2 \to F\)
//
Statements:
\(t_1 \in \{\text{ the non-degenerate } (0, 2) \text{ -tensors }\}\)
\(\implies\)
\(t_2 \in \{\text{ the non-degenerate } (0, 2) \text{ -tensors }\}\)
//
2: Note
The condition, \(i_b (t) = 0\), is crucial for this proposition, as is seen in Proof.
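For example, with \(V' = \mathbb{R}^2\), \(b = e_1\), \(b_{1, 1} = e_2\), \(b_{2, 1} = e_1 + e_2\), and the symmetric \(t\) determined by \(t (e_1, e_1) = 1\), \(t (e_1, e_2) = t (e_2, e_1) = 0\), and \(t (e_2, e_2) = -1\), \(i_b (t) \neq 0\), and while \(t_1\) is non-degenerate because \(t_1 (b_{1, 1}, b_{1, 1}) = -1 \neq 0\), \(t_2\) is degenerate because \(t_2 (b_{2, 1}, b_{2, 1}) = 1 + 0 + 0 - 1 = 0\) and \(V_2\) is \(1\)-dimensional.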
3: Proof
Whole Strategy: Step 1: take the transition matrix, \(M'\), such that \((b, b_{2, 1}, ..., b_{2, d' - 1}) = (b, b_{1, 1}, ..., b_{1, d' - 1}) M'\), and see that its \((1, 1)\)-minor, \(M\), is invertible; Step 2: see that \(t_2 (b_{2, j}, \bullet) = M^l_j t (b_{1, l}, \bullet) \vert_{V_2}\); Step 3: see that \(\{r^j t_2 (b_{2, j}, v) \vert v \in V_2\} = \{r^j M^n_j t_1 (b_{1, n}, v) \vert v \in V_1\}\); Step 4: suppose that \(r^j t_2 (b_{2, j}, \bullet) = 0\), see that \(r^j M^n_j t_1 (b_{1, n}, \bullet) = 0\) and \(r^j = 0\), and conclude that \(\widehat{t_2}\) is bijective.
Step 1:
Let us take the transition matrix from \(B_1\) to \(B_2\), \(M'\), such that \((b, b_{2, 1}, ..., b_{2, d' - 1}) = (b, b_{1, 1}, ..., b_{1, d' - 1}) M'\).
\(M'\) is invertible, by the proposition that for any vectors space over any field and any square matrix over the field with dimension equal to or smaller than the dimension of the vectors space, the matrix is invertible if it maps a linearly-independent set of vectors to a linearly-independent set of vectors, and if the matrix is invertible, it maps any linearly-independent set of vectors to a linearly-independent set of vectors.
The 1st column of \(M'\) is \(\begin{pmatrix} 1 \\ 0 \\ ... \\ 0 \end{pmatrix}\), because \(b = b M'^1_1 + b_{1, j} M'^{j + 1}_1\), which implies that \(M'^1_1 = 1\) and \(M'^{j + 1}_1 = 0\): the decomposition of \(b\) with respect to \(B_1\) is unique, by the proposition that for any module with any basis, the components set of any element with respect to the basis is unique.
Let \(M\) be the \((1, 1)\)-minor of \(M'\), with the row index and the column index running over \(\{1, ..., d' - 1\}\), so, \(M^l_j = M'^{l + 1}_{j + 1}\) for each \(j, l \in \{1, ..., d' - 1\}\).
\(\det M \neq 0\), by the proposition that for any square matrix over any field with any \(2\) square diagonal blocks with \(1\) of the rest blocks \(0\), the determinant is nonzero if and only if the determinants of the diagonal blocks are nonzero.
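To spell that out, \(M'\) has the block form \(M' = \begin{pmatrix} 1 & N \\ 0 & M \end{pmatrix}\), writing \(N\) for the rest of the 1st row, \((M'^1_2, ..., M'^1_{d'})\): the \(2\) square diagonal blocks are \((1)\) and \(M\), the lower-left block is \(0\) by the previous paragraph, \(\det (1) = 1 \neq 0\), and \(\det M' \neq 0\) because \(M'\) is invertible, so, the proposition gives \(\det M \neq 0\).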
Step 2:
For each \(j \in \{1, ..., d' - 1\}\), \(b_{2, j} = b M'^1_{j + 1} + b_{1, l} M'^{l + 1}_{j + 1} = b M'^1_{j + 1} + b_{1, l} M^l_j\).
So, \(t_2 (b_{2, j}, \bullet) = t_2 (b M'^1_{j + 1} + b_{1, l} M^l_j, \bullet) = t (b M'^1_{j + 1} + b_{1, l} M^l_j, \bullet) \vert_{V_2} = M'^1_{j + 1} t (b, \bullet) \vert_{V_2} + M^l_j t (b_{1, l}, \bullet) \vert_{V_2} = M'^1_{j + 1} 0 \vert_{V_2} + M^l_j t (b_{1, l}, \bullet) \vert_{V_2}\), because \(i_b (t) = 0\), \(= M^l_j t (b_{1, l}, \bullet) \vert_{V_2}\).
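In other words, as covectors on \(V_2\), \((t_2 (b_{2, 1}, \bullet), ..., t_2 (b_{2, d' - 1}, \bullet)) = (t (b_{1, 1}, \bullet) \vert_{V_2}, ..., t (b_{1, d' - 1}, \bullet) \vert_{V_2}) M\): the \(b\) term drops out and only the minor, \(M\), matters.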
Step 3:
Let \(v = v^m b_{2, m} \in V_2\) be any.
\(v = v^m (b M'^1_{m + 1} + b_{1, n} M^n_m)\).
\(t_2 (b_{2, j}, v) = M^l_j t (b_{1, l}, v^m (b M'^1_{m + 1} + b_{1, n} M^n_m)) = M^l_j t (b_{1, l}, v^m b M'^1_{m + 1} + v^m b_{1, n} M^n_m) = M^l_j t (b_{1, l}, v^m b M'^1_{m + 1}) + M^l_j t (b_{1, l}, v^m b_{1, n} M^n_m) = M^l_j v^m M'^1_{m + 1} t (b_{1, l}, b) + M^l_j t (b_{1, l}, v^m b_{1, n} M^n_m) = + \text{ or } - M^l_j v^m M'^1_{m + 1} t (b, b_{1, l}) + M^l_j t (b_{1, l}, v^m b_{1, n} M^n_m)\), because \(t\) is symmetric or antisymmetric, \(= + \text{ or } - M^l_j v^m M'^1_{m + 1} 0 + M^l_j t (b_{1, l}, v^m b_{1, n} M^n_m)\), because \(i_b (t) = 0\), \(= M^l_j t (b_{1, l}, b_{1, n} M^n_m v^m)\).
For each \(j \in \{1, ..., d' - 1\}\), let \(r^j \in F\) be any.
\(r^j t_2 (b_{2, j}, v) = r^j M^l_j t (b_{1, l}, b_{1, n} M^n_m v^m)\), which is \(r^j M^l_j t (b_{1, l}, \bullet)\) operated on \(b_{1, n} M^n_m v^m\).
But as \(M\) is invertible, while \(v\) ranges over \(V_2\), \((v^1, ..., v^{d' - 1})\) ranges over the whole \(F^{d' - 1}\), and \((M^1_m v^m, ..., M^{d' - 1}_m v^m)^t = M \begin{pmatrix} v^1 \\ ... \\ v^{d' - 1} \end{pmatrix}\) ranges over the whole \(F^{d' - 1}\), because each \(w \in F^{d' - 1}\) is attained by \(\begin{pmatrix} v^1 \\ ... \\ v^{d' - 1} \end{pmatrix} = M^{-1} w\), and so, \(b_{1, n} M^n_m v^m\) ranges over the whole \(V_1\).
So, \(\{r^j t_2 (b_{2, j}, v) \vert v \in V_2\} = \{r^j M^l_j t (b_{1, l}, v) \vert v \in V_1\} = \{r^j M^l_j t_1 (b_{1, l}, v) \vert v \in V_1\}\).
Step 4:
Let us suppose that \(r^j t_2 (b_{2, j}, \bullet) = 0\), in order to see that \((t_2 (b_{2, 1}, \bullet), ..., t_2 (b_{2, d' - 1}, \bullet))\) is linearly independent.
That implies that for each \(v \in V_2\), \((r^j t_2 (b_{2, j}, \bullet)) (v) = 0\), while the left hand side is \(r^j t_2 (b_{2, j}, v)\).
That implies that \(\{r^j t_2 (b_{2, j}, v) \vert v \in V_2\} = \{0\}\).
By Step 3, \(\{r^j M^l_j t_1 (b_{1, l}, v) \vert v \in V_1\} = \{0\}\).
That implies that \(r^j M^l_j t_1 (b_{1, l}, \bullet) = 0\).
As \(t_1\) is non-degenerate, \(\widehat{t_1}: V_1 \to {V_1}^*\) is a 'vectors spaces - linear morphisms' isomorphism, as is mentioned in the definition of non-degenerate \((0, 2)\)-tensor of vectors space.
So, \((t_1 (b_{1, 1}, \bullet), ..., t_1 (b_{1, d' - 1}, \bullet))\) is linearly independent, by the proposition that for any 'vectors spaces - linear morphisms' isomorphism, the image of any linearly independent subset or any basis of the domain is linearly independent or a basis on the codomain.
But as \(M\) is invertible, \((M^l_1 t_1 (b_{1, l}, \bullet), ..., M^l_{d' - 1} t_1 (b_{1, l}, \bullet))\) is linearly independent, by the proposition that for any vectors space over any field and any square matrix over the field with dimension equal to or smaller than the dimension of the vectors space, the matrix is invertible if it maps a linearly-independent set of vectors to a linearly-independent set of vectors, and if the matrix is invertible, it maps any linearly-independent set of vectors to a linearly-independent set of vectors.
That implies that the \(r^j\) s are \(0\), because \(r^j M^l_j t_1 (b_{1, l}, \bullet) = 0\) is a vanishing linear combination of that linearly independent sequence.
So, \((t_2 (b_{2, 1}, \bullet), ..., t_2 (b_{2, d' - 1}, \bullet))\) is linearly independent.
As \({V_2}^*\) is \((d' - 1)\)-dimensional and the image of \(\widehat{t_2}: V_2 \to {V_2}^*\) contains the \(d' - 1\) linearly independent covectors, \(t_2 (b_{2, 1}, \bullet), ..., t_2 (b_{2, d' - 1}, \bullet)\), which span \({V_2}^*\), \(\widehat{t_2}\) is surjective, so, it is bijective, by the proposition that for any linear surjection from any finite-dimensional vectors space, if the dimension of the codomain is equal to or larger than that of the domain, the surjection is a bijection.
So, \(t_2\) is non-degenerate, as is mentioned in the definition of non-degenerate \((0, 2)\)-tensor of vectors space.
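Below is a small numerical sanity check of the proposition (a sketch added for illustration, not part of the proof): it assumes Python with numpy, takes \(F = \mathbb{R}\), \(d' = 4\), \(B_1\) as the standard basis with \(b = e_1\), a random \(B_2\) sharing \(b\), and a random symmetric \(t\) with \(i_b (t) = 0\); the helper name, restriction_matrix, is ad hoc for this illustration, and non-degeneracy of each restriction is tested via the determinant of its component matrix, which is valid because a \((0, 2)\)-tensor on a finite-dimensional space is non-degenerate if and only if its component matrix with respect to any basis is invertible.

```python
# A numerical sanity check of the proposition over F = R (an illustration, not a proof).
# Assumptions: numpy is available; d' = 4; B_1 is the standard basis with b = e_1;
# B_2 shares b and has random other elements; t is a random symmetric (0, 2)-tensor
# with i_b (t) = 0, represented by its component matrix T in the standard basis.
import numpy as np

rng = np.random.default_rng(0)
d = 4  # d' in the article

# B_1 = (b, b_{1,1}, ..., b_{1,d'-1}): the standard basis; b is the shared element.
B1 = np.eye(d)
b = B1[:, 0]

# B_2 = (b, b_{2,1}, ..., b_{2,d'-1}): same b, random remaining elements (a basis with probability 1).
B2 = np.eye(d)
B2[:, 1:] = rng.standard_normal((d, d - 1))
assert abs(np.linalg.det(B2)) > 1e-9

# A symmetric t with i_b (t) = 0: zero out the row and the column of b (= e_1).
A = rng.standard_normal((d, d))
T = A + A.T
T[0, :] = 0.0
T[:, 0] = 0.0

def restriction_matrix(T, vectors):
    """Component matrix of the restriction of t to Span(vectors), with respect to those vectors."""
    V = np.column_stack(vectors)
    return V.T @ T @ V

T1 = restriction_matrix(T, [B1[:, j] for j in range(1, d)])  # t_1 on V_1
T2 = restriction_matrix(T, [B2[:, j] for j in range(1, d)])  # t_2 on V_2

# Non-degenerate <=> invertible component matrix; the proposition says the two agree.
nondeg_1 = abs(np.linalg.det(T1)) > 1e-9
nondeg_2 = abs(np.linalg.det(T2)) > 1e-9
print("t_1 non-degenerate:", nondeg_1, "; t_2 non-degenerate:", nondeg_2)
assert nondeg_1 == nondeg_2
```

The same check can be run for an antisymmetric \(t\) by replacing \(A + A^t\) with \(A - A^t\).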