Showing posts with label Definitions and Propositions. Show all posts

2025-03-02

1028: For Tensor Product of \(k\) Finite-Dimensional Vectors Spaces over Field, Transition of Components of Element w.r.t. Standard Bases w.r.t. Bases for Vectors Spaces Is This


description/proof of that for tensor product of \(k\) finite-dimensional vectors spaces over field, transition of components of element w.r.t. standard bases w.r.t. bases for vectors spaces is this

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for the tensor product of any \(k\) finite-dimensional vectors spaces over any field, the transition of the components of any element w.r.t. the standard bases w.r.t. any bases for the vectors spaces is this.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(\{V_1, ..., V_k\}\): \(\subseteq \{\text{ the finite-dimensional } F \text{ vectors spaces }\}\)
\(V_1 \otimes ... \otimes V_k\): \(= \text{ the tensor product }\)
\(\{B_1, ..., B_k\}\): \(B_j \in \{\text{ the bases for } V_j\} = \{{b_j}_l \vert 1 \le l \le dim V_j\}\)
\(\{B'_1, ..., B'_k\}\): \(B'_j \in \{\text{ the bases for } V_j\} = \{{b'_j}_l \vert 1 \le l \le dim V_j\}\)
\(B\): \(= \{[(({b_1}_{l_1}, ..., {b_k}_{l_k}))] \vert {b_j}_{l_j} \in B_j\}\), \(\in \{\text{ the bases for } V_1 \otimes ... \otimes V_k\}\)
\(B'\): \(= \{[(({b'_1}_{l_1}, ..., {b'_k}_{l_k}))] \vert {b'_j}_{l_j} \in B'_j\}\), \(\in \{\text{ the bases for } V_1 \otimes ... \otimes V_k\}\)
//

Statements:
\({b'_j}_l = {b_j}_m {M_j}^m_l\)
\(\implies\)
\(\forall f = f^{l_1, ..., l_k} [(({b_1}_{l_1}, ..., {b_k}_{l_k}))] = f'^{m_1, ..., m_k} [(({b'_1}_{m_1}, ..., {b'_k}_{m_k}))] \in V_1 \otimes ... \otimes V_k (f'^{l_1, ..., l_k} = {{M_1}^{-1}}^{l_1}_{m_1} ... {{M_k}^{-1}}^{l_k}_{m_k} f^{m_1, ..., m_k})\)
//


2: Proof


Whole Strategy: just apply the proposition that for the tensor product of any \(k\) finite-dimensional vectors spaces over any field, the transition of the standard bases with respect to any bases for the vectors spaces is this, the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this, and the proposition that for the tensors space with respect to any field and any finite number of finite-dimensional the field vectors spaces and the field or the tensor product of any finite-dimensional vectors spaces over any field, the transition of any standard bases or the components is a square matrix, and the inverse matrix is the product of the inverses; Step 1: see that \([(({b'_1}_{l_1}, ..., {b'_k}_{l_k}))] = [(({b_1}_{m_1}, ..., {b_k}_{m_k}))] {M_1}^{m_1}_{l_1} ... {M_k}^{m_k}_{l_k}\); Step 2: conclude the proposition.

Step 1:

\([(({b'_1}_{l_1}, ..., {b'_k}_{l_k}))] = [(({b_1}_{m_1}, ..., {b_k}_{m_k}))] {M_1}^{m_1}_{l_1} ... {M_k}^{m_k}_{l_k}\), by the proposition that for the tensor product of any \(k\) finite-dimensional vectors spaces over any field, the transition of the standard bases with respect to any bases for the vectors spaces is this.

Step 2:

Let us apply the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this.

By the proposition that for the tensors space with respect to any field and any finite number of finite-dimensional the field vectors spaces and the field or the tensor product of any finite-dimensional vectors spaces over any field, the transition of any standard bases or the components is a square matrix, and the inverse matrix is the product of the inverses, \(f'^{l_1, ..., l_k} = {{M_1}^{-1}}^{l_1}_{m_1} ... {{M_k}^{-1}}^{l_k}_{m_k} f^{m_1, ..., m_k}\).
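As a concrete sanity check (not part of the proof), the transition can be verified numerically; the following Python sketch takes \(k = 2\), \(dim V_1 = dim V_2 = 2\), and the old bases to be the canonical bases of \(\mathbb{R}^2\); all the names (`inv2`, `outer`, `assemble`, `M1`, `M2`, `f`) are ours, for illustration only.

```python
def inv2(M):
    """Inverse of a 2x2 matrix (list of rows)."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def outer(u, v):
    """Coefficient array of u tensor v for 2-vectors."""
    return [[u[i] * v[j] for j in range(2)] for i in range(2)]

M1 = [[1.0, 2.0], [0.0, 1.0]]          # b'_l = b_m {M_j}^m_l
M2 = [[3.0, 1.0], [1.0, 1.0]]
b1 = [[1.0, 0.0], [0.0, 1.0]]          # old bases: canonical
b2 = [[1.0, 0.0], [0.0, 1.0]]
b1p = [[M1[m][l] for m in range(2)] for l in range(2)]  # new basis vectors
b2p = [[M2[m][l] for m in range(2)] for l in range(2)]

f = [[1.0, 2.0], [3.0, 4.0]]           # components w.r.t. the old standard basis

N1, N2 = inv2(M1), inv2(M2)
# the claimed transition: f'^{l1, l2} = (M1^{-1})^{l1}_{m1} (M2^{-1})^{l2}_{m2} f^{m1, m2}
fp = [[sum(N1[l1][m1] * N2[l2][m2] * f[m1][m2]
           for m1 in range(2) for m2 in range(2))
       for l2 in range(2)] for l1 in range(2)]

def assemble(comp, c1, c2):
    """Coefficient array of sum_{l1, l2} comp[l1][l2] * c1[l1] tensor c2[l2]."""
    T = [[0.0, 0.0], [0.0, 0.0]]
    for l1 in range(2):
        for l2 in range(2):
            O = outer(c1[l1], c2[l2])
            for i in range(2):
                for j in range(2):
                    T[i][j] += comp[l1][l2] * O[i][j]
    return T

T_old = assemble(f, b1, b2)     # the element in the old standard basis
T_new = assemble(fp, b1p, b2p)  # the same element in the new standard basis
```

Both assemblies represent the same element of \(V_1 \otimes V_2\) as an array of coefficients over the canonical bases, which is exactly the claimed transition of the components.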


References



1027: For Tensors Space w.r.t. Field and \(k\) Finite-Dimensional Vectors Spaces over Field and Field, Transition of Components of Tensor w.r.t. Standard Bases w.r.t. Bases for Vectors Spaces Is This


description/proof of that for tensors space w.r.t. field and \(k\) finite-dimensional vectors spaces over field and field, transition of components of tensor w.r.t. standard bases w.r.t. bases for vectors spaces is this

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for the tensors space with respect to any field and any \(k\) finite-dimensional vectors spaces over the field and the field, the transition of the components of any tensor with respect to the standard bases w.r.t. any bases for the vectors spaces is this.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(\{V_1, ..., V_k\}\): \(\subseteq \{\text{ the finite-dimensional } F \text{ vectors spaces }\}\)
\(L (V_1, ..., V_k: F)\): \(= \text{ the tensors space }\)
\(\{B_1, ..., B_k\}\): \(B_j \in \{\text{ the bases for } V_j\} = \{{b_j}_l \vert 1 \le l \le dim V_j\}\)
\(\{B'_1, ..., B'_k\}\): \(B'_j \in \{\text{ the bases for } V_j\} = \{{b'_j}_l \vert 1 \le l \le dim V_j\}\)
\(\{B^*_1, ..., B^*_k\}\): \(B^*_j = \text{ the dual basis of } B_j = \{{b_j}^l \vert 1 \le l \le dim V_j\}\)
\(\{B'^*_1, ..., B'^*_k\}\): \(B'^*_j = \text{ the dual basis of } B'_j = \{{b'_j}^l \vert 1 \le l \le dim V_j\}\)
\(B^*\): \(= \{{b_1}^{j_1} \otimes ... \otimes {b_k}^{j_k} \vert {b_l}^{j_l} \in B^*_l\}\), \(\in \{\text{ the bases for } L (V_1, ..., V_k: F)\}\)
\(B'^*\): \(= \{{b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k} \vert {b'_l}^{j_l} \in B'^*_l\}\), \(\in \{\text{ the bases for } L (V_1, ..., V_k: F)\}\)
//

Statements:
\({b'_j}_l = {b_j}_m {M_j}^m_l\)
\(\implies\)
\(\forall f = f_{j_1, ..., j_k} {b_1}^{j_1} \otimes ... \otimes {b_k}^{j_k} = f'_{l_1, ..., l_k} {b'_1}^{l_1} \otimes ... \otimes {b'_k}^{l_k} \in L (V_1, ..., V_k: F) (f'_{l_1, ..., l_k} = f_{j_1, ..., j_k} {M_1}^{j_1}_{l_1} ... {M_k}^{j_k}_{l_k})\)
//


2: Proof


Whole Strategy: just apply the proposition that for the tensors space with respect to any field and any \(k\) finite-dimensional vectors spaces over the field and the field, the transition of the standard bases with respect to any bases for the vectors spaces is this, the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this, and the proposition that for the tensors space with respect to any field and any finite number of finite-dimensional the field vectors spaces and the field or the tensor product of any finite-dimensional vectors spaces over any field, the transition of any standard bases or the components is a square matrix, and the inverse matrix is the product of the inverses; Step 1: see that \({b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k} = {{M_1}^{-1}}^{j_1}_{l_1} ... {{M_k}^{-1}}^{j_k}_{l_k} {b_1}^{l_1} \otimes ... \otimes {b_k}^{l_k}\); Step 2: conclude the proposition.

Step 1:

\({b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k} = {{M_1}^{-1}}^{j_1}_{l_1} ... {{M_k}^{-1}}^{j_k}_{l_k} {b_1}^{l_1} \otimes ... \otimes {b_k}^{l_k}\), by the proposition that for the tensors space with respect to any field and any \(k\) finite-dimensional vectors spaces over the field and the field, the transition of the standard bases with respect to any bases for the vectors spaces is this.

Step 2:

Let us apply the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this.

By the proposition that for the tensors space with respect to any field and any finite number of finite-dimensional the field vectors spaces and the field or the tensor product of any finite-dimensional vectors spaces over any field, the transition of any standard bases or the components is a square matrix, and the inverse matrix is the product of the inverses, \(f'_{l_1, ..., l_k} = f_{j_1, ..., j_k} {M_1}^{j_1}_{l_1} ... {M_k}^{j_k}_{l_k}\).
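As a numeric sketch (our own illustration with hypothetical names; \(k = 2\), \(dim V_1 = dim V_2 = 2\), old bases canonical): the new components of a bilinear map are its evaluations on the new basis vectors, and they agree with the claimed transition \(f'_{l_1, l_2} = f_{j_1, j_2} {M_1}^{j_1}_{l_1} {M_2}^{j_2}_{l_2}\).

```python
M1 = [[1, 2], [0, 1]]                  # b'_l = b_m {M_j}^m_l
M2 = [[3, 1], [1, 1]]
f = [[1, 2], [3, 4]]                   # f_{j1, j2} w.r.t. the old (canonical) bases

def t(u, v):
    """The bilinear map whose components in the old bases are f_{j1, j2}."""
    return sum(f[j1][j2] * u[j1] * v[j2] for j1 in range(2) for j2 in range(2))

b1p = [[M1[m][l] for m in range(2)] for l in range(2)]  # new basis vectors
b2p = [[M2[m][l] for m in range(2)] for l in range(2)]

# new components, directly: f'_{l1, l2} = t(b'1_{l1}, b'2_{l2})
fp_direct = [[t(b1p[l1], b2p[l2]) for l2 in range(2)] for l1 in range(2)]
# new components, by the transition formula
fp_formula = [[sum(f[j1][j2] * M1[j1][l1] * M2[j2][l2]
                   for j1 in range(2) for j2 in range(2))
               for l2 in range(2)] for l1 in range(2)]
```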


References



1026: For Tensors Space or Tensor Product of Vectors Spaces, Transition of Standard Bases or Components Is Square Matrix, and Inverse Is Product of Inverses


description/proof of that for tensors space or tensor product of vectors spaces, transition of standard bases or components is square matrix, and inverse is product of inverses

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for the tensors space with respect to any field and any finite number of finite-dimensional the field vectors spaces and the field or the tensor product of any finite-dimensional vectors spaces over any field, the transition of any standard bases or the components is a square matrix, and the inverse matrix is the product of the inverses.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(\{V_1, ..., V_k\}\): \(\subseteq \{\text{ the finite-dimensional } F \text{ vectors spaces }\}\)
\(L (V_1, ..., V_k: F)\): \(= \text{ the tensors space }\)
\(V_1 \otimes ... \otimes V_k\): \(= \text{ the tensor product }\)
\(\{B_1, ..., B_k\}\): \(B_j \in \{\text{ the bases of } V_j\} = \{{b_j}_l \vert 1 \le l \le dim V_j\}\)
\(\{B'_1, ..., B'_k\}\): \(B'_j \in \{\text{ the bases of } V_j\} = \{{b'_j}_l \vert 1 \le l \le dim V_j\}\)
\(\{B^*_1, ..., B^*_k\}\): \(B^*_j = \text{ the dual basis of } B_j = \{{b_j}^l \vert 1 \le l \le dim V_j\}\)
\(\{B'^*_1, ..., B'^*_k\}\): \(B'^*_j = \text{ the dual basis of } B'_j = \{{b'_j}^l \vert 1 \le l \le dim V_j\}\)
\(B^*\): \(= \{{b_1}^{j_1} \otimes ... \otimes {b_k}^{j_k} \vert {b_l}^{j_l} \in B^*_l\}\), \(\in \{\text{ the bases for } L (V_1, ..., V_k: F)\}\)
\(B'^*\): \(= \{{b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k} \vert {b'_l}^{j_l} \in B'^*_l\}\), \(\in \{\text{ the bases for } L (V_1, ..., V_k: F)\}\)
\(B\): \(= \{[(({b_1}_{j_1}, ..., {b_k}_{j_k}))] \vert {b_l}_{j_l} \in B_l\}\), \(\in \{\text{ the bases for } V_1 \otimes ... \otimes V_k\}\)
\(B'\): \(= \{[(({b'_1}_{j_1}, ..., {b'_k}_{j_k}))] \vert {b'_l}_{j_l} \in B'_l\}\), \(\in \{\text{ the bases for } V_1 \otimes ... \otimes V_k\}\)
//

Statements:
\({b'_j}_l = {b_j}_m {M_j}^m_l\)
\(\implies\)
(
(
\({b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k} = {{M_1}^{-1}}^{j_1}_{l_1} ... {{M_k}^{-1}}^{j_k}_{l_k} {b_1}^{l_1} \otimes ... \otimes {b_k}^{l_k}\)
\(\land\)
\(M^{j_1, ..., j_k}_{l_1, ..., l_k} := {{M_1}^{-1}}^{j_1}_{l_1} ... {{M_k}^{-1}}^{j_k}_{l_k}\) is a square matrix
\(\land\)
\({M^{-1}}^{j_1, ..., j_k}_{l_1, ..., l_k} = {M_1}^{j_1}_{l_1} ... {M_k}^{j_k}_{l_k}\)
)
\(\land\)
(
\([(({b'_1}_{j_1}, ..., {b'_k}_{j_k}))] = [(({b_1}_{l_1}, ..., {b_k}_{l_k}))] {M_1}^{l_1}_{j_1} ... {M_k}^{l_k}_{j_k}\)
\(\land\)
\(M^{l_1, ..., l_k}_{j_1, ..., j_k} := {M_1}^{l_1}_{j_1} ... {M_k}^{l_k}_{j_k}\) is a square matrix
\(\land\)
\({M^{-1}}^{l_1, ..., l_k}_{j_1, ..., j_k} = {{M_1}^{-1}}^{l_1}_{j_1} ... {{M_k}^{-1}}^{l_k}_{j_k}\)
)
)
//


2: Proof


Whole Strategy: Step 1: see that the transition for the bases for \(L (V_1, ..., V_k: F)\) holds; Step 2: see that \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) is a square matrix; Step 3: see that the inverse of \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) is as is claimed; Step 4: see that the transition for the bases for \(V_1 \otimes ... \otimes V_k\) holds; Step 5: see that \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\) is a square matrix; Step 6: see that the inverse of \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\) is as is claimed; Step 7: see that also the components transitions are some square matrices.

Step 1:

\(B^*\) and \(B'^*\) are indeed some bases for \(L (V_1, ..., V_k: F)\), by the proposition that for any field and any \(k\) finite-dimensional vectors spaces over the field, the tensors space with respect to the field and the vectors spaces and the field has the basis that consists of the tensor products of the elements of the dual bases of any bases of the vectors spaces.

\({b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k} = {{M_1}^{-1}}^{j_1}_{l_1} ... {{M_k}^{-1}}^{j_k}_{l_k} {b_1}^{l_1} \otimes ... \otimes {b_k}^{l_k}\) holds, by the proposition that for the tensors space with respect to any field and any \(k\) finite-dimensional vectors spaces over the field and the field, the transition of the standard bases with respect to any bases for the vectors spaces is this.

Step 2:

\(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) may not look like a matrix unless \(k = 1\), because it is a multi-dimensional array.

But the set of the combinations, \(J := \{(j_1, ..., j_k) \vert 1 \le j_1 \le dim V_1, ..., 1 \le j_k \le dim V_k\}\), whose order is \(dim V_1 * ... * dim V_k\), can be regarded as a single index set, and the lower index set, \(\{(l_1, ..., l_k) \vert 1 \le l_1 \le dim V_1, ..., 1 \le l_k \le dim V_k\}\), is the same \(J\).

So, \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) can be regarded as a \((dim V_1 * ... * dim V_k) \times (dim V_1 * ... * dim V_k)\) square matrix: the ordering of the index set, \(J\), can be chosen arbitrarily, for example, \((1, ..., 1), (1, ..., 2), ..., (dim V_1, ..., dim V_k)\), which is the most natural one.

Also each of \({b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k}\) and \({b_1}^{l_1} \otimes ... \otimes {b_k}^{l_k}\) can be regarded as a column vector (a kind of matrix) with the chosen order of \(J\).

Then, \({b'_1}^{j_1} \otimes ... \otimes {b'_k}^{j_k} = M^{j_1, ..., j_k}_{l_1, ..., l_k} {b_1}^{l_1} \otimes ... \otimes {b_k}^{l_k}\) is the usual multiplication of the square matrix and the column vector.

In fact, that is natural, because it is a transition of bases for a vectors space: we denote the basis, \(B^*\), as \(\{{b_1}^{j_1} \otimes ... \otimes {b_k}^{j_k}\}\) just because that is convenient for clarifying what each element is, but the basis can also be denoted like \(\{e^1, ..., e^{dim V_1 * ... * dim V_k}\}\).

For any other matrix, \(N^{m_1, ..., m_k}_{j_1, ..., j_k}\), with the chosen order of \(J\), \(N^{m_1, ..., m_k}_{j_1, ..., j_k} M^{j_1, ..., j_k}_{l_1, ..., l_k}\) is the usual multiplication of square matrices.
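The flattening described in this step can be made concrete: for \(k = 2\) with the index order \((1, 1), (1, 2), (2, 1), (2, 2)\), the array \(A^{j_1}_{l_1} B^{j_2}_{l_2}\) becomes the Kronecker product of \(A\) and \(B\), and the product of two such flattened arrays is the usual square-matrix product (the mixed-product property). The following Python sketch (our names, small integer matrices) checks this.

```python
def kron(A, B):
    """Kronecker product: rows indexed by (i1, i2), columns by (j1, j2)."""
    n, m = len(A), len(B)
    return [[A[i1][j1] * B[i2][j2] for j1 in range(n) for j2 in range(m)]
            for i1 in range(n) for i2 in range(m)]

def matmul(X, Y):
    """Usual matrix multiplication."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[1, 2], [0, 1]]
B = [[3, 1], [1, 1]]
C = [[1, 1], [2, 3]]
D = [[0, 1], [1, 0]]

K = kron(A, B)                      # a 4x4 square matrix
# mixed-product property: (A kron B)(C kron D) = (AC) kron (BD)
lhs = matmul(kron(A, B), kron(C, D))
rhs = kron(matmul(A, C), matmul(B, D))
```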

Step 3:

We want to regard \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) as a square matrix because we want to take its inverse, and we want the inverse because the inverse represents the transition of the components, by the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this: the inverse certainly exists, because this is a transition of bases.

The inverse of \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) is the matrix, \(N^{m_1, ..., m_k}_{j_1, ..., j_k}\), such that \(N^{m_1, ..., m_k}_{j_1, ..., j_k} M^{j_1, ..., j_k}_{l_1, ..., l_k} = \delta^{m_1}_{l_1} ... \delta^{m_k}_{l_k}\): the product of the reverse order is automatically guaranteed to be \(I\), because we know that \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) is invertible: from \(N M = I\), \(M N = M N M M^{-1} = M I M^{-1} = I\).

There is \({M_1}^{m_1}_{j_1} ... {M_k}^{m_k}_{j_k}\), which is a \((dim V_1 * ... * dim V_k) \times (dim V_1 * ... * dim V_k)\) matrix.

\({M_1}^{m_1}_{j_1} ... {M_k}^{m_k}_{j_k} M^{j_1, ..., j_k}_{l_1, ..., l_k} = {M_1}^{m_1}_{j_1} ... {M_k}^{m_k}_{j_k} {{M_1}^{-1}}^{j_1}_{l_1} ... {{M_k}^{-1}}^{j_k}_{l_k} = {M_1}^{m_1}_{j_1} {{M_1}^{-1}}^{j_1}_{l_1} ... {M_k}^{m_k}_{j_k} {{M_k}^{-1}}^{j_k}_{l_k} = \delta^{m_1}_{l_1} ... \delta^{m_k}_{l_k}\), which means that \({M_1}^{m_1}_{j_1} ... {M_k}^{m_k}_{j_k}\) is the inverse of \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\).

So, \({M^{-1}}^{j_1, ..., j_k}_{l_1, ..., l_k} = {M_1}^{j_1}_{l_1} ... {M_k}^{j_k}_{l_k}\).

\({M^{-1}}^{j_1, ..., j_k}_{l_1, ..., l_k}\) represents the transition of tensor components, by the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this.
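Numerically (our sketch, hypothetical names): the flattened matrix built from \({M_1}^{-1}\) and \({M_2}^{-1}\) is indeed inverted by the one built from \(M_1\) and \(M_2\), for \(k = 2\) and \(2 \times 2\) blocks.

```python
def inv2(M):
    """Inverse of a 2x2 matrix (list of rows)."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def kron(A, B):
    """Kronecker product with the index order (1,1), (1,2), (2,1), (2,2)."""
    n, m = len(A), len(B)
    return [[A[i1][j1] * B[i2][j2] for j1 in range(n) for j2 in range(m)]
            for i1 in range(n) for i2 in range(m)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

M1 = [[1.0, 2.0], [0.0, 1.0]]
M2 = [[3.0, 1.0], [1.0, 1.0]]

Mflat = kron(inv2(M1), inv2(M2))    # M^{j1, j2}_{l1, l2} := (M1^{-1})(M2^{-1})
Minv_flat = kron(M1, M2)            # claimed inverse: M1^{j1}_{l1} M2^{j2}_{l2}
P = matmul(Minv_flat, Mflat)        # should be the 4x4 identity
```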

Step 4:

\(B\) and \(B'\) are indeed some bases for \(V_1 \otimes ... \otimes V_k\), by the proposition that the tensor product of any \(k\) finite-dimensional vectors spaces has the basis that consists of the classes induced by any basis elements.

\([(({b'_1}_{j_1}, ..., {b'_k}_{j_k}))] = [(({b_1}_{l_1}, ..., {b_k}_{l_k}))] {M_1}^{l_1}_{j_1} ... {M_k}^{l_k}_{j_k}\) holds, by the proposition that for the tensor product of any \(k\) finite-dimensional vectors spaces over any field, the transition of the standard bases with respect to any bases for the vectors spaces is this.

Step 5:

\(M^{l_1, ..., l_k}_{j_1, ..., j_k}\) may not look like a matrix unless \(k = 1\), because it is a multi-dimensional array.

But the set of the combinations, \(J := \{(j_1, ..., j_k) \vert 1 \le j_1 \le dim V_1, ..., 1 \le j_k \le dim V_k\}\), whose order is \(dim V_1 * ... * dim V_k\), can be regarded as a single index set, and the upper index set, \(\{(l_1, ..., l_k) \vert 1 \le l_1 \le dim V_1, ..., 1 \le l_k \le dim V_k\}\), is the same \(J\).

So, \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\) can be regarded as a \((dim V_1 * ... * dim V_k) \times (dim V_1 * ... * dim V_k)\) square matrix: the ordering of the index set, \(J\), can be chosen arbitrarily, for example, \((1, ..., 1), (1, ..., 2), ..., (dim V_1, ..., dim V_k)\), which is the most natural one.

Also each of \([(({b'_1}_{j_1}, ..., {b'_k}_{j_k}))]\) and \([(({b_1}_{l_1}, ..., {b_k}_{l_k}))]\) can be regarded as a row vector (a kind of matrix) with the chosen order of \(J\).

Then, \([(({b'_1}_{j_1}, ..., {b'_k}_{j_k}))] = [(({b_1}_{l_1}, ..., {b_k}_{l_k}))] M^{l_1, ..., l_k}_{j_1, ..., j_k}\) is the usual multiplication of the row vector and the square matrix.

In fact, that is natural, because it is a transition of bases for a vectors space: we denote the basis, \(B\), as \(\{[(({b_1}_{j_1}, ..., {b_k}_{j_k}))]\}\) just because that is convenient for clarifying what each element is, but the basis can also be denoted like \(\{e_1, ..., e_{dim V_1 * ... * dim V_k}\}\).

For any other matrix, \(N^{m_1, ..., m_k}_{l_1, ..., l_k}\), with the chosen order of \(J\), \(N^{m_1, ..., m_k}_{l_1, ..., l_k} M^{l_1, ..., l_k}_{j_1, ..., j_k}\) is the usual multiplication of square matrices.

Step 6:

We want to regard \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\) as a square matrix because we want to take its inverse, and we want the inverse because the inverse represents the transition of the components: the inverse certainly exists, because this is a transition of bases.

The inverse of \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\) is the matrix, \(N^{m_1, ..., m_k}_{l_1, ..., l_k}\), such that \(N^{m_1, ..., m_k}_{l_1, ..., l_k} M^{l_1, ..., l_k}_{j_1, ..., j_k} = \delta^{m_1}_{j_1} ... \delta^{m_k}_{j_k}\): the product of the reverse order is automatically guaranteed to be \(I\), because we know that \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\) is invertible.

There is \({{M_1}^{-1}}^{m_1}_{l_1} ... {{M_k}^{-1}}^{m_k}_{l_k}\), which is a \((dim V_1 * ... * dim V_k) \times (dim V_1 * ... * dim V_k)\) matrix.

\({{M_1}^{-1}}^{m_1}_{l_1} ... {{M_k}^{-1}}^{m_k}_{l_k} M^{l_1, ..., l_k}_{j_1, ..., j_k} = {{M_1}^{-1}}^{m_1}_{l_1} ... {{M_k}^{-1}}^{m_k}_{l_k} {M_1}^{l_1}_{j_1} ... {M_k}^{l_k}_{j_k} = {{M_1}^{-1}}^{m_1}_{l_1} {M_1}^{l_1}_{j_1} ... {{M_k}^{-1}}^{m_k}_{l_k} {M_k}^{l_k}_{j_k} = \delta^{m_1}_{j_1} ... \delta^{m_k}_{j_k}\), which means that \({{M_1}^{-1}}^{m_1}_{l_1} ... {{M_k}^{-1}}^{m_k}_{l_k}\) is the inverse of \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\).

So, \({M^{-1}}^{l_1, ..., l_k}_{j_1, ..., j_k} = {{M_1}^{-1}}^{l_1}_{j_1} ... {{M_k}^{-1}}^{l_k}_{j_k}\).

\({M^{-1}}^{l_1, ..., l_k}_{j_1, ..., j_k}\) represents the transition of tensor components, by the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this.

Step 7:

So, we have also gotten the components transitions, \({M^{-1}}^{j_1, ..., j_k}_{l_1, ..., l_k}\) and \({M^{-1}}^{l_1, ..., l_k}_{j_1, ..., j_k}\), which are likewise square matrices, and their inverses are \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) and \(M^{l_1, ..., l_k}_{j_1, ..., j_k}\).


3: Note


We are talking about the transition of bases or the transition of components, not about the tensor components themselves: for example, a tensor, \(t \in L ({V_1}^*, ..., {V_k}^*, V_1, ..., V_k: F)\), can be expressed with the components with respect to a standard basis as \(t^{j_1, ..., j_k}_{l_1, ..., l_k}\), which resembles \(M^{j_1, ..., j_k}_{l_1, ..., l_k}\) in form, but it is not so natural to regard it as a matrix: thinking of \(t (v^1, ..., v^k, v_1, ..., v_k) = t^{j_1, ..., j_k}_{l_1, ..., l_k} {v^1}_{j_1} ... {v^k}_{j_k} {v_1}^{l_1} ... {v_k}^{l_k}\), in order to regard that as the multiplication of a matrix and a column vector, the column vector would have to be like \(({v^1}_1 ... {v^k}_1 {v_1}^1 ... {v_k}^1, {v^1}_1 ... {v^k}_1 {v_1}^1 ... {v_k}^2, ..., {v^1}_{dim V_1} ... {v^k}_{dim V_k} {v_1}^{dim V_1} ... {v_k}^{dim V_k})^t\) instead of like \(({v^1}_1, ..., {v^1}_{dim V_1}, ..., {v_k}^1, ..., {v_k}^{dim V_k})\), which might not be particularly meaningful (if it is meaningful for your situation, of course, it is fine).

The reason why that might not be meaningful is that \(t\) is not any linear map in general (refer to the proposition that a multilinear map is not necessarily linear): \(t\) is \(: V_1^{*} \times ... \times V_k^{*} \times V_1 \times ... \times V_k \to F\), a non-linear map from a \((2 (dim V_1 + ... + dim V_k))\)-dimensional vectors space into \(F\), and thinking of the \((dim V_1 * ... * dim V_k)^2 \times (dim V_1 * ... * dim V_k)^2\) matrix is not meaningful in general.

Of course, as any matrix is just an arrangement of some ring elements, you can always regard \(t^{j_1, ..., j_k}_{l_1, ..., l_k}\) as a matrix, if you want to.


References



1025: For Finite-Dimensional Vectors Space, Transition of Components of Vector w.r.t. Change of Bases Is This


description/proof of that for finite-dimensional vectors space, transition of components of vector w.r.t. change of bases is this

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(V\): \(\in \{\text{ the } F \text{ vectors spaces }\}\)
\(B\): \(\in \{\text{ the bases for } V\} = \{b_s \vert 1 \le s \le dim V\}\)
\(B'\): \(\in \{\text{ the bases for } V\} = \{b'_s = b_j M_s^j \vert 1 \le s \le dim V\}\)
//

Statements:
\(\forall v = v^j b_j = v'^j b'_j \in V\)
(
\(v'^j = {M^{-1}}^j_l v^l\)
)
//


2: Proof


Whole Strategy: Step 1: for \(b'_j v'^j = b_j v^j\), expand \(b'_j\) with \(b_l\) s, and compare the coefficients of \(b_l\) s on both sides.

Step 1:

From \(b'_j v'^j = b_j v^j\), \(b'_j v'^j = b'_l v'^l = b_j M_l^j v'^l = b_j v^j\).

So, \(M_l^j v'^l = v^j\).

So, \(v'^j = {M^{-1}}^j_l v^l\).
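A minimal numeric sketch (our names, \(dim V = 2\), old basis canonical): with \(b'_j = b_m M^m_j\), the components \(v'^j = {M^{-1}}^j_l v^l\) reassemble the same vector.

```python
def inv2(M):
    """Inverse of a 2x2 matrix (list of rows)."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

M = [[1.0, 2.0], [0.0, 1.0]]    # b'_j = b_m M^m_j
v = [5.0, 7.0]                  # components w.r.t. the old (canonical) basis

Minv = inv2(M)
vp = [sum(Minv[j][l] * v[l] for l in range(2)) for j in range(2)]   # v'^j
bp = [[M[m][j] for m in range(2)] for j in range(2)]                # b'_j as coordinate vectors
v_re = [sum(vp[j] * bp[j][i] for j in range(2)) for i in range(2)]  # v'^j b'_j
```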


References



1024: For Tensor Product of \(k\) Finite-Dimensional Vectors Spaces over Field, Transition of Standard Bases w.r.t. Bases for Vectors Spaces Is This


description/proof of that for tensor product of \(k\) finite-dimensional vectors spaces over field, transition of standard bases w.r.t. bases for vectors spaces is this

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for the tensor product of any \(k\) finite-dimensional vectors spaces over any field, the transition of the standard bases with respect to any bases for the vectors spaces is this.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(\{V_1, ..., V_k\}\): \(\subseteq \{\text{ the finite-dimensional } F \text{ vectors spaces }\}\)
\(V_1 \otimes ... \otimes V_k\): \(= \text{ the tensor product }\)
\(\{B_1, ..., B_k\}\): \(B_j \in \{\text{ the bases for } V_j\} = \{{b_j}_l \vert 1 \le l \le dim V_j\}\)
\(\{B'_1, ..., B'_k\}\): \(B'_j \in \{\text{ the bases for } V_j\} = \{{b'_j}_l \vert 1 \le l \le dim V_j\}\)
\(B\): \(= \{[(({b_1}_{l_1}, ..., {b_k}_{l_k}))] \vert {b_j}_{l_j} \in B_j\}\), \(\in \{\text{ the bases for } V_1 \otimes ... \otimes V_k\}\)
\(B'\): \(= \{[(({b'_1}_{l_1}, ..., {b'_k}_{l_k}))] \vert {b'_j}_{l_j} \in B'_j\}\), \(\in \{\text{ the bases for } V_1 \otimes ... \otimes V_k\}\)
//

Statements:
\({b'_j}_l = {b_j}_m {M_j}^m_l\)
\(\implies\)
\([(({b'_1}_{l_1}, ..., {b'_k}_{l_k}))] = [(({b_1}_{m_1}, ..., {b_k}_{m_k}))] {M_1}^{m_1}_{l_1} ... {M_k}^{m_k}_{l_k}\)
//


2: Proof


Whole Strategy: Step 1: see that \(B\) and \(B'\) are some bases for \(V_1 \otimes ... \otimes V_k\); Step 2: conclude the proposition.

Step 1:

\(B\) and \(B'\) are indeed some bases for \(V_1 \otimes ... \otimes V_k\), by the proposition that the tensor product of any \(k\) finite-dimensional vectors spaces has the basis that consists of the classes induced by any basis elements.

Step 2:

\([(({b'_1}_{l_1}, ..., {b'_k}_{l_k}))] = [(({b_1}_{m_1} {M_1}^{m_1}_{l_1}, ..., {b_k}_{m_k} {M_k}^{m_k}_{l_k}))]\).

We note the fact that in general, \([((v_1, ..., r v_j + r' v'_j, ..., v_k))] = r [((v_1, ..., v_j, ..., v_k))] + r' [((v_1, ..., v'_j, ..., v_k))]\): refer to Note for the definition of tensor product of \(k\) vectors spaces over field.

So, \([(({b_1}_{m_1} {M_1}^{m_1}_{l_1}, ..., {b_k}_{m_k} {M_k}^{m_k}_{l_k}))] = [(({b_1}_{m_1}, ..., {b_k}_{m_k}))] {M_1}^{m_1}_{l_1} ... {M_k}^{m_k}_{l_k}\).
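The transition of the standard basis elements can also be checked numerically (our sketch, \(k = 2\), \(dim V_1 = dim V_2 = 2\), old bases canonical), representing each class \([((u, v))]\) by the coefficient array of \(u \otimes v\).

```python
def outer(u, v):
    """Coefficient array of u tensor v for 2-vectors."""
    return [[u[i] * v[j] for j in range(2)] for i in range(2)]

M1 = [[1.0, 2.0], [0.0, 1.0]]          # b'_l = b_m {M_j}^m_l
M2 = [[3.0, 1.0], [1.0, 1.0]]
b1 = [[1.0, 0.0], [0.0, 1.0]]          # old bases: canonical
b2 = [[1.0, 0.0], [0.0, 1.0]]
b1p = [[M1[m][l] for m in range(2)] for l in range(2)]
b2p = [[M2[m][l] for m in range(2)] for l in range(2)]

ok = True
for l1 in range(2):
    for l2 in range(2):
        lhs = outer(b1p[l1], b2p[l2])   # [((b'1_{l1}, b'2_{l2}))]
        # [((b1_{m1}, b2_{m2}))] {M1}^{m1}_{l1} {M2}^{m2}_{l2}
        rhs = [[0.0, 0.0], [0.0, 0.0]]
        for m1 in range(2):
            for m2 in range(2):
                O = outer(b1[m1], b2[m2])
                for i in range(2):
                    for j in range(2):
                        rhs[i][j] += O[i][j] * M1[m1][l1] * M2[m2][l2]
        for i in range(2):
            for j in range(2):
                ok = ok and abs(lhs[i][j] - rhs[i][j]) < 1e-9
```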


References



2025-02-23

1023: Tensor Product of \(k\) Finite-Dimensional Vectors Spaces Has Basis That Consists of Classes Induced by Basis Elements


description/proof of that tensor product of \(k\) finite-dimensional vectors spaces has basis that consists of classes induced by basis elements

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that the tensor product of any \(k\) finite-dimensional vectors spaces has the basis that consists of the classes induced by any basis elements.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(\{V_1, ..., V_k\}\): \(\subseteq \{\text{ the finite-dimensional } F \text{ vectors spaces }\}\)
\(\{B_1, ..., B_k\}\): \(B_l \in \{\text{ the bases for } V_l\}\)
\(V_1 \otimes ... \otimes V_k\): \(= \text{ the tensor product }\)
\(B\): \(= \{[((b_{1, j_1}, ..., b_{k, j_k}))] \vert b_{l, j_l} \in B_l\}\)
//

Statements:
\(B \in \{\text{ the bases for } V_1 \otimes ... \otimes V_k\}\)
//

Let us call \(B\) "the standard basis with respect to \(\{B_1, ..., B_k\}\)": it is not determined unless \(\{B_1, ..., B_k\}\) is specified.


2: Note


Each \(V_j\) needs to be finite-dimensional, because Proof uses the dual of \(B_j\): the definition of dual basis for covectors (dual) space of basis for finite-dimensional vectors space requires \(V_j\) to be finite-dimensional.


3: Proof


Whole Strategy: Step 1: see that \(B\) spans \(V_1 \otimes ... \otimes V_k\); Step 2: see that \(B\) is linearly independent.

Step 1:

Let us see that \(B\) spans \(V_1 \otimes ... \otimes V_k\).

Each element of \(V_1 \otimes ... \otimes V_k\) is \([r^1 ((v_{1, 1}, ..., v_{1, k})) + ... + r^l ((v_{l, 1}, ..., v_{l, k}))]\) by the definition of tensor product of \(k\) vectors spaces.

\([r^1 ((v_{1, 1}, ..., v_{1, k})) + ... + r^l ((v_{l, 1}, ..., v_{l, k}))] = r^1 [((v_{1, 1}, ..., v_{1, k}))] + ... + r^l [((v_{l, 1}, ..., v_{l, k}))]\), which is by the definition of quotient vectors space of vectors space by sub-'vectors space'.

Let us look at each \(r^j [((v_{j, 1}, ..., v_{j, k}))]\).

\(r^j [((v_{j, 1}, ..., v_{j, k}))] = r^j [((\sum_{m_1 \in S_{1, j}} v_{j, 1}^{m_1} b_{1, m_1}, ..., \sum_{m_k \in S_{k, j}} v_{j, k}^{m_k} b_{k, m_k}))]\), where \(S_{l, j}\) is a finite index set, \(= r^j \sum_{m_1 \in S_{1, j}} v_{j, 1}^{m_1} [((b_{1, m_1}, ..., \sum_{m_k \in S_{k, j}} v_{j, k}^{m_k} b_{k, m_k}))]\), which is by "the 2 legitimate rules", \(= ... = r^j \sum_{m_1 \in S_{1, j}} v_{j, 1}^{m_1} ... \sum_{m_k \in S_{k, j}} v_{j, k}^{m_k} [((b_{1, m_1}, ..., b_{k, m_k}))]\), which means that \(r^j [((v_{j, 1}, ..., v_{j, k}))]\) is a linear combination of \(B\).

So, \([r^1 ((v_{1, 1}, ..., v_{1, k})) + ... + r^l ((v_{l, 1}, ..., v_{l, k}))] = r^1 [((v_{1, 1}, ..., v_{1, k}))] + ... + r^l [((v_{l, 1}, ..., v_{l, k}))]\) is a linear combination of \(B\).

So, \(B\) spans \(V_1 \otimes ... \otimes V_k\).

Note that this step did not use finite-dimensional-ness of \(V_j\) s, so, \(B\) spans \(V_1 \otimes ... \otimes V_k\) even when \(V_j\) s are infinite-dimensional.

Step 2:

Let us see that \(B\) is linearly independent.

Let \(r^1 [((b_{1, j_{1, 1}}, ..., b_{k, j_{1, k}}))] + ... + r^l [((b_{1, j_{l, 1}}, ..., b_{k, j_{l, k}}))] = 0\).

Let us take the dual basis of each \(B_j\), \(B^*_j = \{b_j^l\}\).

Let us take the multilinear map, \(f_{j_{m, 1}, ..., j_{m, k}}: V_1 \times ... \times V_k \to F = b_1^{j_{m, 1}} \otimes ... \otimes b_k^{j_{m, k}}\).

By the proposition that for any multilinear map from any finite product vectors space, there is the unique linear map from the tensor product of the finite number of vectors spaces such that the multilinear map is the linear map after the canonical map from the product vectors space into the tensor product, there is the unique linear map, \(f'_{j_{m, 1}, ..., j_{m, k}}: V_1 \otimes ... \otimes V_k \to F\), such that \(f_{j_{m, 1}, ..., j_{m, k}} = f'_{j_{m, 1}, ..., j_{m, k}} \circ g\), where \(g: V_1 \times ... \times V_k \to V_1 \otimes ... \otimes V_k, (v_1, ..., v_k) \mapsto [((v_1, ..., v_k))]\).

\(0 = f'_{j_{m, 1}, ..., j_{m, k}} (0) = f'_{j_{m, 1}, ..., j_{m, k}} (r^1 [((b_{1, j_{1, 1}}, ..., b_{k, j_{1, k}}))] + ... + r^l [((b_{1, j_{l, 1}}, ..., b_{k, j_{l, k}}))]) = r^1 f'_{j_{m, 1}, ..., j_{m, k}} ([((b_{1, j_{1, 1}}, ..., b_{k, j_{1, k}}))]) + ... + r^l f'_{j_{m, 1}, ..., j_{m, k}} ([((b_{1, j_{l, 1}}, ..., b_{k, j_{l, k}}))])\), because \(f'_{j_{m, 1}, ..., j_{m, k}}\) is linear, but \(f'_{j_{m, 1}, ..., j_{m, k}} ([((b_{1, j_{n, 1}}, ..., b_{k, j_{n, k}}))]) = f'_{j_{m, 1}, ..., j_{m, k}} \circ g ((b_{1, j_{n, 1}}, ..., b_{k, j_{n, k}})) = f_{j_{m, 1}, ..., j_{m, k}} ((b_{1, j_{n, 1}}, ..., b_{k, j_{n, k}})) = b_1^{j_{m, 1}} \otimes ... \otimes b_k^{j_{m, k}} ((b_{1, j_{n, 1}}, ..., b_{k, j_{n, k}})) = b_1^{j_{m, 1}} (b_{1, j_{n, 1}}) ... b_k^{j_{m, k}} (b_{k, j_{n, k}}) = \delta_{m, n}\), so, \(0 = r^m\).

So, \(B\) is linearly independent.
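The dual-basis argument of Step 2 can also be sketched numerically (assuming the Kronecker-product model of \(\mathbb{R}^2 \otimes \mathbb{R}^2\) and a hypothetical non-standard basis):

```python
import numpy as np

# Model R^2 (x) R^2 as R^4 via the Kronecker product (an assumption of this
# sketch); B below is a hypothetical non-standard basis, columns b_1, b_2.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
D = np.linalg.inv(B)          # rows of D are the dual basis b^1, b^2

def functional(m, n, t):
    # (b^m (x) b^n)(t) for t in R^2 (x) R^2, modeled as R^4
    return np.kron(D[m], D[n]) @ t

# b^m (x) b^n applied to [((b_p, b_q))] gives the Kronecker delta,
# which is what extracts each coefficient r^m in Step 2.
for m in range(2):
    for n in range(2):
        for p in range(2):
            for q in range(2):
                val = functional(m, n, np.kron(B[:, p], B[:, q]))
                assert np.isclose(val, float(m == p) * float(n == q))
```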


References



1022: For Multilinear Map from Finite Product Vectors Space, There Is Unique Linear Map from Tensor Product of Finite Vectors Spaces s.t. Multilinear Map Is Linear Map after Canonical Map from Product Vectors Space into Tensor Product


description/proof of that for multilinear map from finite product vectors space, there is unique linear map from tensor product of finite vectors spaces s.t. multilinear map is linear map after canonical map from product vectors space into tensor product

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any multilinear map from any finite product vectors space, there is the unique linear map from the tensor product of the finite number of vectors spaces such that the multilinear map is the linear map after the canonical map from the product vectors space into the tensor product.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(\{V_1, ..., V_k\}\): \(\subseteq \{\text{ the } F \text{ vectors spaces }\}\)
\(V\): \(\in \{\text{ the } F \text{ vectors spaces }\}\)
\(V_1 \times ... \times V_k\): \(= \text{ the product vectors space }\)
\(V_1 \otimes ... \otimes V_k\): \(= \text{ the tensor product }\)
\(f\): \(: V_1 \times ... \times V_k \to V\), \(\in \{\text{ the multilinear maps }\}\)
\(g\): \(: V_1 \times ... \times V_k \to V_1 \otimes ... \otimes V_k, (v_1, ..., v_k) \mapsto [((v_1, ..., v_k))]\)
//

Statements:
\(!\exists f': V_1 \otimes ... \otimes V_k \to V \in \{\text{ the linear maps }\} (f = f' \circ g)\)
//


2: Note


Neither the \(V_j\)s nor \(V\) needs to be finite-dimensional.


3: Proof


Whole Strategy: Step 1: see that there is only 1 option for \(f'\) to satisfy the requirements; Step 2: see that it is well-defined; Step 3: see that it satisfies the requirements.

Step 1:

Let \([v] \in V_1 \otimes ... \otimes V_k\) be any.

Although \(v\) is not uniquely determined, once \(v\) is determined, \(v \in F (V_1 \times ... \times V_k, F)\) and \(v\) is uniquely expressed as \(v = r^1 ((v_{1, 1}, ..., v_{1, k})) + ... + r^l ((v_{l, 1}, ..., v_{l, k}))\), by the proposition that for any module with any basis, the components set of any element with respect to the basis is unique, because \(\{((v_1, ..., v_k)) \vert v_j \in V_j\}\) is a basis for \(F (V_1 \times ... \times V_k, F)\).

\([v] = [r^1 ((v_{1, 1}, ..., v_{1, k})) + ... + r^l ((v_{l, 1}, ..., v_{l, k}))]\).

\( = r^1 [((v_{1, 1}, ..., v_{1, k}))] + ... + r^l [((v_{l, 1}, ..., v_{l, k}))]\).

As \(f'\) needs to be linear, \(f' ([v]) = f' (r^1 [((v_{1, 1}, ..., v_{1, k}))] + ... + r^l [((v_{l, 1}, ..., v_{l, k}))]) = r^1 f' ([((v_{1, 1}, ..., v_{1, k}))]) + ... + r^l f' ([((v_{l, 1}, ..., v_{l, k}))])\) is required.

\(f ((v_{j, 1}, ..., v_{j, k})) = f' \circ g ((v_{j, 1}, ..., v_{j, k}))\) is required.

\(= f' ([((v_{j, 1}, ..., v_{j, k}))])\), so, \(f' ([v])\) is required to be \(r^1 f ((v_{1, 1}, ..., v_{1, k})) + ... + r^l f ((v_{l, 1}, ..., v_{l, k}))\).

So, if any \(f'\) exists, \(f'\) is uniquely determined.

But of course, we need to confirm that \(f' ([v]) = r^1 f ((v_{1, 1}, ..., v_{1, k})) + ... + r^l f ((v_{l, 1}, ..., v_{l, k}))\) is well-defined.

Step 2:

Once \(v\) is determined, \(r^1 f ((v_{1, 1}, ..., v_{1, k})) + ... + r^l f ((v_{l, 1}, ..., v_{l, k}))\) is uniquely determined.

The only remaining issue is to confirm that it does not depend on the choice of \(v\).

For any other choice, \(v' \in F (V_1 \times ... \times V_k, F)\), such that \([v] = [v']\), \(v' - v \in (S)\), where \((S)\) is the sub-'vectors space' generated by \(S := \{((v_1, ..., r v_j, ..., v_k)) - r ((v_1, ..., v_k)) \in F (V_1 \times ... \times V_k) \vert r \in F, v_1 \in V_1, ..., v_k \in V_k\} \cup \{((v_1, ..., v_j + v'_j, ..., v_k)) - ((v_1, ..., v_j, ..., v_k)) - ((v_1, ..., v'_j, ..., v_k)) \in F (V_1 \times ... \times V_k) \vert v_1 \in V_1, ..., v_k \in V_k, v'_j \in V_j\}\).

As each element of \((S)\) is a linear combination of \(S\), \(v' - v = s^1 (((v_{1, 1}, ..., r^1 v_{1, j}, ..., v_{1, k})) - r^1 ((v_{1, 1}, ..., v_{1, k}))) + ... + s^l (((v_{l, 1}, ..., r^l v_{l, j}, ..., v_{l, k})) - r^l ((v_{l, 1}, ..., v_{l, k}))) + t^1 (((w_{1, 1}, ..., w_{1, j_1} + w'_{1, j_1}, ..., w_{1, k})) - ((w_{1, 1}, ..., w_{1, j_1}, ..., w_{1, k})) - ((w_{1, 1}, ..., w'_{1, j_1}, ..., w_{1, k}))) + ... + t^m (((w_{m, 1}, ..., w_{m, j_m} + w'_{m, j_m}, ..., w_{m, k})) - ((w_{m, 1}, ..., w_{m, j_m}, ..., w_{m, k})) - ((w_{m, 1}, ..., w'_{m, j_m}, ..., w_{m, k})))\).

Note that while each \(((\bullet))\) expression is a basis element for \(F (V_1 \times ... \times V_k, F)\), the above expression of \(v' - v\) may contain some duplications of basis elements, but nevertheless, for any \(a^1 ((v_{1, 1}, ..., v_{1, k})) + ... + a^l ((v_{l, 1}, ..., v_{l, k}))\) with some possible duplications among the basis elements, \(f' ([a^1 ((v_{1, 1}, ..., v_{1, k})) + ... + a^l ((v_{l, 1}, ..., v_{l, k}))]) = a^1 f ((v_{1, 1}, ..., v_{1, k})) + ... + a^l f ((v_{l, 1}, ..., v_{l, k}))\) holds, because supposing \(((v_{1, 1}, ..., v_{1, k})) = ((v_{2, 1}, ..., v_{2, k}))\), \(f' ([a^1 ((v_{1, 1}, ..., v_{1, k})) + ... + a^l ((v_{l, 1}, ..., v_{l, k}))]) = f' ([(a^1 + a^2) ((v_{1, 1}, ..., v_{1, k})) + a^3 ((v_{3, 1}, ..., v_{3, k})) + ... + a^l ((v_{l, 1}, ..., v_{l, k}))]) = (a^1 + a^2) f ((v_{1, 1}, ..., v_{1, k})) + a^3 f ((v_{3, 1}, ..., v_{3, k})) + ... + a^l f ((v_{l, 1}, ..., v_{l, k})) = a^1 f ((v_{1, 1}, ..., v_{1, k})) + a^2 f ((v_{1, 1}, ..., v_{1, k})) + a^3 f ((v_{3, 1}, ..., v_{3, k})) + ... + a^l f ((v_{l, 1}, ..., v_{l, k})) = a^1 f ((v_{1, 1}, ..., v_{1, k})) + a^2 f ((v_{2, 1}, ..., v_{2, k})) + a^3 f ((v_{3, 1}, ..., v_{3, k})) + ... + a^l f ((v_{l, 1}, ..., v_{l, k}))\).

So, \(f' ([v']) = f' ([v + s^1 (((v_{1, 1}, ..., r^1 v_{1, j}, ..., v_{1, k})) - r^1 ((v_{1, 1}, ..., v_{1, k}))) + ... + s^l (((v_{l, 1}, ..., r^l v_{l, j}, ..., v_{l, k})) - r^l ((v_{l, 1}, ..., v_{l, k}))) + t^1 (((w_{1, 1}, ..., w_{1, j_1} + w'_{1, j_1}, ..., w_{1, k})) - ((w_{1, 1}, ..., w_{1, j_1}, ..., w_{1, k})) - ((w_{1, 1}, ..., w'_{1, j_1}, ..., w_{1, k}))) + ... + t^m (((w_{m, 1}, ..., w_{m, j_m} + w'_{m, j_m}, ..., w_{m, k})) - ((w_{m, 1}, ..., w_{m, j_m}, ..., w_{m, k})) - ((w_{m, 1}, ..., w'_{m, j_m}, ..., w_{m, k})))]) = f' [v] + s^1 f ((v_{1, 1}, ..., r^1 v_{1, j}, ..., v_{1, k})) - s^1 r^1 f ((v_{1, 1}, ..., v_{1, k})) + ... + s^l f ((v_{l, 1}, ..., r^l v_{l, j}, ..., v_{l, k})) - s^l r^l f ((v_{l, 1}, ..., v_{l, k})) + t^1 f ((w_{1, 1}, ..., w_{1, j_1} + w'_{1, j_1}, ..., w_{1, k})) - t^1 f ((w_{1, 1}, ..., w_{1, j_1}, ..., w_{1, k})) - t^1 f ((w_{1, 1}, ..., w'_{1, j_1}, ..., w_{1, k})) + ... + t^m f ((w_{m, 1}, ..., w_{m, j_m} + w'_{m, j_m}, ..., w_{m, k})) - t^m f ((w_{m, 1}, ..., w_{m, j_m}, ..., w_{m, k})) - t^m f ((w_{m, 1}, ..., w'_{m, j_m}, ..., w_{m, k}))\).

For each \(s^n f ((v_{n, 1}, ..., r^n v_{n, j_n}, ..., v_{n, k})) - s^n r^n f ((v_{n, 1}, ..., v_{n, k}))\), it is \(0\), because \(s^n f ((v_{n, 1}, ..., r^n v_{n, j_n}, ..., v_{n, k})) = s^n r^n f ((v_{n, 1}, ... , v_{n, k}))\), because \(f\) is multilinear.

For each \(t^n f ((w_{n, 1}, ..., w_{n, j_n} + w'_{n, j_n}, ..., w_{n, k})) - t^n f ((w_{n, 1}, ..., w_{n, j_n}, ..., w_{n, k})) - t^n f ((w_{n, 1}, ..., w'_{n, j_n}, ..., w_{n, k}))\), it is \(0\), because \(t^n f ((w_{n, 1}, ..., w_{n, j_n} + w'_{n, j_n}, ..., w_{n, k})) = t^n f ((w_{n, 1}, ..., w_{n, j_n}, ..., w_{n, k})) + t^n f ((w_{n, 1}, ..., w'_{n, j_n}, ..., w_{n, k}))\), because \(f\) is multilinear.

So, \(f' [v'] = f' [v]\).
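The well-definedness argument can be sketched concretely by modeling \(F (V_1 \times ... \times V_k, F)\) as a dictionary from points to coefficients (a hypothetical model with \(k = 2\), vectors as tuples, and a sample bilinear \(f\); the concrete numbers are arbitrary):

```python
# Elements of F(V_1 x V_2, F) as dicts {(v, w): coefficient}, vectors as tuples;
# f below is a sample bilinear map (a hypothetical choice for this sketch).
def f(v, w):
    return v[0] * w[0] + 2 * v[1] * w[1]

def f_prime(elem):
    # extend f linearly over the formal sum represented by elem
    return sum(c * f(v, w) for (v, w), c in elem.items())

def add(a, b):
    out = dict(a)
    for k, c in b.items():
        out[k] = out.get(k, 0) + c
    return out

def scale(r, v):
    return tuple(r * x for x in v)

v1, v2 = (1, 2), (3, -1)
r = 5
# a generator of (S): ((v_1, r v_2)) - r ((v_1, v_2))
rel = {(v1, scale(r, v2)): 1, (v1, v2): -r}
assert f_prime(rel) == 0    # f' vanishes on (S), because f is multilinear

v = {(v1, v2): 3}           # one representative
v_alt = add(v, rel)         # another representative of the same class
assert f_prime(v) == f_prime(v_alt)
```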

Step 3:

Let us see that \(f'\) is indeed linear.

Let \([v], [v'] \in V_1 \otimes ... \otimes V_k\) be any.

\([v] = [r^1 ((v_{1, 1}, ..., v_{1, k})) + ... + r^l ((v_{l, 1}, ..., v_{l, k}))]\) and \([v'] = [r'^1 ((v'_{1, 1}, ..., v'_{1, k})) + ... + r'^m ((v'_{m, 1}, ..., v'_{m, k}))]\).

\(r [v] + r' [v'] = [r r^1 ((v_{1, 1}, ..., v_{1, k})) + ... + r r^l ((v_{l, 1}, ..., v_{l, k})) + r' r'^1 ((v'_{1, 1}, ..., v'_{1, k})) + ... + r' r'^m ((v'_{m, 1}, ..., v'_{m, k}))]\).

\(f' (r [v] + r' [v']) = r r^1 f ((v_{1, 1}, ..., v_{1, k})) + ... + r r^l f ((v_{l, 1}, ..., v_{l, k})) + r' r'^1 f ((v'_{1, 1}, ..., v'_{1, k})) + ... + r' r'^m f ((v'_{m, 1}, ..., v'_{m, k}))\): as before any duplication of basis elements does not matter, \(= r (r^1 f ((v_{1, 1}, ..., v_{1, k})) + ... + r^l f ((v_{l, 1}, ..., v_{l, k}))) + r' (r'^1 f ((v'_{1, 1}, ..., v'_{1, k})) + ... + r'^m f ((v'_{m, 1}, ..., v'_{m, k}))) = r f' ([v]) + r' f' ([v'])\).

Let us reconfirm that \(f = f' \circ g\).

For each \((v_1, ..., v_k) \in V_1 \times ... \times V_k\), \(f' \circ g ((v_1, ..., v_k)) = f' ([((v_1, ..., v_k))]) = f ((v_1, ..., v_k))\).
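That \(f = f' \circ g\) can be checked in a small model where elements of \(F (V_1 \times V_2, F)\) are dictionaries from points to coefficients (the sample bilinear \(f\) and the concrete points are hypothetical choices for this sketch):

```python
def f(v, w):
    return v[0] * w[0] + 2 * v[1] * w[1]   # a sample bilinear map to F = R

def g(v, w):
    # the canonical map: (v, w) |-> the function ((v, w)), value 1 at (v, w)
    return {(v, w): 1}

def f_prime(elem):
    # the induced linear map on formal sums
    return sum(c * f(v, w) for (v, w), c in elem.items())

for v, w in [((1, 2), (3, -1)), ((0, 1), (1, 1))]:
    assert f_prime(g(v, w)) == f(v, w)     # f = f' o g at these points
```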


References



1021: Tensor Product of \(k\) Vectors Spaces over Field


definition of tensor product of \(k\) vectors spaces over field

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a definition of tensor product of \(k\) vectors spaces over field.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\( F\): \(\in \{\text{ the fields }\}\)
\( \{V_1, ..., V_k\}\): \(\subseteq \{\text{ the } F \text{ vectors spaces }\}\)
\( V_1 \times ... \times V_k\): \(= \text{ the product set }\)
\( F (V_1 \times ... \times V_k, F)\): \(= \text{ the free vectors space }\)
\( S\): \(= \{((v_1, ..., r v_j, ..., v_k)) - r ((v_1, ..., v_k)) \in F (V_1 \times ... \times V_k) \vert r \in F, v_1 \in V_1, ..., v_k \in V_k\} \cup \{((v_1, ..., v_j + v'_j, ..., v_k)) - ((v_1, ..., v_j, ..., v_k)) - ((v_1, ..., v'_j, ..., v_k)) \in F (V_1 \times ... \times V_k) \vert v_1 \in V_1, ..., v_k \in V_k, v'_j \in V_j\}\)
\( (S)\): \(= \text{ the sub-'vectors space' generated by subset of vectors space }\)
\(*V_1 \otimes ... \otimes V_k\): \(= F (V_1 \times ... \times V_k) / (S)\), the quotient vectors space
//

Conditions:
//

For each \((v_1, ..., v_k) \in V_1 \times ... \times V_k\), \([f_{(v_1, ..., v_k)}] = [((v_1, ..., v_k))]\) (where \(f_{(v_1, ..., v_k)}: V_1 \times ... \times V_k \to F \in F (V_1 \times ... \times V_k, F)\) is the function that maps \((v_1, ..., v_k)\) to \(1\) and maps the other elements to \(0\)) is often denoted as \(v_1 \otimes ... \otimes v_k\) and called "the tensor product of \(v_1, ..., v_k\)", but this site finds that notation liable to be confused with the tensor product of tensors and prefers writing just \([((v_1, ..., v_k))]\). For the reason why the notation \(((v_1, ..., v_k))\) is used, see the Note for the definition of free vectors space on set: \((v_1, ..., v_k) \in V_1 \times ... \times V_k\), while \(((v_1, ..., v_k)) \in F (V_1 \times ... \times V_k, F)\).


2: Note


We need to be careful not to do something like "\([r ((v_1, ..., v_k)) + r' ((v'_1, ..., v'_k))] = [(r (v_1, ..., v_k)) + (r' (v'_1, ..., v'_k))] = [((r v_1, ..., r v_k)) + ((r' v'_1, ..., r' v'_k))] = [((r v_1 , ..., r v_k) + (r' v'_1, ..., r' v'_k))] = [((r v_1 + r' v'_1, ..., r v_k + r' v'_k))]\)", which is wrong: the reason is described in Note for the definition of free vectors space on set: the 1st equality is wrong because \(r ((v_1, ..., v_k)) + r' ((v'_1, ..., v'_k))\) is the function that maps \((v_1, ..., v_k)\) to \(r\) and maps \((v'_1, ..., v'_k)\) to \(r'\), while \((r (v_1, ..., v_k)) + (r' (v'_1, ..., v'_k))\) is the function that maps \(r (v_1, ..., v_k)\) to \(1\) and maps \(r' (v'_1, ..., v'_k)\) to \(1\) but maps \((v_1, ..., v_k)\) to \(0\) and maps \((v'_1, ..., v'_k)\) to \(0\); the 3rd equality is wrong because \(((r v_1, ..., r v_k)) + ((r' v'_1, ..., r' v'_k))\) is the function that maps \((r v_1, ..., r v_k)\) to \(1\) and maps \((r' v'_1, ..., r' v'_k)\) to \(1\), while \(((r v_1 , ..., r v_k) + (r' v'_1, ..., r' v'_k))\) is the function that maps \((r v_1 , ..., r v_k) + (r' v'_1, ..., r' v'_k)\) to \(1\) but maps \((r v_1, ..., r v_k)\) to \(0\) and maps \((r' v'_1, ..., r' v'_k)\) to \(0\).
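The point that \(r ((v_1, ..., v_k)) + r' ((v'_1, ..., v'_k))\) and \(((r v_1 + r' v'_1, ..., r v_k + r' v'_k))\) are different functions can be seen in a small model (vectors as tuples, functions on \(V_1 \times V_2\) as dictionaries; the concrete numbers are hypothetical):

```python
# Vectors as tuples; an element of the free vectors space as a dict from
# points of V_1 x V_2 to coefficients (all concrete numbers are hypothetical).
v  = ((1, 0), (0, 1))    # a point of V_1 x V_2
vp = ((0, 1), (1, 0))    # another point
r, rp = 2, 3

# r ((v)) + r' ((v')): the function with value r at v and value r' at v'
lhs = {v: r, vp: rp}

# ((r v_1 + r' v'_1, r v_2 + r' v'_2)): the function with value 1 at one new point
merged = tuple(tuple(r * a + rp * b for a, b in zip(x, y)) for x, y in zip(v, vp))
rhs = {merged: 1}

assert lhs != rhs        # different functions, as the Note warns
```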

These are 2 legitimate rules: 1) \([((v_1, ..., r v_j, ..., v_k))] = r [((v_1, ..., v_k))]\); 2) \([((v_1, ..., v_j + v'_j, ..., v_k))] = [((v_1, ..., v_j, ..., v_k))] + [((v_1, ..., v'_j, ..., v_k))]\).

1) is because \([((v_1, ..., r v_j, ..., v_k))] = [((v_1, ..., r v_j, ..., v_k)) - (((v_1, ..., r v_j, ..., v_k)) - r ((v_1, ..., v_k)))] = [r ((v_1, ..., v_k))] = r [((v_1, ..., v_k))]\).

2) is because \([((v_1, ..., v_j + v'_j, ..., v_k))] = [((v_1, ..., v_j + v'_j, ..., v_k)) - (((v_1, ..., v_j + v'_j, ..., v_k)) - ((v_1, ..., v_j, ..., v_k)) - ((v_1, ..., v'_j, ..., v_k)))] = [((v_1, ..., v_j, ..., v_k)) + ((v_1, ..., v'_j, ..., v_k))] = [((v_1, ..., v_j, ..., v_k))] + [((v_1, ..., v'_j, ..., v_k))]\).
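The 2 legitimate rules agree with the familiar Kronecker-product realization of the finite-dimensional tensor product (a numerical sketch; NumPy's `kron` is assumed as the model, and the dimensions and scalars are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(1)
v1 = rng.standard_normal(3)    # a vector in V_1 = R^3 (hypothetical choice)
v2 = rng.standard_normal(4)    # a vector in V_2 = R^4
w2 = rng.standard_normal(4)    # another vector in V_2
r = 2.5

# Rule 1): [((v_1, r v_2))] = r [((v_1, v_2))]
assert np.allclose(np.kron(v1, r * v2), r * np.kron(v1, v2))

# Rule 2): [((v_1, v_2 + w_2))] = [((v_1, v_2))] + [((v_1, w_2))]
assert np.allclose(np.kron(v1, v2 + w2), np.kron(v1, v2) + np.kron(v1, w2))
```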


References

