2025-03-02

1026: For Tensors Space or Tensor Product of Vectors Spaces, Transition of Standard Bases or Components Is Square Matrix, and Inverse Is Product of Inverses

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of that for tensors space or tensor product of vectors spaces, transition of standard bases or components is square matrix, and inverse is product of inverses

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for the tensors space with respect to any field and any finite number of finite-dimensional the field vectors spaces and the field or the tensor product of any finite-dimensional vectors spaces over any field, the transition of any standard bases or the components is a square matrix, and the inverse matrix is the product of the inverses.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
$F$: $\in \{ \text{ the fields } \}$
$\{V_1, ..., V_k\}$: $\subseteq \{ \text{ the finite-dimensional } F \text{ vectors spaces } \}$
$L(V_1, ..., V_k: F)$: $=$ the tensors space with respect to $F$ and $V_1, ..., V_k$ and $F$
$V_1 \otimes ... \otimes V_k$: $=$ the tensor product of $V_1, ..., V_k$
$\{B_1, ..., B_k\}$: $B_j \in \{ \text{ the bases of } V_j \} = \{ b_{j, l} \vert 1 \le l \le \dim V_j \}$
$\{B'_1, ..., B'_k\}$: $B'_j \in \{ \text{ the bases of } V_j \} = \{ b'_{j, l} \vert 1 \le l \le \dim V_j \}$
$\{B^*_1, ..., B^*_k\}$: $B^*_j =$ the dual basis of $B_j = \{ b^{j, l} \vert 1 \le l \le \dim V_j \}$
$\{{B'}^*_1, ..., {B'}^*_k\}$: ${B'}^*_j =$ the dual basis of $B'_j = \{ {b'}^{j, l} \vert 1 \le l \le \dim V_j \}$
$B^*$: $= \{ b^{1, j_1} \otimes ... \otimes b^{k, j_k} \vert b^{l, j_l} \in B^*_l \}$, $\in \{ \text{ the bases for } L(V_1, ..., V_k: F) \}$
${B'}^*$: $= \{ {b'}^{1, j_1} \otimes ... \otimes {b'}^{k, j_k} \vert {b'}^{l, j_l} \in {B'}^*_l \}$, $\in \{ \text{ the bases for } L(V_1, ..., V_k: F) \}$
$B$: $= \{ [((b_{1, j_1}, ..., b_{k, j_k}))] \vert b_{l, j_l} \in B_l \}$, $\in \{ \text{ the bases for } V_1 \otimes ... \otimes V_k \}$
$B'$: $= \{ [((b'_{1, j_1}, ..., b'_{k, j_k}))] \vert b'_{l, j_l} \in B'_l \}$, $\in \{ \text{ the bases for } V_1 \otimes ... \otimes V_k \}$
//

Statements:
$b'_{j, l} = b_{j, m} {M_j}_l{}^m$

$\implies$

(
(
${b'}^{1, j_1} \otimes ... \otimes {b'}^{k, j_k} = {M_1^{-1}}_{l_1}{}^{j_1} ... {M_k^{-1}}_{l_k}{}^{j_k} b^{1, l_1} \otimes ... \otimes b^{k, l_k}$

$\land$

$M_{l_1, ..., l_k}{}^{j_1, ..., j_k} := {M_1^{-1}}_{l_1}{}^{j_1} ... {M_k^{-1}}_{l_k}{}^{j_k}$ is a square matrix

$\land$

${M^{-1}}_{l_1, ..., l_k}{}^{j_1, ..., j_k} = {M_1}_{l_1}{}^{j_1} ... {M_k}_{l_k}{}^{j_k}$
)

$\land$

(
$[((b'_{1, j_1}, ..., b'_{k, j_k}))] = [((b_{1, l_1}, ..., b_{k, l_k}))] {M_1}_{j_1}{}^{l_1} ... {M_k}_{j_k}{}^{l_k}$

$\land$

$M_{j_1, ..., j_k}{}^{l_1, ..., l_k} := {M_1}_{j_1}{}^{l_1} ... {M_k}_{j_k}{}^{l_k}$ is a square matrix

$\land$

${M^{-1}}_{j_1, ..., j_k}{}^{l_1, ..., l_k} = {M_1^{-1}}_{j_1}{}^{l_1} ... {M_k^{-1}}_{j_k}{}^{l_k}$
)
)
//
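As a concrete sanity check (not part of the proof), the claimed relations can be verified numerically in a small case, $k = 2$ with $\dim V_1 = \dim V_2 = 2$. The matrices below are arbitrary invertible examples, and the big transition matrix is realized as a Kronecker product with the lexicographic order of the multi-index:

```python
from itertools import product

def matmul(A, B):
    # usual multiplication of nested-list matrices
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(A):
    # inverse of a 2x2 matrix
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def kron(A, B):
    # Kronecker product: the multi-indices (i1, i2) and (j1, j2) are
    # flattened in the lexicographic order, as in the article
    n, m = len(A), len(B)
    return [[A[i1][j1] * B[i2][j2] for j1 in range(n) for j2 in range(m)]
            for i1 in range(n) for i2 in range(m)]

M1 = [[2.0, 1.0], [1.0, 1.0]]   # hypothetical transition matrix for V_1
M2 = [[1.0, 3.0], [0.0, 1.0]]   # hypothetical transition matrix for V_2

big = kron(inv2(M1), inv2(M2))  # transition of the dual standard basis
big_inv = kron(M1, M2)          # claimed inverse: product of the originals

P = matmul(big_inv, big)
# P should be the 4x4 identity, up to floating-point error
assert all(abs(P[i][j] - (1.0 if i == j else 0.0)) < 1e-12
           for i in range(4) for j in range(4))
```

The check relies on the mixed-product property of the Kronecker product, which is exactly the factor-by-factor contraction carried out in Steps 3 and 6 below.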


2: Proof


Whole Strategy: Step 1: see that the transition for the bases for $L(V_1, ..., V_k: F)$ holds; Step 2: see that $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ is a square matrix; Step 3: see that the inverse of $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ is as is claimed; Step 4: see that the transition for the bases for $V_1 \otimes ... \otimes V_k$ holds; Step 5: see that $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ is a square matrix; Step 6: see that the inverse of $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ is as is claimed; Step 7: see that also the components transitions are some square matrices.

Step 1:

$B^*$ and ${B'}^*$ are indeed some bases for $L(V_1, ..., V_k: F)$, by the proposition that for any field and any $k$ finite-dimensional vectors spaces over the field, the tensors space with respect to the field and the vectors spaces and the field has the basis that consists of the tensor products of the elements of the dual bases of any bases of the vectors spaces.

${b'}^{1, j_1} \otimes ... \otimes {b'}^{k, j_k} = {M_1^{-1}}_{l_1}{}^{j_1} ... {M_k^{-1}}_{l_k}{}^{j_k} b^{1, l_1} \otimes ... \otimes b^{k, l_k}$ holds, by the proposition that for the tensors space with respect to any field and any $k$ finite-dimensional vectors spaces over the field and the field, the transition of the standard bases with respect to any bases for the vector spaces is this.

Step 2:

$M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ may not look like any matrix unless $k = 1$, because it is a multi-dimensional array.

But the set of the combinations, $J := \{(j_1, ..., j_k) \vert 1 \le j_1 \le \dim V_1, ..., 1 \le j_k \le \dim V_k\}$, whose order is $\dim V_1 ... \dim V_k$, can be regarded as a single index set. And $J = \{(l_1, ..., l_k) \vert 1 \le l_1 \le \dim V_1, ..., 1 \le l_k \le \dim V_k\}$ can be regarded as a single index set.

So, $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ can be regarded as a $(\dim V_1 ... \dim V_k) \times (\dim V_1 ... \dim V_k)$ square matrix: the order of the index set, $J$, can be chosen arbitrarily, for example, $(1, ..., 1), (1, ..., 2), ..., (\dim V_1, ..., \dim V_k)$, which is the most natural one.
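The re-indexing in this step can be sketched in code. A minimal illustration, with hypothetical dimensions, of enumerating $J$ in the lexicographic order and of flattening a multi-dimensional array into a square matrix with that order on both axes:

```python
from itertools import product

dims = (2, 3)  # hypothetical dim V_1, dim V_2

# J enumerated in the lexicographic order (1, ..., 1), (1, ..., 2), ...
J = list(product(*(range(1, d + 1) for d in dims)))
assert len(J) == 2 * 3 and J[0] == (1, 1) and J[-1] == (2, 3)

# a 4-dimensional array indexed by (j1, j2, l1, l2), 0-based for convenience
M = [[[[j1 + 2 * j2 + 3 * l1 + 4 * l2 for l2 in range(dims[1])]
       for l1 in range(dims[0])] for j2 in range(dims[1])] for j1 in range(dims[0])]

# flattened into a 6 x 6 square matrix with the chosen order of J on both axes
flat = [[M[j1][j2][l1][l2] for (l1, l2) in product(range(dims[0]), range(dims[1]))]
        for (j1, j2) in product(range(dims[0]), range(dims[1]))]
assert len(flat) == 6 and all(len(row) == 6 for row in flat)
```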

Also each of ${b'}^{1, j_1} \otimes ... \otimes {b'}^{k, j_k}$ and $b^{1, l_1} \otimes ... \otimes b^{k, l_k}$ can be regarded as a column vector (a kind of matrix) with the chosen order of $J$.

Then, ${b'}^{1, j_1} \otimes ... \otimes {b'}^{k, j_k} = M_{l_1, ..., l_k}{}^{j_1, ..., j_k} b^{1, l_1} \otimes ... \otimes b^{k, l_k}$ is the usual multiplication of the square matrix and the column vector.

In fact, that is natural, because it is a transition of bases for a vectors space: although we denote the basis, $B^*$, as $\{b^{1, j_1} \otimes ... \otimes b^{k, j_k}\}$, just because that is convenient for clarifying what each element is, the basis can be denoted also like $\{e_1, ..., e_{\dim V_1 ... \dim V_k}\}$.

For any other matrix, $N_{j_1, ..., j_k}{}^{m_1, ..., m_k}$, with the chosen order of $J$, $N_{j_1, ..., j_k}{}^{m_1, ..., m_k} M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ is the usual multiplication of the square matrices.

Step 3:

The reason why we want to regard $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ as a square matrix is that we want to take the inverse of it, and the reason why we want to take the inverse is that the inverse represents the transition of the components, by the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this. It certainly has the inverse, because it is a transition of bases.

The inverse of $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ is the matrix, $N_{j_1, ..., j_k}{}^{m_1, ..., m_k}$, such that $N_{j_1, ..., j_k}{}^{m_1, ..., m_k} M_{l_1, ..., l_k}{}^{j_1, ..., j_k} = \delta_{l_1}{}^{m_1} ... \delta_{l_k}{}^{m_k}$: the product in the reverse order is automatically guaranteed to be $I$, because we know that $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ is invertible: from $N M = I$, $M N = M N M M^{-1} = M I M^{-1} = I$.
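The remark that $N M = I$ already forces $M N = I$ for square matrices can be illustrated with a toy pair (an arbitrary example, not tied to any particular basis transition):

```python
def matmul(A, B):
    # usual multiplication of nested-list square matrices
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

M = [[2.0, 1.0], [1.0, 1.0]]
N = [[1.0, -1.0], [-1.0, 2.0]]  # chosen so that N M = I

I2 = [[1.0, 0.0], [0.0, 1.0]]
assert matmul(N, M) == I2   # the defining one-sided condition
assert matmul(M, N) == I2   # the reverse order comes for free
```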

There is ${M_1}_{j_1}{}^{m_1} ... {M_k}_{j_k}{}^{m_k}$, which is a $(\dim V_1 ... \dim V_k) \times (\dim V_1 ... \dim V_k)$ matrix.

${M_1}_{j_1}{}^{m_1} ... {M_k}_{j_k}{}^{m_k} M_{l_1, ..., l_k}{}^{j_1, ..., j_k} = {M_1}_{j_1}{}^{m_1} ... {M_k}_{j_k}{}^{m_k} {M_1^{-1}}_{l_1}{}^{j_1} ... {M_k^{-1}}_{l_k}{}^{j_k} = {M_1}_{j_1}{}^{m_1} {M_1^{-1}}_{l_1}{}^{j_1} ... {M_k}_{j_k}{}^{m_k} {M_k^{-1}}_{l_k}{}^{j_k} = \delta_{l_1}{}^{m_1} ... \delta_{l_k}{}^{m_k}$, which means that ${M_1}_{j_1}{}^{m_1} ... {M_k}_{j_k}{}^{m_k}$ is the inverse of $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$.
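The index computation in this step can be replayed numerically. A sketch for $k = 2$, $\dim V_1 = \dim V_2 = 2$, with arbitrary invertible example matrices, contracting over the multi-index $(j_1, j_2)$ entry by entry:

```python
from itertools import product

def inv2(A):
    # inverse of a 2x2 matrix
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

M1 = [[2.0, 1.0], [1.0, 1.0]]   # example transition matrix for V_1
M2 = [[1.0, 3.0], [0.0, 1.0]]   # example transition matrix for V_2
N1, N2 = inv2(M1), inv2(M2)

# entry ((m1, m2), (l1, l2)) of the product, summing over the multi-index (j1, j2);
# the first axis of each nested list plays the lower index, the second the upper one
for (m1, m2), (l1, l2) in product(product(range(2), repeat=2), repeat=2):
    s = sum(M1[j1][m1] * M2[j2][m2] * N1[l1][j1] * N2[l2][j2]
            for (j1, j2) in product(range(2), repeat=2))
    delta = (1.0 if l1 == m1 else 0.0) * (1.0 if l2 == m2 else 0.0)
    assert abs(s - delta) < 1e-12   # the Kronecker delta, as claimed
```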

So, ${M^{-1}}_{l_1, ..., l_k}{}^{j_1, ..., j_k} = {M_1}_{l_1}{}^{j_1} ... {M_k}_{l_k}{}^{j_k}$.

${M^{-1}}_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ represents the transition of the tensor components, by the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this.

Step 4:

$B$ and $B'$ are indeed some bases for $V_1 \otimes ... \otimes V_k$, by the proposition that the tensor product of any $k$ finite-dimensional vectors spaces has the basis that consists of the classes induced by any basis elements.

$[((b'_{1, j_1}, ..., b'_{k, j_k}))] = [((b_{1, l_1}, ..., b_{k, l_k}))] {M_1}_{j_1}{}^{l_1} ... {M_k}_{j_k}{}^{l_k}$ holds, by the proposition that for the tensor product of any $k$ finite-dimensional vectors spaces over any field, the transition of the standard bases with respect to any bases for the vector spaces is this.

Step 5:

$M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ may not look like any matrix unless $k = 1$, because it is a multi-dimensional array.

But the set of the combinations, $J := \{(j_1, ..., j_k) \vert 1 \le j_1 \le \dim V_1, ..., 1 \le j_k \le \dim V_k\}$, whose order is $\dim V_1 ... \dim V_k$, can be regarded as a single index set. And $J = \{(l_1, ..., l_k) \vert 1 \le l_1 \le \dim V_1, ..., 1 \le l_k \le \dim V_k\}$ can be regarded as a single index set.

So, $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ can be regarded as a $(\dim V_1 ... \dim V_k) \times (\dim V_1 ... \dim V_k)$ square matrix: the order of the index set, $J$, can be chosen arbitrarily, for example, $(1, ..., 1), (1, ..., 2), ..., (\dim V_1, ..., \dim V_k)$, which is the most natural one.

Also each of $[((b'_{1, j_1}, ..., b'_{k, j_k}))]$ and $[((b_{1, l_1}, ..., b_{k, l_k}))]$ can be regarded as a row vector (a kind of matrix) with the chosen order of $J$.

Then, $[((b'_{1, j_1}, ..., b'_{k, j_k}))] = [((b_{1, l_1}, ..., b_{k, l_k}))] M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ is the usual multiplication of the row vector and the square matrix.

In fact, that is natural, because it is a transition of bases for a vectors space: although we denote the basis, $B$, as $\{[((b_{1, j_1}, ..., b_{k, j_k}))]\}$, just because that is convenient for clarifying what each element is, the basis can be denoted also like $\{e_1, ..., e_{\dim V_1 ... \dim V_k}\}$.

For any other matrix, $N_{l_1, ..., l_k}{}^{m_1, ..., m_k}$, with the chosen order of $J$, $N_{l_1, ..., l_k}{}^{m_1, ..., m_k} M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ is the usual multiplication of the square matrices.

Step 6:

The reason why we want to regard $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ as a square matrix is that we want to take the inverse of it, and the reason why we want to take the inverse is that the inverse represents the transition of the components. It certainly has the inverse, because it is a transition of bases.

The inverse of $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ is the matrix, $N_{l_1, ..., l_k}{}^{m_1, ..., m_k}$, such that $N_{l_1, ..., l_k}{}^{m_1, ..., m_k} M_{j_1, ..., j_k}{}^{l_1, ..., l_k} = \delta_{j_1}{}^{m_1} ... \delta_{j_k}{}^{m_k}$: the product in the reverse order is automatically guaranteed to be $I$, because we know that $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ is invertible.

There is ${M_1^{-1}}_{l_1}{}^{m_1} ... {M_k^{-1}}_{l_k}{}^{m_k}$, which is a $(\dim V_1 ... \dim V_k) \times (\dim V_1 ... \dim V_k)$ matrix.

${M_1^{-1}}_{l_1}{}^{m_1} ... {M_k^{-1}}_{l_k}{}^{m_k} M_{j_1, ..., j_k}{}^{l_1, ..., l_k} = {M_1^{-1}}_{l_1}{}^{m_1} ... {M_k^{-1}}_{l_k}{}^{m_k} {M_1}_{j_1}{}^{l_1} ... {M_k}_{j_k}{}^{l_k} = {M_1^{-1}}_{l_1}{}^{m_1} {M_1}_{j_1}{}^{l_1} ... {M_k^{-1}}_{l_k}{}^{m_k} {M_k}_{j_k}{}^{l_k} = \delta_{j_1}{}^{m_1} ... \delta_{j_k}{}^{m_k}$, which means that ${M_1^{-1}}_{l_1}{}^{m_1} ... {M_k^{-1}}_{l_k}{}^{m_k}$ is the inverse of $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$.

So, ${M^{-1}}_{j_1, ..., j_k}{}^{l_1, ..., l_k} = {M_1^{-1}}_{j_1}{}^{l_1} ... {M_k^{-1}}_{j_k}{}^{l_k}$.

${M^{-1}}_{j_1, ..., j_k}{}^{l_1, ..., l_k}$ represents the transition of the tensor components, by the proposition that for any finite-dimensional vectors space, the transition of the components of any vector with respect to any change of bases is this.

Step 7:

So, we have gotten the components transitions, ${M^{-1}}_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ and ${M^{-1}}_{j_1, ..., j_k}{}^{l_1, ..., l_k}$, which are likewise some square matrices, and the inverses are $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ and $M_{j_1, ..., j_k}{}^{l_1, ..., l_k}$.
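Step 7 can also be checked numerically. A small sketch for the tensor-product side with $k = 2$, $\dim V_1 = \dim V_2 = 2$ (the matrices are arbitrary invertible examples): components transformed with the product of the inverses reproduce the original vector when expanded against the transformed basis.

```python
from itertools import product

def inv2(A):
    # inverse of a 2x2 matrix
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def kron(A, B):
    # Kronecker product with the lexicographic multi-index order
    n, m = len(A), len(B)
    return [[A[i1][j1] * B[i2][j2] for j1 in range(n) for j2 in range(m)]
            for i1 in range(n) for i2 in range(m)]

M1 = [[2.0, 1.0], [1.0, 1.0]]
M2 = [[1.0, 3.0], [0.0, 1.0]]
K = kron(M1, M2)                  # basis transition: row J holds the old components of the J-th new basis element
Kinv = kron(inv2(M1), inv2(M2))   # claimed components transition: product of the inverses

c = [1.0, 2.0, 3.0, 4.0]          # components w.r.t. the old product basis (arbitrary)
# components w.r.t. the new basis, via the product of the inverses
cprime = [sum(c[L] * Kinv[L][J] for L in range(4)) for J in range(4)]
# expanding the new components against the new basis must give back the old components
recon = [sum(cprime[J] * K[J][L] for J in range(4)) for L in range(4)]
assert all(abs(recon[L] - c[L]) < 1e-12 for L in range(4))
```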


3: Note


We are talking about the transition of bases or the transition of components, not about the tensor components themselves: for example, for a tensor, $t \in L(V_1, ..., V_k, V_1^*, ..., V_k^*: F)$, $t$ can be expressed with the components with respect to a standard basis as $t_{l_1, ..., l_k}{}^{j_1, ..., j_k}$, which resembles $M_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ in form, but it is not so natural to regard it as a matrix. Thinking of $t(v_1, ..., v_k, v^1, ..., v^k) = t_{l_1, ..., l_k}{}^{j_1, ..., j_k} v_1{}^{j_1} ... v_k{}^{j_k} v^1{}_{l_1} ... v^k{}_{l_k}$, in order to regard it as the multiplication of a matrix and a column vector, the column vector would be like $(v_1{}^1 ... v_k{}^1 v^1{}_1 ... v^k{}_1, v_1{}^1 ... v_k{}^1 v^1{}_1 ... v^k{}_2, ..., v_1{}^{\dim V_1} ... v_k{}^{\dim V_k} v^1{}_{\dim V_1} ... v^k{}_{\dim V_k})^t$ instead of like $(v_1{}^1, ..., v_1{}^{\dim V_1}, ..., v_k{}^1, ..., v_k{}^{\dim V_k})$, which might not be particularly meaningful (if it is meaningful for your situation, of course, it is fine).

The reason why that might not be meaningful is that $t$ is not any linear map in general (refer to the proposition that a multilinear map is not necessarily linear): $t$ is $: V_1 \times ... \times V_k \times V_1^* \times ... \times V_k^* \to F$, a non-linear map from a $(2 (\dim V_1 + ... + \dim V_k))$-dimensional vectors space into $F$, and thinking of the $(\dim V_1 ... \dim V_k)^2 \times (\dim V_1 ... \dim V_k)^2$ matrix is not meaningful in general.
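To make the Note concrete: a sketch with $k = 2$, $\dim V_1 = \dim V_2 = 2$, and hypothetical components, showing that evaluating $t$ is a multi-index contraction, and that any "matrix" reading of the components is just a chosen flattening of that contraction against a combined column of products:

```python
from itertools import product

d = 2
# hypothetical components, keyed by the multi-index (l1, l2, j1, j2)
t = {idx: 0.1 * (1 + sum(idx)) for idx in product(range(d), repeat=4)}

v1, v2 = [1.0, 2.0], [3.0, -1.0]   # components of the vector arguments (arbitrary)
w1, w2 = [0.5, 1.5], [2.0, 1.0]    # components of the covector arguments (arbitrary)

# direct multi-index evaluation of t(v_1, v_2, v^1, v^2)
val = sum(t[(l1, l2, j1, j2)] * v1[j1] * v2[j2] * w1[l1] * w2[l2]
          for (l1, l2, j1, j2) in product(range(d), repeat=4))

# the same number as a 1 x 16 "matrix" times the combined column of products:
# the flattening works, but the long column is specific to these arguments
row = [t[idx] for idx in product(range(d), repeat=4)]
col = [v1[j1] * v2[j2] * w1[l1] * w2[l2]
       for (l1, l2, j1, j2) in product(range(d), repeat=4)]
assert abs(val - sum(r * x for r, x in zip(row, col))) < 1e-9
```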

Of course, as any matrix is just an arrangement of some ring elements, you can always regard $t_{l_1, ..., l_k}{}^{j_1, ..., j_k}$ as a matrix, if you want to.


References


<The previous article in this series | The table of contents of this series | The next article in this series>