2026-01-18

1567: For Finite-Dimensional Vectors Space with Inner Product, Components Matrix of Inner Product w.r.t. Basis Is Invertible

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of that for finite-dimensional vectors space with inner product, components matrix of inner product w.r.t. basis is invertible

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any finite-dimensional vectors space with any inner product, the components matrix of the inner product with respect to any basis is invertible.

Orientation


There is a list of definitions discussed so far on this site.

There is a list of propositions discussed so far on this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\mathbb{R}, \mathbb{C}\}\), with the canonical field structure
\(d\): \(\in \mathbb{N} \setminus \{0\}\)
\(V\): \(\in \{\text{ the } d \text{ -dimensional } F \text{ vectors spaces }\}\), with any inner product, \(\langle \bullet, \bullet \rangle\)
\(B\): \(\in \{\text{ the bases for } V\}\), \(= \{b_1, ..., b_d\}\)
\(M\): \(\in \{\text{ the } d \times d \, F \text{ matrices }\}\), such that \(M^j_l = \langle b_l, b_j \rangle\)
//

Statements:
\(\det M \neq 0\)
//
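
Before the proof, here is a minimal numerical sketch of the statement (not part of the formal argument below), assuming \(F = \mathbb{C}\), \(d = 2\), and \(V = \mathbb{C}^2\) with the standard inner product, taken to be linear in the 1st argument and conjugate-linear in the 2nd; the particular basis and the helper name `inner` are arbitrary choices for illustration.

```python
import numpy as np

# A hypothetical (non-orthonormal) basis of C^2; any basis would do.
b = [np.array([1.0 + 0.0j, 2.0 - 1.0j]),
     np.array([0.0 + 1.0j, 3.0 + 0.0j])]
d = len(b)

def inner(x, y):
    # Standard inner product on C^d: linear in the 1st argument,
    # conjugate-linear in the 2nd.
    return np.vdot(y, x)

# Components matrix: M^j_l = <b_l, b_j> (j = row index, l = column index).
M = np.array([[inner(b[l], b[j]) for l in range(d)] for j in range(d)])

print(np.linalg.det(M))  # about 8 for this basis; in particular, nonzero
```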


2: Proof


Whole Strategy: Step 1: take the Gram-Schmidt orthonormalization of \(B\), \(B' = \{b'_1, ..., b'_d\}\), and see that the components matrix \(M'\) of the inner product with respect to \(B'\) is \(I\); Step 2: see that \(b_j = b'_l N^l_j\) where \(\det N \neq 0\); Step 3: see that \(M = N^* M' N\); Step 4: see that \(\det M = \det (N^* M' N) \neq 0\).

Step 1:

Let us take the Gram-Schmidt orthonormalization of \(B\), \(B' = \{b'_1, ..., b'_d\}\).

\(B'\) is a basis for \(V\), by the proposition that for any vectors space with any inner product, any orthonormal subset is linearly independent and the proposition that for any finite-dimensional vectors space, any linearly independent subset with dimension number of elements is a basis.

Let \(M'\) be the \(d \times d\) \(F\) matrix such that \(M'^j_l = \langle b'_l, b'_j \rangle\).

\(M' = I\), because \(B'\) is orthonormal.
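A small sketch of Step 1 under the same illustrative assumptions as before (\(\mathbb{C}^2\) with the standard inner product, linear in the 1st argument); the classical Gram-Schmidt loop below is one possible realization of the orthonormalization, and the names `b`, `b_prime`, and `inner` are hypothetical.

```python
import numpy as np

# The same hypothetical basis of C^2 as in the earlier sketch.
b = [np.array([1.0 + 0.0j, 2.0 - 1.0j]),
     np.array([0.0 + 1.0j, 3.0 + 0.0j])]
d = len(b)

def inner(x, y):
    return np.vdot(y, x)  # linear in x, conjugate-linear in y

# Classical Gram-Schmidt orthonormalization of B into B'.
b_prime = []
for v in b:
    for u in b_prime:
        v = v - inner(v, u) * u                   # remove the component along u
    b_prime.append(v / np.sqrt(inner(v, v).real))  # normalize

# M'^j_l = <b'_l, b'_j> should be the identity matrix.
M_prime = np.array([[inner(b_prime[l], b_prime[j]) for l in range(d)]
                    for j in range(d)])
print(np.allclose(M_prime, np.eye(d)))  # True
```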

Step 2:

For each \(j \in \{1, ..., d\}\), \(b_j = b'_l N^l_j\) where \(N^l_j \in F\), because \(B'\) is a basis.

\(\det N \neq 0\), because \(N\) maps the linearly independent \(B'\) to the linearly independent \(B\), by the proposition that for any vectors space over any field and any square matrix over the field with dimension equal to or smaller than the dimension of the vectors space, the matrix is invertible if it maps a linearly-independent set of vectors to a linearly-independent set of vectors, and if the matrix is invertible, it maps any linearly-independent set of vectors to a linearly-independent set of vectors.
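A sketch of Step 2 under the same toy assumptions: \(N\) is recovered by solving \(b_j = b'_l N^l_j\) for the coordinates, so that column \(j\) of `N` holds \(N^1_j, ..., N^d_j\); `Q` and `Q_prime` are hypothetical names for the matrices whose columns are the coordinate vectors of the \(b_j\)s and the \(b'_l\)s.

```python
import numpy as np

b = [np.array([1.0 + 0.0j, 2.0 - 1.0j]),
     np.array([0.0 + 1.0j, 3.0 + 0.0j])]
d = len(b)

def inner(x, y):
    return np.vdot(y, x)

# Gram-Schmidt orthonormalization, as in Step 1.
b_prime = []
for v in b:
    for u in b_prime:
        v = v - inner(v, u) * u
    b_prime.append(v / np.sqrt(inner(v, v).real))

# b_j = b'_l N^l_j reads column-wise as Q = Q' N, so N = Q'^{-1} Q.
Q = np.column_stack(b)
Q_prime = np.column_stack(b_prime)
N = np.linalg.solve(Q_prime, Q)

print(np.linalg.det(N))  # nonzero: N is invertible
```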

Step 3:

\(M^j_l = \langle b_l, b_j \rangle = \langle b'_m N^m_l, b'_n N^n_j \rangle = \overline{N^n_j} \langle b'_m, b'_n \rangle N^m_l = \overline{N^n_j} M'^n_m N^m_l = (N^* M' N)^j_l\), by the linearity of the inner product in the 1st argument and the conjugate-linearity in the 2nd argument, so, \(M = N^* M' N\).
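Continuing the same toy sketch, the identity of Step 3 can be checked numerically: the components matrix built from \(B\) coincides with \(N^* M' N\), where \(N^*\) is the conjugate transpose of \(N\).

```python
import numpy as np

b = [np.array([1.0 + 0.0j, 2.0 - 1.0j]),
     np.array([0.0 + 1.0j, 3.0 + 0.0j])]
d = len(b)

def inner(x, y):
    return np.vdot(y, x)

# B' and N as in Steps 1 and 2.
b_prime = []
for v in b:
    for u in b_prime:
        v = v - inner(v, u) * u
    b_prime.append(v / np.sqrt(inner(v, v).real))
N = np.linalg.solve(np.column_stack(b_prime), np.column_stack(b))

M = np.array([[inner(b[l], b[j]) for l in range(d)] for j in range(d)])
M_prime = np.array([[inner(b_prime[l], b_prime[j]) for l in range(d)]
                    for j in range(d)])

print(np.allclose(M, N.conj().T @ M_prime @ N))  # True: M = N^* M' N
```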

Step 4:

\(\det M = \det (N^* M' N) = \det N^* \det M' \det N \neq 0\), because \(\det N^* = \overline{\det N} \neq 0\), \(\det M' = \det I = 1 \neq 0\), and \(\det N \neq 0\).
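And a numerical echo of Step 4 on the same toy example: as \(M' = I\), \(\det M = \overline{\det N} \cdot 1 \cdot \det N = \vert \det N \vert^2\), a positive real number.

```python
import numpy as np

b = [np.array([1.0 + 0.0j, 2.0 - 1.0j]),
     np.array([0.0 + 1.0j, 3.0 + 0.0j])]
d = len(b)

def inner(x, y):
    return np.vdot(y, x)

b_prime = []
for v in b:
    for u in b_prime:
        v = v - inner(v, u) * u
    b_prime.append(v / np.sqrt(inner(v, v).real))

M = np.array([[inner(b[l], b[j]) for l in range(d)] for j in range(d)])
N = np.linalg.solve(np.column_stack(b_prime), np.column_stack(b))

det_M = np.linalg.det(M)
det_N = np.linalg.det(N)
print(det_M)                                      # nonzero (about 8 here)
print(np.isclose(det_M, np.conj(det_N) * det_N))  # True: det M = |det N|^2
```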


References


<The previous article in this series | The table of contents of this series | The next article in this series>