2025-08-03

1227: For Vectors Space and Square Matrix over Field with Dimension Equal to or Smaller Than Dimension of Vectors Space, Matrix Is Invertible iff It Maps Linearly-Independent Set of Vectors to Linearly-Independent Set of Vectors

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of that for vectors space and square matrix over field with dimension equal to or smaller than dimension of vectors space, matrix is invertible iff it maps linearly-independent set of vectors to linearly-independent set of vectors

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any vectors space over any field and any square matrix over the field with dimension equal to or smaller than the dimension of the vectors space, the matrix is invertible if it maps a linearly-independent set of vectors to a linearly-independent set of vectors, and if the matrix is invertible, it maps any linearly-independent set of vectors to a linearly-independent set of vectors.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(V\): \(\in \{\text{ the } F \text{ vectors spaces }\}\)
\(n\): \(\in \mathbb{N} \setminus \{0\}\), where \(n\) is equal to or smaller than the dimension of \(V\)
\(M\): \(\in \{\text{ the } n \times n F \text{ matrices }\}\)
//

Statements:
(
\(det M \neq 0\)
\(\implies\)
\(\forall \{v_1, ..., v_n\} \in \{\text{ the linearly-independent sets of } V\} (M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix} \in \{\text{ the linearly-independent sets of } V\})\)
)
\(\land\)
(
\(\exists \{v_1, ..., v_n\} \in \{\text{ the linearly-independent sets of } V\} (M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix} \in \{\text{ the linearly-independent sets of } V\})\)
\(\implies\)
\(det M \neq 0\)
)
//

\(V\) can be infinite-dimensional (in that case, \(n\) can be any element of \(\mathbb{N} \setminus \{0\}\)).
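
As an illustration of the statements (an aside, not part of the formal content), here is a minimal numerical sketch under the assumptions \(F = \mathbb{R}\), \(V = \mathbb{R}^3\), and \(n = 2\); the concrete matrices and vectors, and the use of numpy, are arbitrary choices made here only for the check.

```python
# A minimal numerical sketch of the statements, assuming F = R, V = R^3, n = 2.
# The concrete matrices and vectors are arbitrary illustrative choices.
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
vs = np.array([v1, v2])  # rows are v_1, v_2: a linearly-independent set of R^3

def is_linearly_independent(rows):
    # Over R, a finite list of vectors is linearly independent
    # if and only if the matrix whose rows are those vectors has full row rank.
    return np.linalg.matrix_rank(rows) == len(rows)

M_invertible = np.array([[1.0, 2.0],
                         [3.0, 4.0]])  # det = -2, nonzero
M_singular = np.array([[1.0, 2.0],
                       [2.0, 4.0]])    # det = 0

print(is_linearly_independent(vs))                 # True
print(is_linearly_independent(M_invertible @ vs))  # True: the images stay linearly independent
print(is_linearly_independent(M_singular @ vs))    # False: the images become linearly dependent
```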


2: Proof


Whole Strategy: Step 1: suppose that \(det M \neq 0\), take any linearly-independent \(\{v_1, ..., v_n\}\), and see that \(M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix}\) is linearly-independent; Step 2: suppose that there is a linearly-independent \(\{v_1, ..., v_n\}\) such that \(M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix}\) is linearly-independent, and see that \(det M \neq 0\).

Step 1:

Let us suppose that \(det M \neq 0\).

Let \(\{v_1, ..., v_n\}\) be any linearly-independent set.

Let us see that \(M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix}\) is linearly-independent.

Let \(c_1 \sum_{j_1} M^1_{j_1} v_{j_1} + ... + c_n \sum_{j_n} M^n_{j_n} v_{j_n} = 0\).

That is \(\sum_{j_1} c_{j_1} M^{j_1}_1 v_1 + ... + \sum_{j_n} c_{j_n} M^{j_n}_n v_n = 0\).

So, \(\begin{pmatrix} c_1 & ... & c_n \end{pmatrix} M = \begin{pmatrix} 0 & ... & 0 \end{pmatrix}\), because \(\{v_1, ..., v_n\}\) is linearly-independent.
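
As a concrete illustration of this rearrangement (an aside, not part of the proof), for \(n = 2\), \(c_1 (M^1_1 v_1 + M^1_2 v_2) + c_2 (M^2_1 v_1 + M^2_2 v_2) = (c_1 M^1_1 + c_2 M^2_1) v_1 + (c_1 M^1_2 + c_2 M^2_2) v_2\), and the linear independence of \(\{v_1, v_2\}\) forces \(c_1 M^1_1 + c_2 M^2_1 = 0\) and \(c_1 M^1_2 + c_2 M^2_2 = 0\), which is nothing but \(\begin{pmatrix} c_1 & c_2 \end{pmatrix} M = \begin{pmatrix} 0 & 0 \end{pmatrix}\).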

As \(det M \neq 0\), \(M^{-1}\) exists, by the proposition that over any field, any square matrix has the inverse if and only if its determinant is nonzero, and the inverse is this.
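
For example (just as an illustration), for \(n = 2\), \(M^{-1} = \frac{1}{det M} \begin{pmatrix} M^2_2 & - M^1_2 \\ - M^2_1 & M^1_1 \end{pmatrix}\), where \(det M = M^1_1 M^2_2 - M^1_2 M^2_1\).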

So, \(\begin{pmatrix} c_1 & ... & c_n \end{pmatrix} M M^{-1} = \begin{pmatrix} 0 & ... & 0 \end{pmatrix} M^{-1}\), but the left hand side is \(\begin{pmatrix} c_1 & ... & c_n \end{pmatrix} I = \begin{pmatrix} c_1 & ... & c_n \end{pmatrix}\) and the right hand side is \(\begin{pmatrix} 0 & ... & 0 \end{pmatrix}\).

So, \(\begin{pmatrix} c_1 & ... & c_n \end{pmatrix} = \begin{pmatrix} 0 & ... & 0 \end{pmatrix}\).

That means that \(M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix}\) is linearly-independent.

Step 2:

Let us suppose that there is a linearly-independent \(\{v_1, ..., v_n\}\) such that \(M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix}\) is linearly-independent.

Let \(c_1 \sum_{j_1} M^1_{j_1} v_{j_1} + ... + c_n \sum_{j_n} M^n_{j_n} v_{j_n} = 0\).

That is \(\sum_{j_1} c_{j_1} M^{j_1}_1 v_1 + ... + \sum_{j_n} c_{j_n} M^{j_n}_n v_n = 0\).

So, \(\begin{pmatrix} c_1 & ... & c_n \end{pmatrix} M = \begin{pmatrix} 0 & ... & 0 \end{pmatrix}\), because \(\{v_1, ..., v_n\}\) is linearly-independent.

That is \(M^t \begin{pmatrix} c_1 \\ ... \\ c_n \end{pmatrix} = \begin{pmatrix} 0 \\ ... \\ 0 \end{pmatrix}\).
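
In fact, the \(k\)-th component of \(\begin{pmatrix} c_1 & ... & c_n \end{pmatrix} M\) is \(\sum_j c_j M^j_k\), while the \(k\)-th component of \(M^t \begin{pmatrix} c_1 \\ ... \\ c_n \end{pmatrix}\) is \(\sum_j (M^t)^k_j c_j = \sum_j M^j_k c_j\), which is the same.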

Let us suppose that \(det M^t = 0\).

Then, the rank of \(M^t\) would be smaller than \(n\).

By Cramer's rule for any system of linear equations, as \((c_1, ..., c_n) = (0, ..., 0)\) is certainly a solution and the rank of \(M^t\) would be smaller than \(n\), some \(c_j\) would be able to be taken arbitrarily, so, there would be a nonzero solution, \((c_1, ..., c_n)\).

Such a nonzero \((c_1, ..., c_n)\) would satisfy \(M^t \begin{pmatrix} c_1 \\ ... \\ c_n \end{pmatrix} = \begin{pmatrix} 0 \\ ... \\ 0 \end{pmatrix}\), so, would satisfy \(\begin{pmatrix} c_1 & ... & c_n \end{pmatrix} M = \begin{pmatrix} 0 & ... & 0 \end{pmatrix}\), so, would satisfy \(\sum_{j_1} c_{j_1} M^{j_1}_1 v_1 + ... + \sum_{j_n} c_{j_n} M^{j_n}_n v_n = 0\), which is nothing but \(c_1 \sum_{j_1} M^1_{j_1} v_{j_1} + ... + c_n \sum_{j_n} M^n_{j_n} v_{j_n} = 0\).

That would mean that \(c_1 \sum_{j_1} M^1_{j_1} v_{j_1} + ... + c_n \sum_{j_n} M^n_{j_n} v_{j_n} = 0\) would hold for a nonzero \((c_1, ..., c_n)\), a contradiction against the fact that \(M \begin{pmatrix} v_1 \\ ... \\ v_n \end{pmatrix}\) is linearly-independent.
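
For example (an illustration with \(F = \mathbb{R}\), not part of the proof), if \(n = 2\) and \(M = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\), \(det M^t = 0\) and \((c_1, c_2) = (-2, 1)\) is such a nonzero solution: \(\begin{pmatrix} -2 & 1 \end{pmatrix} M = \begin{pmatrix} 0 & 0 \end{pmatrix}\), and \(- 2 (M^1_1 v_1 + M^1_2 v_2) + 1 (M^2_1 v_1 + M^2_2 v_2) = - 2 (v_1 + 2 v_2) + (2 v_1 + 4 v_2) = 0\), so, \(M \begin{pmatrix} v_1 \\ v_2 \end{pmatrix}\) would not be linearly-independent.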

So, \(det M^t \neq 0\), and as \(det M = det M^t\), \(det M \neq 0\).
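
To mirror Step 2 numerically (a sketch under the assumption \(F = \mathbb{R}\), reusing the illustrative matrix and vectors from the sketch above; the left null vector is read off by hand), a nonzero \((c_1, c_2)\) with \(\begin{pmatrix} c_1 & c_2 \end{pmatrix} M = \begin{pmatrix} 0 & 0 \end{pmatrix}\) indeed annihilates the mapped vectors.

```python
# A numerical sketch mirroring Step 2, assuming F = R; the concrete M, v_1, v_2
# are the same kind of arbitrary illustrative choices as in the earlier sketch.
import numpy as np

M = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # det M = 0
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
vs = np.array([v1, v2])

c = np.array([-2.0, 1.0])  # a nonzero left null vector of M, read off by hand
print(c @ M)       # [0. 0.]
mapped = M @ vs    # rows are M^1_1 v_1 + M^1_2 v_2 and M^2_1 v_1 + M^2_2 v_2
print(c @ mapped)  # [0. 0. 0.]: a nonzero combination of the mapped vectors vanishes,
                   # so the mapped set is not linearly-independent
```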


References


<The previous article in this series | The table of contents of this series | The next article in this series>