description/proof of that for any n x n matrix, if there are m rows with more than n - m same columns 0, the matrix is not invertible
Topics
About: matrix
The table of contents of this article
Starting Context
- The reader knows a definition of matrices space over a field.
- The reader admits the proposition that any square matrix is invertible if and only if its determinant is nonzero.
- The reader admits the proposition that for any finite-dimensional vectors space, there is no linearly independent subset that has more than the dimension number of elements.
Target Context
- The reader will have a description and a proof of the proposition that for any n x n matrix, if there are any m rows with more than any n - m same columns 0, the matrix is not invertible.
Orientation
There is a list of definitions discussed so far in this site.
There is a list of propositions discussed so far in this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(M\): \(\in \{\text{ the } n \times n F \text{ matrices }\}\)
//
Statements:
\(\exists \{j_1, ..., j_m\} \subseteq \{1, ..., n\}, \exists \{k_1, ..., k_l\} \subseteq \{1, ..., n\} \text{ such that } n - m \lt l (\forall j_p \in \{j_1, ..., j_m\}, \forall k_q \in \{k_1, ..., k_l\} (M^{j_p}_{k_q} = 0))\)
\(\implies\)
\(M \notin \{\text{ the invertible matrices }\}\)
//
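As a supplementary illustration outside the site's formalism, here is a minimal Python sketch, assuming NumPy is available, that tests whether a given square matrix satisfies the hypothesis above; the function name has_singular_zero_pattern is a hypothetical choice for this sketch.

```python
import numpy as np
from itertools import combinations

def has_singular_zero_pattern(M):
    """Return True if some m rows of the square matrix M are 0 in more than
    n - m common columns (the hypothesis of the proposition)."""
    n = M.shape[0]
    for m in range(1, n + 1):
        for rows in combinations(range(n), m):
            # count the columns in which all the chosen rows are 0
            common_zero_columns = int(np.sum(np.all(M[list(rows), :] == 0, axis=0)))
            if common_zero_columns > n - m:
                return True
    return False
```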
2: Note
For example, \(\begin{pmatrix} M^1_1 & 0 & 0 \\ M^2_1 & 0 & 0 \\ M^3_1 & M^3_2 & M^3_3 \end{pmatrix}\) is not invertible: \(\{j_1, j_2\} = \{1, 2\}\) and \(\{k_1, k_2\} = \{2, 3\}\) with \(n - m = 3 - 2 = 1 \lt 2 = l\).
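For a quick numerical check of this example, not part of the formal argument, the following sketch, assuming NumPy, fills the unspecified entries with arbitrary sample values and computes the determinant.

```python
import numpy as np

# The example 3 x 3 matrix: rows 1 and 2 (m = 2) are 0 in columns 2 and 3
# (l = 2 > n - m = 1); the nonzero entries are arbitrary sample values.
M = np.array([
    [4.0, 0.0, 0.0],
    [7.0, 0.0, 0.0],
    [1.0, 5.0, 9.0],
])

print(np.linalg.det(M))  # 0.0 (up to floating-point error): M is not invertible
```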
This proposition directly talks only about "any m rows with more than any n - m same columns 0", but of course, it holds also for 'any m columns with more than any n - m same rows 0': we can transpose the matrix, apply this proposition to the transposed matrix, and transpose it back to the original matrix, and transposing does not change whether the matrix is invertible.
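The following sketch, again assuming NumPy, illustrates the transposed ('columns') version on the transpose of the example matrix; transposing does not change the determinant.

```python
import numpy as np

# The transposed pattern: columns 1 and 2 (m = 2) are 0 in rows 2 and 3.
M = np.array([
    [4.0, 7.0, 1.0],
    [0.0, 0.0, 5.0],
    [0.0, 0.0, 9.0],
])

print(np.linalg.det(M), np.linalg.det(M.T))  # both 0.0: not invertible either way
```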
3: Proof
Whole Strategy: Step 1: see that any square matrix is invertible if and only if the determinant is nonzero; Step 2: see that the \(m\) rows are linearly dependent; Step 3: conclude the proposition.
Step 1:
Any square matrix is invertible if and only if the determinant is nonzero, by the proposition that any square matrix is invertible if and only if its determinant is nonzero.
So, all we need to check is that \(\det M = 0\).
Step 2:
Let us see that the \(m\) rows are linearly dependent.
The condition means that the \(m\) rows can have nonzero components only in the columns outside \(\{k_1, ..., k_l\}\), and there are fewer than \(n - (n - m) = m\) such columns, because \(l \gt n - m\).
Let the number of those columns be \(s\), where \(s \lt m\).
The dimension of the vectors space spanned by the \(m\) rows is at most \(s\), because each of the rows is contained in the span of the \(s\) standard basis vectors that correspond to those columns.
The \(m\) rows cannot be linearly independent, by the proposition that for any finite-dimensional vectors space, there is no linearly independent subset that has more than the dimension number of elements.
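As a numerical illustration of Step 2, assuming NumPy, the dimension of the span of the \(m\) rows of the example matrix can be checked with matrix_rank.

```python
import numpy as np

# The m = 2 rows of the example matrix that are 0 in columns 2 and 3:
rows = np.array([
    [4.0, 0.0, 0.0],
    [7.0, 0.0, 0.0],
])

# Only s = 1 column can be nonzero, so the span has dimension at most 1,
# and the 2 rows cannot be linearly independent.
print(np.linalg.matrix_rank(rows))  # 1
```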
Step 3:
As the matrix has some linearly dependent set of rows, one of those rows is a linear combination of the other rows in the set; subtracting that linear combination from the row does not change the determinant but produces a \(0\) row, so the determinant is \(0\).
So, \(M\) is not invertible.
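Finally, here is an optional randomized sanity check of the whole proposition, assuming NumPy; the choices of \(n\), \(m\), and the index sets are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
zero_rows = [0, 1, 2]       # the m rows
zero_columns = [2, 3, 4]    # l = 3 > n - m = 2 common zero columns

for _ in range(100):
    M = rng.normal(size=(n, n))
    for j in zero_rows:
        for k in zero_columns:
            M[j, k] = 0.0
    assert abs(np.linalg.det(M)) < 1e-9  # numerically singular every time

print("all sampled matrices with the zero pattern have determinant 0")
```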