2025-12-14

1490: For Square Matrix over Field with \(2\) Square Diagonal Blocks with \(1\) of Rest Blocks \(0\), Determinant Is Nonzero iff Determinants of Diagonal Blocks Are Nonzero

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of that for square matrix over field with \(2\) square diagonal blocks with \(1\) of rest blocks \(0\), determinant is nonzero iff determinants of diagonal blocks are nonzero

Topics


About: matrices space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any square matrix over any field with any \(2\) square diagonal blocks with \(1\) of the rest blocks \(0\), the determinant is nonzero if and only if the determinants of the diagonal blocks are nonzero.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(M\): \(\in \{\text{ the } n \times n \text{ matrices over } F\}\), \(= \begin{pmatrix} M^1_1 & M^1_2 \\ M^2_1 & M^2_2 \end{pmatrix}\) where \(M^1_1 \in \{\text{ the } n_1 \times n_1 \text{ matrices over } F\}\) and \(M^2_2 \in \{\text{ the } n_2 \times n_2 \text{ matrices over } F\}\) and \(M^2_1 = 0\) or \(M^1_2 = 0\)
//

Statements:
\(det M \neq 0\)
\(\iff\)
\(det M^1_1 \neq 0 \land det M^2_2 \neq 0\)
//
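As a quick numeric sketch of the statement (the matrix entries below are hypothetical example values, not part of the proposition), we can compute exact determinants over \(\mathbb{Q}\) with a Laplace expansion and check that \(det M\), \(det M^1_1\), and \(det M^2_2\) are simultaneously nonzero when \(M^2_1 = 0\):

```python
from fractions import Fraction

def det(m):
    """Determinant by Laplace expansion along the first row (exact over Fraction)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = Fraction(0)
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det(minor)
    return total

F = Fraction
# Hypothetical example with n1 = 2, n2 = 1, and the lower-left block M^2_1 = 0.
M11 = [[F(1), F(2)], [F(3), F(4)]]   # diagonal block, det = -2 (nonzero)
M22 = [[F(7)]]                       # diagonal block, det = 7 (nonzero)
M = [[F(1), F(2), F(5)],
     [F(3), F(4), F(6)],
     [F(0), F(0), F(7)]]             # M^1_2 = ((5), (6)) is arbitrary

assert det(M) != 0 and det(M11) != 0 and det(M22) != 0
```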


2: Note


This proposition can be applied iteratively, meaning that when \(M^2_2\) is in the shape of \(M\), this proposition can be applied to \(M^2_2\), and so on.
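The iterative application can be sketched numerically (a hypothetical integer example): below, \(M\) is \(3 \times 3\) with \(n_1 = 1\) and \(n_2 = 2\), and \(M^2_2\) is itself in the shape of \(M\), so the proposition applies twice.

```python
# M is block upper triangular with n1 = 1, n2 = 2, and M^2_2 is again
# block upper triangular, so the proposition can be applied iteratively.
M = [[2, 5, 7],
     [0, 3, 1],
     [0, 0, 4]]
M11 = [[2]]                      # 1x1 diagonal block
M22 = [[3, 1], [0, 4]]           # 2x2 diagonal block, itself in the shape of M

# 3x3 determinant by cofactor expansion along the first row
det_M = (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
         - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
         + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
# 1st application: det M != 0 iff det M11 != 0 and det M22 != 0.
# 2nd application (to M22): det M22 != 0 iff 3 != 0 and 4 != 0.
assert det_M != 0
```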

Without the requirement, \(M^2_1 = 0\) or \(M^1_2 = 0\), the proposition does not hold: for example, for \(M := \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\) with \(n_1 = 1\) and \(n_2 = 1\), \(det M \neq 0\) but \(det M^1_1 = 0\) and \(det M^2_2 = 0\).
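The counterexample above can be checked directly (a minimal sketch; the \(2 \times 2\) determinant formula is the only ingredient):

```python
# Without M^2_1 = 0 or M^1_2 = 0, the equivalence fails: the swap matrix
# ((0, 1), (1, 0)) has nonzero determinant but zero 1x1 diagonal blocks.
def det2(a, b, c, d):
    # determinant of the 2x2 matrix ((a, b), (c, d))
    return a * d - b * c

M = ((0, 1), (1, 0))
det_M = det2(M[0][0], M[0][1], M[1][0], M[1][1])   # -1, nonzero
det_M11 = M[0][0]                                  # the 1x1 block (0)
det_M22 = M[1][1]                                  # the 1x1 block (0)
assert det_M != 0 and det_M11 == 0 and det_M22 == 0
```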


3: Proof


Whole Strategy: apply the proposition that the determinant of any square matrix over any field is nonzero if and only if the set of the columns or the rows is linearly independent; Step 1: suppose that \(M^2_1 = 0\); Step 2: suppose that \(det M \neq 0\); Step 3: see that \(det M^1_1 \neq 0\) and \(det M^2_2 \neq 0\); Step 4: suppose that \(det M^1_1 \neq 0\) and \(det M^2_2 \neq 0\); Step 5: see that \(det M \neq 0\); Step 6: conclude the proposition for the \(M^1_2 = 0\) case.

Step 1:

Let us suppose that \(M^2_1 = 0\).

Step 2:

Let us suppose that \(det M \neq 0\).

Step 3:

By the proposition that the determinant of any square matrix over any field is nonzero if and only if the set of the columns or the rows is linearly independent, the columns of \(M\) are linearly independent.

So, in particular, the left \(n_1\) columns of \(M\) are linearly independent.

So, the columns of \(M^1_1\) are linearly independent: denoting the \((j, l)\) component of \(M^1_1\) as \({M^1_1}^j_l\), if \(r_1 \begin{pmatrix} {M^1_1}^1_1 \\ ... \\ {M^1_1}^{n_1}_1 \end{pmatrix} + ... + r_{n_1} \begin{pmatrix} {M^1_1}^1_{n_1} \\ ... \\ {M^1_1}^{n_1}_{n_1} \end{pmatrix} = 0\) for some nonzero \((r_1, ..., r_{n_1})\), then \(r_1 \begin{pmatrix} {M^1_1}^1_1 \\ ... \\ {M^1_1}^{n_1}_1 \\ 0 \\ ... \\ 0 \end{pmatrix} + ... + r_{n_1} \begin{pmatrix} {M^1_1}^1_{n_1} \\ ... \\ {M^1_1}^{n_1}_{n_1} \\ 0 \\ ... \\ 0 \end{pmatrix} = 0\) with the same nonzero \((r_1, ..., r_{n_1})\), contradicting the linear independence of the left \(n_1\) columns of \(M\).

So, \(det M^1_1 \neq 0\) by the proposition that the determinant of any square matrix over any field is nonzero if and only if the set of the columns or the rows is linearly independent.

Likewise, the rows of \(M\) are linearly independent.

So, in particular, the bottom \(n_2\) rows of \(M\) are linearly independent.

So, the rows of \(M^2_2\) are linearly independent: denoting the \((j, l)\) component of \(M^2_2\) as \({M^2_2}^j_l\), if \(r_1 \begin{pmatrix} {M^2_2}^1_1 & ... & {M^2_2}^1_{n_2} \end{pmatrix} + ... + r_{n_2} \begin{pmatrix} {M^2_2}^{n_2}_1 & ... & {M^2_2}^{n_2}_{n_2} \end{pmatrix} = 0\) for some nonzero \((r_1, ..., r_{n_2})\), then \(r_1 \begin{pmatrix} 0 & ... & 0 & {M^2_2}^1_1 & ... & {M^2_2}^1_{n_2} \end{pmatrix} + ... + r_{n_2} \begin{pmatrix} 0 & ... & 0 & {M^2_2}^{n_2}_1 & ... & {M^2_2}^{n_2}_{n_2} \end{pmatrix} = 0\) with the same nonzero \((r_1, ..., r_{n_2})\), contradicting the linear independence of the bottom \(n_2\) rows of \(M\).

So, \(det M^2_2 \neq 0\), as before.
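The contrapositive content of this step can be sketched numerically (the entries are hypothetical example values): when \(det M^1_1 = 0\), a dependency among the columns of \(M^1_1\) survives the zero padding, forcing \(det M = 0\).

```python
# Step 3 contrapositive sketch: n1 = 2, n2 = 1, M^2_1 = 0, and M^1_1 singular.
def det3(m):
    # determinant of a 3x3 matrix (nested lists) by cofactor expansion
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

M11 = [[1, 2], [2, 4]]           # column 2 = 2 * column 1, so det M^1_1 = 0
M = [[1, 2, 5],
     [2, 4, 6],
     [0, 0, 7]]                  # M^2_1 = 0, M^2_2 = (7)

assert M11[0][0] * M11[1][1] - M11[0][1] * M11[1][0] == 0   # det M^1_1 = 0
assert det3(M) == 0                                          # hence det M = 0
```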

Step 4:

Let us suppose that \(det M^1_1 \neq 0\) and \(det M^2_2 \neq 0\).

Step 5:

By the proposition that the determinant of any square matrix over any field is nonzero if and only if the set of the columns or the rows is linearly independent, the columns of \(M^1_1\) are linearly independent.

So, the left \(n_1\) columns of \(M\) are linearly independent, by the proposition that for any finite-dimensional columns or rows module and any linearly independent subset, any expansion of the subset into any larger-dimensional columns or rows module is linearly independent.

Likewise, the columns of \(M^2_2\) are linearly independent.

So, the right \(n_2\) columns of \(M\) are linearly independent, as before.

Let us see that the columns of \(M\) are linearly independent.

Let \(r_1 \begin{pmatrix} {M^1_1}^1_1 \\ ... \\ {M^1_1}^{n_1}_1 \\ 0 \\ ... \\ 0 \end{pmatrix} + ... + r_{n_1} \begin{pmatrix} {M^1_1}^1_{n_1} \\ ... \\ {M^1_1}^{n_1}_{n_1} \\ 0 \\ ... \\ 0 \end{pmatrix} + r_{n_1 + 1} \begin{pmatrix} {M^1_2}^1_1 \\ ... \\ {M^1_2}^{n_1}_1 \\ {M^2_2}^1_1 \\ ... \\ {M^2_2}^{n_2}_1 \end{pmatrix} + ... + r_{n_1 + n_2} \begin{pmatrix} {M^1_2}^1_{n_2} \\ ... \\ {M^1_2}^{n_1}_{n_2} \\ {M^2_2}^1_{n_2} \\ ... \\ {M^2_2}^{n_2}_{n_2} \end{pmatrix} = 0\).

So, \(r_1 \begin{pmatrix} 0 \\ ... \\ 0 \end{pmatrix} + ... + r_{n_1} \begin{pmatrix} 0 \\ ... \\ 0 \end{pmatrix} + r_{n_1 + 1} \begin{pmatrix} {M^2_2}^1_1 \\ ... \\ {M^2_2}^{n_2}_1 \end{pmatrix} + ... + r_{n_1 + n_2} \begin{pmatrix} {M^2_2}^1_{n_2} \\ ... \\ {M^2_2}^{n_2}_{n_2} \end{pmatrix} = 0\), which implies that \((r_{n_1 + 1}, ..., r_{n_1 + n_2}) = (0, ..., 0)\), because the columns of \(M^2_2\) are linearly independent.

So, \(r_1 \begin{pmatrix} {M^1_1}^1_1 \\ ... \\ {M^1_1}^{n_1}_1 \\ 0 \\ ... \\ 0 \end{pmatrix} + ... + r_{n_1} \begin{pmatrix} {M^1_1}^1_{n_1} \\ ... \\ {M^1_1}^{n_1}_{n_1} \\ 0 \\ ... \\ 0 \end{pmatrix} = 0\).

So, \(r_1 \begin{pmatrix} {M^1_1}^1_1 \\ ... \\ {M^1_1}^{n_1}_1 \end{pmatrix} + ... + r_{n_1} \begin{pmatrix} {M^1_1}^1_{n_1} \\ ... \\ {M^1_1}^{n_1}_{n_1} \end{pmatrix} = 0\), which implies that \((r_1, ..., r_{n_1}) = (0, ..., 0)\), because the columns of \(M^1_1\) are linearly independent.

So, \((r_1, ..., r_{n_1 + n_2}) = (0, ..., 0)\).

So, the columns of \(M\) are linearly independent.

So, \(det M \neq 0\), by the proposition that the determinant of any square matrix over any field is nonzero if and only if the set of the columns or the rows is linearly independent.
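This direction can also be sketched numerically (hypothetical example values): with \(M^2_1 = 0\) and both diagonal blocks nonsingular, \(det M\) stays nonzero no matter what \(M^1_2\) is.

```python
# Steps 4-5 sketch: n1 = 2, n2 = 1, M^2_1 = 0, det M^1_1 = -2, det M^2_2 = 5.
# Vary the upper-right block M^1_2 and observe det M is always nonzero.
def det3(m):
    # determinant of a 3x3 matrix (nested lists) by cofactor expansion
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

results = []
for b1 in range(-2, 3):
    for b2 in range(-2, 3):
        M = [[1, 2, b1],
             [3, 4, b2],
             [0, 0, 5]]          # M^1_1 = ((1, 2), (3, 4)); M^2_2 = (5)
        results.append(det3(M))
assert all(d != 0 for d in results)
```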

Step 6:

Let us suppose that \(M^1_2 = 0\).

Let us think of the transpose of \(M\), \(M^t\).

\(M^t = \begin{pmatrix} {M^1_1}^t & {M^2_1}^t \\ {M^1_2}^t & {M^2_2}^t \end{pmatrix}\).

Now, the lower-left block of \(M^t\) is \({M^1_2}^t = 0\).

By Steps 2 through 5 applied to \(M^t\), \(det M^t \neq 0\) if and only if \(det {M^1_1}^t \neq 0\) and \(det {M^2_2}^t \neq 0\).

So, \(det M = det M^t \neq 0\) if and only if \(det {M^1_1} = det {M^1_1}^t \neq 0\) and \(det {M^2_2} = det {M^2_2}^t \neq 0\).
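The transpose reduction can be sketched numerically (hypothetical example values): for a matrix with \(M^1_2 = 0\), transposing moves the zero block to the lower-left position while preserving the determinant.

```python
# Step 6 sketch: n1 = 2, n2 = 1, M^1_2 = 0 (block lower triangular).
def det3(m):
    # determinant of a 3x3 matrix (nested lists) by cofactor expansion
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

M = [[1, 2, 0],
     [3, 4, 0],
     [8, 9, 5]]                  # M^1_2 = 0; M^1_1 = ((1, 2), (3, 4)); M^2_2 = (5)
Mt = [list(row) for row in zip(*M)]   # transpose: lower-left block is now 0

assert det3(M) == det3(Mt)       # det M = det M^t
assert det3(M) != 0              # det M^1_1 = -2 != 0 and det M^2_2 = 5 != 0
```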


References


<The previous article in this series | The table of contents of this series | The next article in this series>