2025-08-03

1226: Cramer's Rule for System of Linear Equations

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of Cramer's rule for system of linear equations

Topics


About: field

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of Cramer's rule for any system of linear equations.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(m\): \(\in \mathbb{N} \setminus \{0\}\)
\(n\): \(\in \mathbb{N} \setminus \{0\}\)
\(M\): \(\in \{\text{ the } m \times n F \text{ matrices }\}\), with rank \(k\) such that the top-left \(k \times k\) submatrix has a nonzero determinant
\(c\): \(\in \{\text{ the } m \times 1 F \text{ matrices }\}\)
\(M \begin{pmatrix} x^1 \\ ... \\ x^n \end{pmatrix} = c\)
//

Statements:
\(\forall j \in \{k + 1, ..., m\} (det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} = 0)\)
\(\iff\)
\(M \begin{pmatrix} x^1 \\ ... \\ x^n \end{pmatrix} = c\) has some solutions
\(\iff\)
These are the solutions: \((x^{k + 1}, ..., x^n)\) is arbitrary; \(\begin{pmatrix} x^1 \\ ... \\ x^k \end{pmatrix} = {\begin{pmatrix} M^1_1 & ... & M^1_k \\ ... \\ M^k_1 & ... & M^k_k \end{pmatrix}}^{-1} \begin{pmatrix} c^1 - \sum_{j \in \{k + 1, ..., n\}} M^1_j x^j \\ ... \\ c^k - \sum_{j \in \{k + 1, ..., n\}} M^k_j x^j \end{pmatrix}\)
//

This proposition presupposes that the top-left \(k \times k\) submatrix has a nonzero determinant; if that is not the case, the system can be rearranged to be so, by reordering the equations and the components of \(x\); so, this proposition can be applied to any system.

When \(k = m\), the determinant condition is vacuously satisfied, because \(\{k + 1, ..., m\}\) is empty.
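
Here is a minimal computational sketch of the statement, assuming exact arithmetic over the rationals (one particular field) with sympy, and assuming a hypothetical \(2 \times 3\) example whose top-left \(k \times k\) submatrix is already nonsingular; it checks the determinant condition and produces the pinned components in terms of the arbitrary ones.

```python
# A minimal sketch of the proposition over the rational field, using sympy
# (the example matrix and right-hand side are hypothetical).
from sympy import Matrix, symbols

M = Matrix([[1, 2, 3],
            [2, 4, 6]])                  # m = 2, n = 3, rank k = 1
c = Matrix([5, 10])

k = M.rank()
A = M[:k, :k]                            # top-left k x k submatrix
assert A.det() != 0, "rearrange the system first so that det(A) != 0"

# the condition: every bordered (k+1) x (k+1) determinant vanishes
solvable = all(
    Matrix.hstack(M.extract(list(range(k)) + [j], list(range(k))),
                  Matrix([c[i] for i in list(range(k)) + [j]])).det() == 0
    for j in range(k, M.rows))
print("solvable:", solvable)

if solvable:
    # x^{k+1}, ..., x^n are arbitrary; the first k components are pinned
    free = symbols(f"x{k + 1}:{M.cols + 1}")          # here (x2, x3)
    rhs = Matrix([c[i] - sum(M[i, k + l] * free[l] for l in range(M.cols - k))
                  for i in range(k)])
    pinned = A.inv() * rhs
    print("x^1, ..., x^k =", list(pinned))            # x^1 = 5 - 2 x^2 - 3 x^3 here
```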


2: Note


It may be possible to choose a different set of \(n - k\) components of \(x\) arbitrarily, but that does not produce any new solution.

For example, let \(\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} x^1 \\ x^2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}\); then, \(k = 1\), and the solutions are obtained by taking \(x^2\) arbitrary and \(x^1 = - 2 x^2\); but the system can also be expressed as \(\begin{pmatrix} 2 & 1 \\ 4 & 2 \end{pmatrix} \begin{pmatrix} x^2 \\ x^1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}\), and then the solutions are obtained by taking \(x^1\) arbitrary and \(x^2 = - x^1 / 2\); but in fact, the latter set of solutions equals the former set of solutions, as is checked below.
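
A quick sanity check of this example (sympy and the parameter \(t\) are assumptions of this sketch, not part of the article): both parameterizations satisfy the system and describe the same solution set.

```python
# Both parameterizations of the 2 x 2 example give the same solution set.
from sympy import Matrix, symbols

t = symbols("t")
M = Matrix([[1, 2],
            [2, 4]])

# first parameterization: x^2 = t arbitrary, x^1 = -2 t
x_first = Matrix([-2 * t, t])

# second parameterization: x^1 = s arbitrary, x^2 = -s / 2; writing s = -2 t
s = -2 * t
x_second = Matrix([s, -s / 2])

assert M * x_first == Matrix([0, 0])          # it solves the system
assert x_first == x_second                    # and coincides with the other form
print("both parameterizations describe the same solution set")
```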


3: Proof


Whole Strategy: Step 1: see that when \(m = n = k\), \(x = M^{-1} c\) is the unique solution; Step 2: suppose that \(M \begin{pmatrix} x^1 \\ ... \\ x^n \end{pmatrix} = c\) has some solutions, and see that \(det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} = 0\); Step 3: suppose that \(det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} = 0\), and see that \(\begin{pmatrix} x^1 \\ ... \\ x^k \end{pmatrix} = \begin{pmatrix} M^1_1 & ... & M^1_k \\ ... \\ M^k_1 & ... & M^k_k \end{pmatrix}^{-1} \begin{pmatrix} c^1 - \sum_{j \in \{k + 1, ..., n\}} M^1_j x^j \\ ... \\ c^k - \sum_{j \in \{k + 1, ..., n\}} M^k_j x^j \end{pmatrix}\) are the solutions.

Step 1:

Let us think of the special case, \(m = n = k\).

That means that \(det M \neq 0\).

There is the inverse of \(M\), \(M^{-1}\), by the proposition that over any field, any square matrix has the inverse iff its determinant is nonzero, and the inverse is this.

\(M^{-1} M \begin{pmatrix} x^1 \\ ... \\ x^n \end{pmatrix} = M^{-1} c\), but the left hand side is \(I \begin{pmatrix} x^1 \\ ... \\ x^n \end{pmatrix} = \begin{pmatrix} x^1 \\ ... \\ x^n \end{pmatrix}\).

So, \(x = M^{-1} c\) is necessary.

And \(x = M^{-1} c\) is indeed a solution, because \(M x = M M^{-1} c = I c = c\).
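
As a small illustration of Step 1, here is a sketch with a hypothetical \(2 \times 2\) rational matrix (sympy is an assumption of the sketch, not of the proof).

```python
# Step 1 on a hypothetical square, nonsingular example: x = M^{-1} c is the unique solution.
from sympy import Matrix

M = Matrix([[2, 1],
            [1, 3]])             # det M = 5, nonzero
c = Matrix([3, 5])

assert M.det() != 0
x = M.inv() * c                  # the only candidate, as in the proof
assert M * x == c                # and it is indeed a solution
print(list(x))                   # [4/5, 7/5]
```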

Step 2:

Let us suppose that \(M \begin{pmatrix} x^1 \\ ... \\ x^n \end{pmatrix} = c\) has some solutions.

When \(m = k\), \(\forall j \in \{k + 1, ..., m\} (det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} = 0)\) is vacuously true.

Let us suppose that \(k \lt m\).

Let us think of \(det \begin{pmatrix} M^1_1 & ... & M^1_k & \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1} \\ ... \\ M^k_1 & ... & M^k_k & \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k} \\ M^j_1 & ... & M^j_k & \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j} \end{pmatrix}\).

\(= \sum_{l \in \{1, ..., n\}} det \begin{pmatrix} M^1_1 & ... & M^1_k & M^1_l x^l \\ ... \\ M^k_1 & ... & M^k_k & M^k_l x^l \\ M^j_1 & ... & M^j_k & M^j_l x^l \end{pmatrix}\), because the determinant is additive in the last column, by a property of determinant of square matrix over commutative ring, \(= \sum_{l \in \{1, ..., n\}} x^l det \begin{pmatrix} M^1_1 & ... & M^1_k & M^1_l \\ ... \\ M^k_1 & ... & M^k_k & M^k_l \\ M^j_1 & ... & M^j_k & M^j_l \end{pmatrix}\), because a factor of a column can be taken out of the determinant, by a property of determinant of square matrix over commutative ring.

For each \(l \le k\), the determinant is \(0\), because the last column duplicates the \(l\)-th column; for each \(k \lt l\), the determinant is \(0\), because the rank of \(M\) is \(k\), so every \((k + 1) \times (k + 1)\) submatrix of \(M\) has the zero determinant.

So, the whole expression is \(0\).

So, \(det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} = det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} - 0 = det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} - det \begin{pmatrix} M^1_1 & ... & M^1_k & \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1} \\ ... \\ M^k_1 & ... & M^k_k & \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k} \\ M^j_1 & ... & M^j_k & \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j} \end{pmatrix} = det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 - \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1} \\ ... \\ M^k_1 & ... & M^k_k & c^k - \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k} \\ M^j_1 & ... & M^j_k & c^j - \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j} \end{pmatrix}\), by some properties of determinant of square matrix over commutative ring.
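
As a quick check of the two determinant rewritings above, here is a sketch on a hypothetical rank-\(1\) example (sympy and the concrete numbers are assumptions of the sketch).

```python
# The determinant with last column M x vanishes because rank M = k, so subtracting
# M x from the last column c does not change the bordered determinant.
from sympy import Matrix

M = Matrix([[1, 2],
            [2, 4],
            [3, 6]])                     # rank k = 1
x = Matrix([5, 7])                       # any x, not necessarily a solution
c = Matrix([11, 13, 17])                 # any right-hand side

k, j = 1, 2                              # rows 1, ..., k and row j (0-based: 0 and 2)
rows = list(range(k)) + [j]

def border(last):
    return Matrix.hstack(M.extract(rows, list(range(k))),
                         Matrix([last[i] for i in rows]))

Mx = M * x
assert border(Mx).det() == 0                        # the "= 0" computation
assert border(c).det() == border(c - Mx).det()      # the column subtraction step
print("the two determinant rewritings check out")
```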

As there is a solution, \(x_0\), let us put \(x = x_0\) into the above equation, which is valid because the equation holds for any \(x\).

Then, the last column is \(0\), so, the determinant is \(0\).

So, \(det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} = 0\).
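
Here is a sketch of Step 2's conclusion on a hypothetical consistent system with \(k \lt m\), assuming sympy: every bordered determinant vanishes.

```python
# A check of Step 2: for a consistent system, each bordered (k+1) x (k+1) determinant is 0.
from sympy import Matrix

M = Matrix([[1, 2],
            [2, 4],
            [3, 6]])             # m = 3, n = 2, rank k = 1
x0 = Matrix([1, 1])              # a known solution
c = M * x0                       # so the system M x = c has a solution

k = M.rank()
for j in range(k, M.rows):
    rows = list(range(k)) + [j]
    bordered = Matrix.hstack(M.extract(rows, list(range(k))),
                             Matrix([c[i] for i in rows]))
    assert bordered.det() == 0
print("every bordered determinant vanishes, as Step 2 claims")
```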

Step 3:

Let us suppose that \(\forall j \in \{k + 1, ..., m\} (det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix} = 0)\).

Let \(N := \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 - \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1} \\ ... \\ M^k_1 & ... & M^k_k & c^k - \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k} \\ M^j_1 & ... & M^j_k & c^j - \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j} \end{pmatrix}\).

By applying the proposition that the Laplace expansion of the determinant of any square matrix over any commutative ring holds and its corollary with respect to the last column, \(det N = \sum_{p \in \{1, ..., k + 1\}} N^p_{k + 1} N_{p, k + 1} = (c^1 - \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1}) N_{1, k + 1} + ... + (c^k - \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k}) N_{k, k + 1} + (c^j - \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j}) N_{k + 1, k + 1}\), where \(N_{p, k + 1}\) denotes the \((p, k + 1)\) cofactor of \(N\) and \(N_{k + 1, k + 1}\) is nothing but the determinant of the top-left \(k \times k\) submatrix of \(M\), which is nonzero.

As has been seen in Step 2, \(det N = det \begin{pmatrix} M^1_1 & ... & M^1_k & c^1 \\ ... \\ M^k_1 & ... & M^k_k & c^k \\ M^j_1 & ... & M^j_k & c^j \end{pmatrix}\), which is \(0\) by the supposition.

So, \(det N = (c^1 - \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1}) N_{1, k + 1} + ... + (c^k - \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k}) N_{k, k + 1} + (c^j - \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j}) N_{k + 1, k + 1} = 0\).
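
As a sanity check of the cofactor expansion used here, the following sketch (with a hypothetical \(3 \times 3\) bordered matrix; sympy's Matrix.cofactor is an assumption of the sketch) confirms the expansion along the last column and that \(N_{k + 1, k + 1}\) is the determinant of the top-left \(k \times k\) submatrix.

```python
# Checking the Laplace expansion along the last column on a hypothetical bordered matrix.
from sympy import Matrix, symbols

k = 2
x3 = symbols("x3")
N = Matrix([[1, 2, 4 - x3],
            [2, 3, 6 - x3],
            [3, 5, 10 - 2 * x3]])    # third row = first row + second row, so det N = 0

expansion = sum(N[p, k] * N.cofactor(p, k) for p in range(k + 1))
assert (N.det() - expansion).expand() == 0         # det N = sum_p N^p_{k+1} N_{p,k+1}
assert N.cofactor(k, k) == Matrix([[1, 2],
                                   [2, 3]]).det()  # N_{k+1,k+1} = det of top-left block
print("the expansion along the last column checks out")
```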

Let \((x^{k + 1}, ..., x^n)\) be arbitrary.

The system of the top \(k\) equations becomes \(\begin{pmatrix} M^1_1 & ... & M^1_k \\ ... \\ M^k_1 & ... & M^k_k \end{pmatrix} \begin{pmatrix} x^1 \\ ... \\ x^k \end{pmatrix} = \begin{pmatrix} c^1 - \sum_{j \in \{k + 1, ..., n\}} M^1_j x^j \\ ... \\ c^k - \sum_{j \in \{k + 1, ..., n\}} M^k_j x^j \end{pmatrix}\).

By Step 1, \(\begin{pmatrix} x^1 \\ ... \\ x^k \end{pmatrix} = \begin{pmatrix} M^1_1 & ... & M^1_k \\ ... \\ M^k_1 & ... & M^k_k \end{pmatrix}^{-1} \begin{pmatrix} c^1 - \sum_{j \in \{k + 1, ..., n\}} M^1_j x^j \\ ... \\ c^k - \sum_{j \in \{k + 1, ..., n\}} M^k_j x^j \end{pmatrix}\) is the unique solution for the system of the \(k\) equations.

In \((c^1 - \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1}) N_{1, k + 1} + ... + (c^k - \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k}) N_{k, k + 1} + (c^j - \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j}) N_{k + 1, k + 1} = 0\), \(c^1 - \sum_{l_1 \in \{1, ..., n\}} M^1_{l_1} x^{l_1} = ... = c^k - \sum_{l_k \in \{1, ..., n\}} M^k_{l_k} x^{l_k} = 0\), because the top \(k\) equations are satisfied by that choice of \((x^1, ..., x^k)\), so, \((c^j - \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j}) N_{k + 1, k + 1} = 0\), but as \(N_{k + 1, k + 1} \neq 0\), \(c^j - \sum_{l_j \in \{1, ..., n\}} M^j_{l_j} x^{l_j} = 0\).

So, each such \(x\) satisfies the original system.

So, there are some solutions of the original system.

There is no other solution, because \(\begin{pmatrix} x^1 \\ ... \\ x^k \end{pmatrix} = \begin{pmatrix} M^1_1 & ... & M^1_k \\ ... \\ M^k_1 & ... & M^k_k \end{pmatrix}^{-1} \begin{pmatrix} c^1 - \sum_{j \in \{k + 1, ..., n\}} M^1_j x^j \\ ... \\ c^k - \sum_{j \in \{k + 1, ..., n\}} M^k_j x^j \end{pmatrix}\) is a necessary condition: for any solution, \(x\), \((x^{k + 1}, ..., x^n)\) needs to take some value, \((x^1, ..., x^k)\) is uniquely determined by that value, and so, that solution is one of the above solutions.

So, such \(x\)-s are exactly the solutions.
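
Finally, here is a sketch of the whole construction of Step 3 on a hypothetical \(3 \times 3\) rank-\(2\) example (sympy and the example are assumptions of the sketch): the pinned components computed from the inverted \(k \times k\) block satisfy all \(m\) equations for any value of the free components.

```python
# A symbolic check of Step 3: with the free components left as symbols, the pinned
# components from the inverted k x k block satisfy every equation, not just the top k.
from sympy import Matrix, symbols, simplify

M = Matrix([[1, 2, 1],
            [2, 3, 1],
            [3, 5, 2]])          # hypothetical example: rank k = 2, m = n = 3
c = Matrix([4, 6, 10])           # chosen so that the system is consistent

k = M.rank()
A = M[:k, :k]
assert A.det() != 0

free = symbols(f"x{k + 1}:{M.cols + 1}")         # here just (x3,)
rhs = Matrix([c[i] - sum(M[i, k + l] * free[l] for l in range(M.cols - k))
              for i in range(k)])
pinned = A.inv() * rhs                           # x^1, ..., x^k in terms of x^3

x = Matrix(list(pinned) + list(free))
residual = M * x - c
assert all(simplify(r) == 0 for r in residual)   # every equation holds for any x^3
print("general solution:", list(x))
```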


References


<The previous article in this series | The table of contents of this series | The next article in this series>