description/proof of that for finite-dimensional columns or rows module and linearly independent subset, expansion of subset into larger-dimensional columns or rows module is linearly independent
Topics
About: module
The table of contents of this article
Starting Context
- The reader knows a definition of %ring name% module.
Target Context
- The reader will have a description and a proof of the proposition that for any finite-dimensional columns or rows module and any linearly independent subset, any expansion of the subset into any larger-dimensional columns or rows module is linearly independent.
Orientation
There is a list of definitions discussed so far in this site.
There is a list of propositions discussed so far in this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\(R\): \(\in \{\text{ the rings }\}\)
\(d\): \(\in \mathbb{N}\)
\(d'\): \(\in \mathbb{N}\), such that \(d \lt d'\)
\(M_c\): \(= \{(r^1, ..., r^d)^t \vert r^j \in R\}\), \(\in \{\text{ the } R \text{ modules }\}\)
\(M_r\): \(= \{(r^1, ..., r^d) \vert r^j \in R\}\), \(\in \{\text{ the } R \text{ modules }\}\)
\(S_c\): \(= \{({r_1}^1, ..., {r_1}^d)^t, ..., ({r_n}^1, ..., {r_n}^d)^t\}\), \(\in \{\text{ the linearly independent subsets of } M_c\}\)
\(S_r\): \(= \{({r_1}^1, ..., {r_1}^d), ..., ({r_n}^1, ..., {r_n}^d)\}\), \(\in \{\text{ the linearly independent subsets of } M_r\}\)
\(M'_c\): \(= \{(r^1, ..., r^{d'})^t \vert r^j \in R\}\), \(\in \{\text{ the } R \text{ modules }\}\)
\(M'_r\): \(= \{(r^1, ..., r^{d'}) \vert r^j \in R\}\), \(\in \{\text{ the } R \text{ modules }\}\)
\(S'_c\): \(= \{({r_1}^1, ..., {r_1}^d, {r_1}^{d + 1}, ..., {r_1}^{d'})^t, ..., ({r_n}^1, ..., {r_n}^d, {r_n}^{d + 1}, ..., {r_n}^{d'})^t\}\), where the added components are arbitrary
\(S'_r\): \(= \{({r_1}^1, ..., {r_1}^d, {r_1}^{d + 1}, ..., {r_1}^{d'}), ..., ({r_n}^1, ..., {r_n}^d, {r_n}^{d + 1}, ..., {r_n}^{d'})\}\), where the added components are arbitrary
//
Statements:
\(S'_c \in \{\text{ the linearly independent subsets of } M'_c\}\)
\(\land\)
\(S'_r \in \{\text{ the linearly independent subsets of } M'_r\}\)
//
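As a minimal illustration (the ring, the dimensions, and the entries here are chosen only for this example and are not part of the proposition), take \(R = \mathbb{Z}\), \(d = 2\), \(d' = 3\), and \(S_c = \{(1, 0)^t, (0, 1)^t\}\), which is a linearly independent subset of \(M_c\); then any expansion like \(S'_c = \{(1, 0, 5)^t, (0, 1, 7)^t\}\), with the 3rd components chosen arbitrarily, is claimed to be a linearly independent subset of \(M'_c\), and likewise for the rows case.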
2: Proof
Whole Strategy: Step 1: take \(c_1 ({r_1}^1, ..., {r_1}^d, {r_1}^{d + 1}, ..., {r_1}^{d'})^t + ... + c_n ({r_n}^1, ..., {r_n}^d, {r_n}^{d + 1}, ..., {r_n}^{d'})^t = 0\) and see that all the \(c_j\) s are \(0\); Step 2: take \(c_1 ({r_1}^1, ..., {r_1}^d, {r_1}^{d + 1}, ..., {r_1}^{d'}) + ... + c_n ({r_n}^1, ..., {r_n}^d, {r_n}^{d + 1}, ..., {r_n}^{d'}) = 0\) and see that all the \(c_j\) s are \(0\).
Step 1:
Let us take \(c_1 ({r_1}^1, ..., {r_1}^d, {r_1}^{d + 1}, ..., {r_1}^{d'})^t + ... + c_n ({r_n}^1, ..., {r_n}^d, {r_n}^{d + 1}, ..., {r_n}^{d'})^t = 0\).
That implies that \(c_1 ({r_1}^1, ..., {r_1}^d)^t + ... + c_n ({r_n}^1, ..., {r_n}^d)^t = 0\), because all the \(d'\) components of the left-hand side are \(0\), so, in particular, the 1st, ..., \(d\)-th components are \(0\).
As \(S_c\) is linearly independent, all the \(c_j\) s are \(0\).
So, \(S'_c\) is linearly independent.
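To see Step 1 in action on the illustrative entries introduced after the Structured Description (again, those specific entries are only an example, not part of the proof), take \(c_1 (1, 0, 5)^t + c_2 (0, 1, 7)^t = (c_1, c_2, 5 c_1 + 7 c_2)^t = 0\): the 1st and the 2nd components give \(c_1 (1, 0)^t + c_2 (0, 1)^t = 0\), and the linear independence of \(S_c = \{(1, 0)^t, (0, 1)^t\}\) gives \(c_1 = c_2 = 0\), while the 3rd component, \(5 c_1 + 7 c_2\), never needs to be inspected.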
Step 2:
Let us take \(c_1 ({r_1}^1, ..., {r_1}^d, {r_1}^{d + 1}, ..., {r_1}^{d'}) + ... + c_n ({r_n}^1, ..., {r_n}^d, {r_n}^{d + 1}, ..., {r_n}^{d'}) = 0\).
That implies that \(c_1 ({r_1}^1, ..., {r_1}^d) + ... + c_n ({r_n}^1, ..., {r_n}^d) = 0\), because all the \(d'\) components of the left-hand side are \(0\), so, in particular, the 1st, ..., \(d\)-th components are \(0\).
As \(S_r\) is linearly independent, all the \(c_j\) s are \(0\).
So, \(S'_r\) is linearly independent.