description/proof of that for any linearly independent sequence in any vectors space, any derived sequence in which each element is any linear combination of the equal or smaller index elements with any nonzero equal index coefficient is linearly independent
Topics
About: vectors space
The table of contents of this article
- Starting Context
- Target Context
- Orientation
- Main Body
- 1: Structured Description
- 2: Natural Language Description
- 3: Proof
Starting Context
- The reader knows a definition of %field name% vectors space.
- The reader knows a definition of linearly independent subset of module.
Target Context
- The reader will have a description and a proof of the proposition that for any linearly independent sequence in any vectors space, any derived sequence in which each element is any linear combination of equal or smaller index elements with any nonzero equal index coefficient is linearly independent.
Orientation
There is a list of definitions discussed so far in this site.
There is a list of propositions discussed so far in this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(V\): \(\in \{\text{ the } F \text{ vectors spaces }\}\)
\((v_1, v_2, ...)\): \(\subseteq V\), \(\in \{\text{ the linearly independent subsets of } V\}\)
\((w_1, w_2, ...)\): \(\subseteq V\)
//
Statements:
\(\forall k (w_k = \sum_{j \in \{1, ..., k\}} r_k^j v_j \text{ where } r_k^j \in F \land r_k^k \neq 0)\)
\(\implies\)
\((w_1, w_2, ...) \in \{\text{ the linearly independent subsets of } V\}\).
//
2: Natural Language Description
For any field, \(F\), any \(F\) vectors space, \(V\), and any linearly independent sequence in \(V\), \((v_1, v_2, ...)\), any sequence in \(V\), \((w_1, w_2, ...)\), such that for each \(k\), \(w_k = \sum_{j \in \{1, ..., k\}} r_k^j v_j\) where \(r_k^j \in F \land r_k^k \neq 0\), is linearly independent.
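As a concrete illustration (an example chosen here, not part of the proposition), take \(F = \mathbb{R}\), a linearly independent pair, \((v_1, v_2)\), and \(w_1 = 2 v_1\) and \(w_2 = 3 v_1 + 5 v_2\), so \(r_1^1 = 2 \neq 0\) and \(r_2^2 = 5 \neq 0\):

```latex
\[
  s^1 w_1 + s^2 w_2 = s^1 (2 v_1) + s^2 (3 v_1 + 5 v_2)
  = (2 s^1 + 3 s^2) v_1 + 5 s^2 v_2 = 0
\]
\[
  \implies 5 s^2 = 0 \implies s^2 = 0
  \implies 2 s^1 + 3 \cdot 0 = 0 \implies s^1 = 0.
\]
```

The linear independence of \((v_1, v_2)\) forces each coefficient to be \(0\), starting from the largest index, which is exactly the pattern the Proof follows in general.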
3: Proof
For each finite subset, \(S\), of \(\{w_1, w_2, ...\}\), there is the element with the maximum index, \(w_m \in S\). Let us think of \(S' := \{w_1, w_2, ..., w_m\}\). \(S \subseteq S'\).
Let us think of \(\sum_{k \in \{1, ..., m\}} s^k w_k = 0\) where \(s^k \in F\). If we prove that \(s^k = 0\) for each \(k \in \{1, ..., m\}\), the proposition will be proved, because any linear combination of the elements of \(S\) that equals \(0\) can be regarded as such a combination over \(S'\) with \(s^k = 0\) for each \(w_k \notin S\), and then all the coefficients, in particular those on the elements of \(S\), will be \(0\).
\(\sum_{k \in \{1, ..., m\}} s^k w_k = \sum_{k \in \{1, ..., m\}} s^k \sum_{j \in \{1, ..., k\}} r_k^j v_j = s^1 (r_1^1 v_1) + s^2 (r_2^1 v_1 + r_2^2 v_2) + ... + s^m (r_m^1 v_1 + ... + r_m^m v_m) = (s^1 r_1^1 + s^2 r_2^1 + ... + s^m r_m^1) v_1 + (s^2 r_2^2 + s^3 r_3^2 + ... + s^m r_m^2) v_2 + ... + (s^{m - 1} r_{m - 1}^{m - 1} + s^m r_m^{m - 1}) v_{m - 1} + s^m r_m^m v_m = 0\).
As each coefficient of \(v_j\) is \(0\), \(s^m r_m^m = 0\), but as \(r_m^m \neq 0\), \(s^m = 0\); \(s^{m - 1} r_{m - 1}^{m - 1} + s^m r_m^{m - 1} = s^{m - 1} r_{m - 1}^{m - 1} + 0 r_m^{m - 1} = 0\), which implies that \(s^{m - 1} r_{m - 1}^{m - 1} = 0\), but as \(r_{m - 1}^{m - 1} \neq 0\), \(s^{m - 1} = 0\); ...; \(s^2 r_2^2 + s^3 r_3^2 + ... + s^m r_m^2 = s^2 r_2^2 + 0 r_3^2 + ... + 0 r_m^2 = 0\), which implies that \(s^2 r_2^2 = 0\), but as \(r_2^2 \neq 0\), \(s^2 = 0\); \(s^1 r_1^1 + s^2 r_2^1 + ... + s^m r_m^1 = s^1 r_1^1 + 0 r_2^1 + ... + 0 r_m^1 = 0\), which implies that \(s^1 r_1^1 = 0\), but as \(r_1^1 \neq 0\), \(s^1 = 0\).
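The triangular elimination above can also be sanity-checked numerically. The following sketch uses assumed concrete choices (not part of the proposition): \(F = \mathbb{R}\), \(V = \mathbb{R}^m\) with the standard basis as \((v_1, ..., v_m)\), and randomly chosen coefficients \(r_k^j\) with \(r_k^k \neq 0\); the conclusion is confirmed by a rank computation.

```python
import numpy as np

# Numeric sanity check of the proposition (illustrative, not a proof):
# take linearly independent v_1, ..., v_m in R^m, derive
# w_k = sum_{j <= k} r_k^j v_j with r_k^k != 0, and confirm that
# (w_1, ..., w_m) is linearly independent via the rank of the matrix
# whose rows are the w_k.
rng = np.random.default_rng(0)
m = 4

# Rows of V are linearly independent vectors v_1, ..., v_m
# (here, the standard basis of R^m).
V = np.eye(m)

# R is lower triangular: R[k - 1][j - 1] = r_k^j for j <= k, 0 otherwise.
R = np.tril(rng.standard_normal((m, m)))
np.fill_diagonal(R, rng.uniform(1.0, 2.0, size=m))  # force r_k^k != 0

# Row k of W is w_k = sum_{j <= k} r_k^j v_j.
W = R @ V

# Full rank means the w_k are linearly independent.
assert np.linalg.matrix_rank(W) == m
```

The proof's "largest index first" elimination corresponds to back substitution on the lower triangular coefficient matrix \(R\), which is invertible exactly because its diagonal entries \(r_k^k\) are nonzero.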