A description/proof that, for any finite-dimensional vector space, any proper subspace has a lower dimension
Topics
About: vector space
The table of contents of this article
- Starting Context
- Target Context
- Orientation
- Main Body
- 1: Structured Description
- 2: Natural Language Description
- 3: Proof
Starting Context
Target Context
- The reader will have a description and a proof of the proposition that for any finite-dimensional vector space, any proper subspace has a lower dimension.
Orientation
There is a list of definitions discussed so far on this site.
There is a list of propositions discussed so far on this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\(F\): \(\in \{\text{ the fields }\}\)
\(V'\): \(\in \{\text{ the } d' \text{ -dimensional } F \text{ vector spaces }\}\)
\(V\): \(\in \{\text{ the } d \text{ -dimensional proper vector subspaces of } V'\}\)
//
Statements:
\(d \lt d'\)
//
2: Natural Language Description
For any field, \(F\), any \(d'\)-dimensional \(F\) vector space, \(V'\), and any \(d\)-dimensional proper vector subspace, \(V \subseteq V'\), \(d \lt d'\).
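As an illustration, which is not part of the proposition itself, take \(F = \mathbb{R}\), \(V' = \mathbb{R}^3\), and \(V = \{(x, y, 0) \in \mathbb{R}^3 \mid x, y \in \mathbb{R}\}\): \(V\) is a proper vector subspace of \(V'\), and indeed,
\(d = 2 \lt 3 = d'\).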
3: Proof
Whole Strategy: Step 1: suppose that \(d' \le d\); Step 2: find a contradiction: find a set of more than \(d'\) linearly independent elements of \(V'\), which would contradict the proposition that for any finite-dimensional vector space, there is no linearly independent subset that has more elements than the dimension.
Step 1:
Let us suppose, to the contrary, that \(d' \le d\), which is the negation of the claim, \(d \lt d'\).
Step 2:
Step 2 Strategy: Step 2-1: take a basis of \(V\) with \(d\) elements; Step 2-2: take any element of \(V' \setminus V\); Step 2-3: show that the basis with the element added would be linearly independent in \(V'\).
Step 2-1:
\(V\) would have a basis, \(\{e_1, ..., e_d\}\), with \(d\) elements, because \(V\) is \(d\)-dimensional.
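For intuition only, in the \(\mathbb{R}^3\) illustration above, one such basis is:
\(\{e_1, e_2\} = \{(1, 0, 0), (0, 1, 0)\}\).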
Step 2-2:
There would be a \(v \in V' \setminus V\), because \(V\) is a proper subspace of \(V'\).
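In the same illustration, one such element is:
\(v = (0, 0, 1)\),
which lies in \(\mathbb{R}^3\) but not in the plane, \(V\).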
Step 2-3:
Let us prove that \(\{e_1, ..., e_d, v\}\) would be linearly independent in \(V'\).
Let us suppose otherwise.
\(c^1 e_1 + ... + c^d e_d + c v = 0\) would have a nonzero solution for \((c^1, ..., c^d, c)\). In fact, \(c \neq 0\), because otherwise, \(c^1 e_1 + ... + c^d e_d = 0\), which would imply \(c^1 = ... = c^d = 0\), because the basis is linearly independent, contradicting that the solution is nonzero. So, \(v = - c^{-1} (c^1 e_1 + ... + c^d e_d)\), a contradiction against \(v \notin V\).
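To spell the last step out, dividing by \(c \neq 0\) and rearranging gives:
\(v = - c^{-1} c^1 e_1 - ... - c^{-1} c^d e_d\),
which is a linear combination of the basis of \(V\), hence an element of \(V\), against \(v \in V' \setminus V\).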
So, \(\{e_1, ..., e_d, v\}\) would be linearly independent in \(V'\).
So, \(V'\) would have a set of \(d + 1\) linearly independent vectors, but \(d' \lt d + 1\), a contradiction against the proposition that for any finite-dimensional vector space, there is no linearly independent subset that has more elements than the dimension.
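To make the inequality explicit, under the supposition of Step 1:
\(d' \le d \lt d + 1\),
so the \((d + 1)\)-element set, \(\{e_1, ..., e_d, v\}\), has more elements than the dimension, \(d'\), of \(V'\).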
So, the supposition was false, and \(d \lt d'\).