2025-02-16

1006: Free Vectors Space on Set

<The previous article in this series | The table of contents of this series | The next article in this series>

definition of free vectors space on set

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a definition of free vectors space on set.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\( S\): \(\in \{\text{ the sets }\}\)
\( F\): \(\in \{\text{ the fields }\}\)
\(*F (S, F)\): \(= \{f: S \to F \in Pow (S \times F) \vert f^{-1} (F \setminus \{0\}) \in \{\text{ the finite sets }\}\}\) with the addition and the scalar multiplication specified below, \(\in \{\text{ the } F \text{ vectors spaces }\}\)
//

Conditions:
\(\forall f_1, f_2 \in F (S, F) (f_1 + f_2: s \mapsto f_1 (s) + f_2 (s))\)
\(\land\)
\(\forall r \in F, \forall f \in F (S, F) (r f: s \mapsto r f (s))\)
//
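As an informal sketch (not part of the definition), an element of \(F (S, F)\) can be modeled as a finitely-supported map: for example, a Python dict that stores only the finitely many points of \(S\) with nonzero values. The names, `add` and `smul`, and the choice of \(F = \mathbb{Q}\) via `Fraction` are illustrative assumptions, not anything prescribed by the definition.

```python
from fractions import Fraction

# Dict model of F(S, F): store only the finitely many s with f(s) != 0;
# absent keys mean f(s) = 0.  F = Q via Fraction; names are illustrative.
def add(f1, f2):
    """Pointwise addition f1 + f2: s |-> f1(s) + f2(s), dropping zero values."""
    out = dict(f1)
    for s, r in f2.items():
        out[s] = out.get(s, 0) + r
        if out[s] == 0:
            del out[s]   # keep the dict canonical: only nonzero values stored
    return out

def smul(r, f):
    """Scalar multiplication r f: s |-> r f(s)."""
    if r == 0:
        return {}
    return {s: r * v for s, v in f.items()}

f1 = {"a": Fraction(1), "b": Fraction(2)}    # 1 a + 2 b
f2 = {"b": Fraction(-2), "c": Fraction(3)}   # (-2) b + 3 c
assert add(f1, f2) == {"a": Fraction(1), "c": Fraction(3)}        # 1 a + 3 c
assert smul(Fraction(2), f1) == {"a": Fraction(2), "b": Fraction(4)}
```

Dropping the zero values keeps the dict representation canonical, so dict equality coincides with equality of functions.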


2: Note


The 1st "\(F\)" in "\(F (S, F)\)" does not mean the field, \(F\), but is the abbreviation of "Free". So, when the field is \(\mathbb{R}\), it becomes \(F (S, \mathbb{R})\).

An \(f\) is often denoted as \(r^1 s_1 + ...+ r^k s_k\) where \(r^j \in F\) and \(s_j \in S\), which means that \(f (s_j) = r^j\) and \(f (s) = 0\) for any \(s \notin \{s_1, ..., s_k\}\). The expression is commutative: \(r^1 s_1 + r^2 s_2+ r^3 s_3 = r^2 s_2 + r^1 s_1 + r^3 s_3 = ...\), because that does not change \(f\) at all.
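The commutativity of the notation can be illustrated with a small sketch: in a hypothetical dict model of \(f\) (keys are the points with nonzero values), the order in which the terms are listed does not matter.

```python
# Sketch: 5 s1 + 7 s2 + (-1) s3 and 7 s2 + (-1) s3 + 5 s1 denote the same
# finitely-supported function; dict equality ignores the listing order.
one_order = {"s1": 5, "s2": 7, "s3": -1}      # 5 s1 + 7 s2 + (-1) s3
another_order = {"s2": 7, "s3": -1, "s1": 5}  # 7 s2 + (-1) s3 + 5 s1
assert one_order == another_order
```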

The operations are indeed well-defined: \((f_1 + f_2)^{-1} (F \setminus \{0\}) \subseteq f_1^{-1} (F \setminus \{0\}) \cup f_2^{-1} (F \setminus \{0\})\), which is finite as the union of 2 finite sets, and \((r f)^{-1} (F \setminus \{0\}) \subseteq f^{-1} (F \setminus \{0\})\), which is finite.
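A hedged sketch of why the supports stay finite: in a dict model of \(F (S, F)\), the support of a sum is contained in the union of the supports. The helper names, `support` and `add`, are illustrative.

```python
# Sketch: the support of f1 + f2 (the points with nonzero value) is contained
# in the union of the 2 supports, hence finite whenever both are finite.
def support(f):
    """Keys of the dict model with nonzero values."""
    return {s for s, r in f.items() if r != 0}

def add(f1, f2):
    """Pointwise addition on the dict model (absent keys count as 0)."""
    out = dict(f1)
    for s, r in f2.items():
        out[s] = out.get(s, 0) + r
    return out

f1 = {"a": 1, "b": 2}
f2 = {"b": -2, "c": 3}
assert support(add(f1, f2)) <= support(f1) | support(f2)
assert support(add(f1, f2)) == {"a", "c"}   # "b" cancelled out of the support
```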

Let us see that \(F (S, F)\) is indeed an \(F\) vectors space.

1) for any elements, \(v_1, v_2 \in F (S, F)\), \(v_1 + v_2 \in F (S, F)\) (closed-ness under addition): already seen.

2) for any elements, \(v_1, v_2 \in F (S, F)\), \(v_1 + v_2 = v_2 + v_1\) (commutativity of addition): for each \(s \in S\), \((v_1 + v_2) (s) = v_1 (s) + v_2 (s) = v_2 (s) + v_1 (s) = (v_2 + v_1) (s)\).

3) for any elements, \(v_1, v_2, v_3 \in F (S, F)\), \((v_1 + v_2) + v_3 = v_1 + (v_2 + v_3)\) (associativity of additions): for each \(s \in S\), \(((v_1 + v_2) + v_3) (s) = (v_1 + v_2) (s) + v_3 (s) = v_1 (s) + v_2 (s) + v_3 (s) = v_1 (s) + (v_2 (s) + v_3 (s)) = v_1 (s) + (v_2 + v_3) (s) = (v_1 + (v_2 + v_3)) (s)\).

4) there is a 0 element, \(0 \in F (S, F)\), such that for any \(v \in F (S, F)\), \(v + 0 = v\) (existence of 0 vector): the \(0\) function, \(f_0\), is \(0\), because for each \(s \in S\), \((v + f_0) (s) = v (s) + f_0 (s) = v (s) + 0 = v (s)\).

5) for any element, \(v \in F (S, F)\), there is an inverse element, \(v' \in F (S, F)\), such that \(v' + v = 0\) (existence of inverse vector): \(- v: s \mapsto - v (s)\), which is in \(F (S, F)\) because its support equals that of \(v\), is a \(v'\), because for each \(s \in S\), \((-v + v) (s) = - v (s) + v (s) = 0 = f_0 (s)\).

6) for any element, \(v \in F (S, F)\), and any scalar, \(r \in F\), \(r . v \in F (S, F)\) (closed-ness under scalar multiplication): already seen.

7) for any element, \(v \in F (S, F)\), and any scalars, \(r_1, r_2 \in F\), \((r_1 + r_2) . v = r_1 . v + r_2 . v\) (scalar multiplication distributivity for scalars addition): for each \(s \in S\), \(((r_1 + r_2) . v) (s) = (r_1 + r_2) v (s) = r_1 v (s) + r_2 v (s) = (r_1 v) (s) + (r_2 v) (s) = (r_1 . v + r_2 . v) (s)\).

8) for any elements, \(v_1, v_2 \in F (S, F)\), and any scalar, \(r \in F\), \(r . (v_1 + v_2) = r . v_1 + r . v_2\) (scalar multiplication distributivity for vectors addition): for each \(s \in S\), \((r . (v_1 + v_2)) (s) = r (v_1 + v_2) (s) = r (v_1 (s) + v_2 (s)) = r v_1 (s) + r v_2 (s) = (r v_1) (s) + (r v_2) (s) = (r . v_1 + r . v_2) (s)\).

9) for any element, \(v \in F (S, F)\), and any scalars, \(r_1, r_2 \in F\), \((r_1 r_2) . v = r_1 . (r_2 . v)\) (associativity of scalar multiplications): for each \(s \in S\), \(((r_1 r_2) . v) (s) = (r_1 r_2) v (s) = r_1 (r_2 v (s)) = r_1 (r_2 v) (s) = (r_1 . (r_2 . v)) (s)\).

10) for any element, \(v \in F (S, F)\), \(1 . v = v\) (identity of 1 multiplication): for each \(s \in S\), \((1 . v) (s) = 1 v (s) = v (s)\).
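The pointwise checks above can also be spot-checked numerically. This is only an illustrative sketch over \(F = \mathbb{Q}\) with hypothetical helper names, not a proof: random finitely-supported functions are tested against axioms 2) through 5) and 7) through 10).

```python
from fractions import Fraction
import random

def add(f1, f2):
    """Pointwise addition on the dict model (absent keys count as 0)."""
    out = dict(f1)
    for s, r in f2.items():
        out[s] = out.get(s, 0) + r
    return out

def smul(r, f):
    """Scalar multiplication r f: s |-> r f(s)."""
    return {s: r * v for s, v in f.items()}

def eq(f1, f2):
    """Equality as functions S -> F: absent keys count as 0."""
    return all(f1.get(s, 0) == f2.get(s, 0) for s in set(f1) | set(f2))

random.seed(0)
points = ["a", "b", "c", "d"]

def rand_vec():
    """A random finitely-supported function on 3 of the 4 sample points."""
    return {s: Fraction(random.randint(-5, 5)) for s in random.sample(points, 3)}

for _ in range(100):
    v1, v2, v3 = rand_vec(), rand_vec(), rand_vec()
    r1, r2 = Fraction(random.randint(-5, 5)), Fraction(random.randint(-5, 5))
    assert eq(add(v1, v2), add(v2, v1))                                # 2)
    assert eq(add(add(v1, v2), v3), add(v1, add(v2, v3)))              # 3)
    assert eq(add(v1, {}), v1)                                         # 4)
    assert eq(add(smul(Fraction(-1), v1), v1), {})                     # 5)
    assert eq(smul(r1 + r2, v1), add(smul(r1, v1), smul(r2, v1)))      # 7)
    assert eq(smul(r1, add(v1, v2)), add(smul(r1, v1), smul(r1, v2)))  # 8)
    assert eq(smul(r1 * r2, v1), smul(r1, smul(r2, v1)))               # 9)
    assert eq(smul(Fraction(1), v1), v1)                               # 10)
```

The `eq` helper compares the dicts as functions, since intermediate results may carry explicitly stored zero values.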

For each \(s \in S\), there is the function, \(f_s \in F (S, F)\), such that \(f_s (s) = 1\) and \(f_s (s') = 0\) for each \(s' \in S \setminus \{s\}\) (\(f_s = 1 s\) by the aforementioned notation), and \(B := \{f_s \in F (S, F): s \in S\}\) is a basis of \(F (S, F)\): \(B\) spans \(F (S, F)\), because each \(f \in F (S, F)\) with the support, \(\{s_1, ..., s_k\}\), is \(f = f^1 f_{s_1} + ... + f^k f_{s_k}\) where \(f^j := f (s_j)\), which is \(f^1 s_1 + ... + f^k s_k\) by the aforementioned notation; \(B\) is linearly independent, because for each \(c^1 f_{s_1} + ... + c^k f_{s_k} = 0\) with the distinct \(s_1, ..., s_k\), \((c^1 f_{s_1} + ... + c^k f_{s_k}) (s_j) = 0 (s_j) = 0\), but the left hand side is \(c^1 f_{s_1} (s_j) + ... + c^k f_{s_k} (s_j) = c^j\), so, \(c^j = 0\).
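The decomposition \(f = f^1 f_{s_1} + ... + f^k f_{s_k}\) can be sketched in a dict model of \(F (S, F)\); the names, `f_basis`, `smul`, and `add`, are illustrative.

```python
# Sketch: each f in F(S, F) is the sum over its support of f(s) times the
# basis function f_s, where f_s(s) = 1 and f_s is 0 elsewhere.
def f_basis(s):
    """The basis function f_s = 1 s in the dict model."""
    return {s: 1}

def smul(r, f):
    """Scalar multiplication r f: s |-> r f(s)."""
    return {k: r * v for k, v in f.items()}

def add(f1, f2):
    """Pointwise addition on the dict model (absent keys count as 0)."""
    out = dict(f1)
    for k, v in f2.items():
        out[k] = out.get(k, 0) + v
    return out

f = {"x": 3, "y": -2}   # 3 x + (-2) y
recomposed = {}         # build f(x) f_x + f(y) f_y term by term
for s, r in f.items():
    recomposed = add(recomposed, smul(r, f_basis(s)))
assert recomposed == f
```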

We need to be careful when \(S\) is an \(F\) vectors space, \(V\): for each \(v \in V\) and \(r \in F \setminus \{1\}\), \(r f_v = r (1 v) = r (v) \neq (r v) = 1 (r v) = 1 f_{r v} = f_{r v}\), because \(r f_v\) is the function that maps \(v\) to \(r\) while \(f_{r v}\) is the function that maps \(r v\) to \(1\) but maps \(v\) to \(0\) (when \(r v \neq v\)). When \(S\) is just a set, \(r f_s = r s\) is not ambiguous, because as there is no such operation as \(r s\) on \(S\), \(r\) is inevitably operating on \(F (S, F)\), but when \(S = V\), an \(F\) vectors space, \(r v\) is ambiguous: \(r\) may be operating on \(V\), which gives \(1 (r v) = 1 f_{r v}\), or on \(F (V, F)\), which gives \(r (v) = r f_v\).
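The ambiguity can be illustrated in a sketch where \(V = \mathbb{Q}\) is itself taken as the \(F = \mathbb{Q}\) vectors space: \(r . f_v\) scales the value at the point \(v\), while \(f_{r v}\) is the basis function at the scaled point \(r v\), and the 2 are visibly different dicts. The helper names are illustrative.

```python
from fractions import Fraction

# Sketch with V = Q as an F = Q vectors space: r . f_v scales the stored
# value at v, while f_{r v} places the value 1 at the scaled point r v.
def f_basis(v):
    """f_v in the dict model: maps v to 1, everything else to 0."""
    return {v: Fraction(1)}

def smul(r, f):
    """Scalar multiplication in F(V, F)."""
    return {k: r * val for k, val in f.items()}

v = Fraction(3)
r = Fraction(2)
r_fv = smul(r, f_basis(v))   # maps v = 3 to r = 2
f_rv = f_basis(r * v)        # maps r v = 6 to 1; maps v = 3 to 0 (absent key)
assert r_fv == {Fraction(3): Fraction(2)}
assert f_rv == {Fraction(6): Fraction(1)}
assert r_fv != f_rv
```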

We will adopt the notation that for each \(v \in V\), \(v\) denotes \(v \in V\) while \((v)\) denotes \((v) = f_v \in F (V, F)\).


References


<The previous article in this series | The table of contents of this series | The next article in this series>