2025-08-03

1230: For Finite-Dimensional Real or Complex Vector Space, Taking Basis and Sum of Absolute Components for Each Vector Is Norm

<The previous article in this series | The table of contents of this series | The next article in this series>

description/proof of that for finite-dimensional real or complex vector space, taking basis and sum of absolute components for each vector is norm

Topics


About: vector space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any finite-dimensional real or complex vector space and any basis, the map that takes each vector to the sum of the absolute values of its components with respect to the basis is a norm.

Orientation


There is a list of definitions discussed so far on this site.

There is a list of propositions discussed so far on this site.


Main Body


1: Structured Description


Here are the rules of Structured Description.

Entities:
\(F\): \(\in \{\mathbb{R}, \mathbb{C}\}\), with the canonical field structure
\(d\): \(\in \mathbb{N} \setminus \{0\}\)
\(V\): \(\in \{\text{ the } d \text{ -dimensional vector spaces over } F\}\)
\(B\): \(= \{b_1, ..., b_d\}\), \(\in \{\text{ the bases for } V\}\)
\(\Vert \bullet \Vert\): \(: V \to \mathbb{R}, v = v^j b_j \mapsto \sum_{j \in \{1, ..., d\}} \vert v^j \vert\)
//

Statements:
\(\Vert \bullet \Vert \in \{\text{ the norms on } V\}\)
//
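
For a concrete illustration (not part of the description): take \(F = \mathbb{R}\), \(d = 2\), \(V = \mathbb{R}^2\), and \(B\) the standard basis; then \(\Vert (3, -4) \Vert = \vert 3 \vert + \vert -4 \vert = 7\), which is the usual \(\ell^1\) ("taxicab") norm on \(\mathbb{R}^2\). A different choice of basis in general determines a different map, which by this proposition is nevertheless a norm.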


2: Proof


Whole Strategy: Step 1: see that \(\Vert \bullet \Vert\) satisfies the conditions to be a norm.

Step 1:

Let \(v_1, v_2 \in V\) be any and let \(r \in F\) be any.

1) \((0 \le \Vert v_1 \Vert) \land ((0 = \Vert v_1 \Vert) \iff (v_1 = 0))\): \(\Vert v_1 \Vert = \Vert {v_1}^j b_j \Vert = \sum_{j \in \{1, ..., d\}} \vert {v_1}^j \vert\), which is a sum of nonnegative terms, so, \(0 \le \Vert v_1 \Vert\); when \(v_1 = 0\), \({v_1}^j = 0\) for each \(j\), because the components with respect to the basis are unique, so, \(\vert {v_1}^j \vert = 0\) for each \(j\), so, \(\Vert v_1 \Vert = \sum_{j \in \{1, ..., d\}} \vert {v_1}^j \vert = 0\), while when \(\Vert v_1 \Vert = 0\), \(\sum_{j \in \{1, ..., d\}} \vert {v_1}^j \vert = 0\), so, as each term is nonnegative, \(\vert {v_1}^j \vert = 0\) for each \(j\), so, \({v_1}^j = 0\) for each \(j\), so, \(v_1 = 0\).
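
For a concrete instance, in the \(\mathbb{R}^2\) illustration above with the standard basis, \(\Vert (3, -4) \Vert = \vert 3 \vert + \vert -4 \vert = 7 \neq 0\) while \(\Vert (0, 0) \Vert = 0 + 0 = 0\), and \((0, 0)\) is the only vector for which the sum vanishes.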

2) \(\Vert r v_1 \Vert = \vert r \vert \Vert v_1 \Vert\): \(\Vert r v_1 \Vert = \Vert r {v_1}^j b_j \Vert = \sum_{j \in \{1, ..., d\}} \vert r {v_1}^j \vert = \sum_{j \in \{1, ..., d\}} \vert r \vert \vert {v_1}^j \vert = \vert r \vert \sum_{j \in \{1, ..., d\}} \vert {v_1}^j \vert = \vert r \vert \Vert v_1 \Vert\), where the 3rd equality holds because the absolute value on \(F\) is multiplicative.
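
For a concrete instance, in the same illustration, \(\Vert -2 (3, -4) \Vert = \Vert (-6, 8) \Vert = \vert -6 \vert + \vert 8 \vert = 14 = \vert -2 \vert \cdot 7 = \vert -2 \vert \Vert (3, -4) \Vert\).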

3) \(\Vert v_1 + v_2 \Vert \le \Vert v_1 \Vert + \Vert v_2 \Vert\): \(\Vert v_1 + v_2 \Vert = \Vert {v_1}^j b_j + {v_2}^j b_j \Vert = \Vert ({v_1}^j + {v_2}^j) b_j \Vert = \sum_{j \in \{1, ..., d\}} \vert {v_1}^j + {v_2}^j \vert \le \sum_{j \in \{1, ..., d\}} (\vert {v_1}^j \vert + \vert {v_2}^j \vert) = \sum_{j \in \{1, ..., d\}} \vert {v_1}^j \vert + \sum_{j \in \{1, ..., d\}} \vert {v_2}^j \vert = \Vert v_1 \Vert + \Vert v_2 \Vert\), where the inequality holds by the triangle inequality for the absolute value on \(F\), applied to each \(j\).
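
For a concrete instance, in the same illustration, with \(v_1 = (3, -4)\) and \(v_2 = (-1, 4)\), \(\Vert v_1 + v_2 \Vert = \Vert (2, 0) \Vert = 2 \le 7 + 5 = \Vert v_1 \Vert + \Vert v_2 \Vert\).

As a supplementary numerical sanity check (not part of the proof), here is a minimal sketch assuming \(F = \mathbb{C}\), \(d = 2\), and one particular basis of \(\mathbb{C}^2\); the chosen basis, the random test vectors, and the helper name basis_sum_norm are illustrative choices of this sketch, not anything fixed by the article. The components of a vector with respect to the chosen basis are recovered by solving the linear system whose coefficient matrix has the basis vectors as its columns, and the absolute values of the components are then summed.

```python
import numpy as np

rng = np.random.default_rng(0)

# The basis b_1, b_2 of C^2, placed as the columns of an invertible matrix.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]], dtype=complex)

def basis_sum_norm(v: np.ndarray) -> float:
    """||v|| = sum_j |v^j|, where the components (v^1, ..., v^d) solve B @ components = v."""
    components = np.linalg.solve(B, v)
    return float(np.sum(np.abs(components)))

# Condition 1): ||0|| = 0.
assert basis_sum_norm(np.zeros(2, dtype=complex)) == 0.0

for _ in range(1000):
    v1 = rng.normal(size=2) + 1j * rng.normal(size=2)
    v2 = rng.normal(size=2) + 1j * rng.normal(size=2)
    r = complex(rng.normal(), rng.normal())
    n1, n2 = basis_sum_norm(v1), basis_sum_norm(v2)
    assert n1 > 0.0                                         # condition 1) for v1 != 0
    assert np.isclose(basis_sum_norm(r * v1), abs(r) * n1)  # condition 2)
    assert basis_sum_norm(v1 + v2) <= n1 + n2 + 1e-9        # condition 3), up to rounding
```

The tolerances are there only because the components of \(r v_1\) and \(v_1 + v_2\) are computed in floating point; the proof above is exact.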


References


<The previous article in this series | The table of contents of this series | The next article in this series>