description/proof of that square of Euclidean norm of \(\mathbb{R}^d\) vector is equal to or smaller than positive definite real quadratic form divided by smallest eigenvalue
Topics
About: vectors space
The table of contents of this article
- Starting Context
- Target Context
- Orientation
- Main Body
- 1: Structured Description
- 2: Natural Language Description
- 3: Proof
Starting Context
- The reader knows a definition of \(\mathbb{R}\) vectors space.
- The reader knows a definition of positive definite quadratic form.
- The reader knows a definition of Euclidean-norm on Euclidean vectors space.
- The reader knows a definition of eigenvalue of matrix.
- The reader knows a definition of orthonormal matrix.
- The reader admits the proposition that any real symmetric matrix can be diagonalized by an orthonormal matrix.
- The reader admits the proposition that the transposition of any orthonormal matrix is the inverse of the original matrix.
Target Context
- The reader will have a description and a proof of the proposition that for the \(\mathbb{R}^d\) Euclidean vectors space, the square of the Euclidean norm of any vector is equal to or smaller than the value of any positive definite real quadratic form at the vector divided by the smallest eigenvalue of the matrix of the quadratic form.
Orientation
There is a list of definitions discussed so far in this site.
There is a list of propositions discussed so far in this site.
Main Body
1: Structured Description
Here are the rules of Structured Description.
Entities:
\(\mathbb{R}^d\): with the Euclidean vectors space structure and the Euclidean norm, \(\Vert v \Vert = \sqrt{{v^1}^2 + {v^2}^2 + ... + {v^d}^2}\)
\(M\): \(\in \{\text{ the real symmetric } d \times d \text{ matrices }\}\)
\(f\): \(\in \{\text{ the positive definite real quadratic forms over } \mathbb{R}^d\}\), \(: \mathbb{R}^d \to \mathbb{R}\), \(v \mapsto v^t M v\)
\(v\): \(\in \mathbb{R}^d\)
\(\lambda_m\): \(= \text{ the smallest eigenvalue of } M\), inevitably \(0 \lt \lambda_m\)
//
Statements:
\(\Vert v \Vert^2 \leq f (v) / \lambda_m\).
//
2: Natural Language Description
For the \(\mathbb{R}^d\) Euclidean vectors space with the Euclidean norm, \(\Vert v \Vert = \sqrt{{v^1}^2 + {v^2}^2 + ... + {v^d}^2}\), and any positive definite real quadratic form, \(f (v) = v^t M v\), where \(M\) is a real symmetric \(d \times d\) matrix, \(\Vert v \Vert^2 \leq f (v) / \lambda_m\) holds for any vector \(v\), where \(\lambda_m\) (inevitably \(0 \lt \lambda_m\)) is the smallest eigenvalue of \(M\).
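The proposition can be checked numerically as a sanity test (a sketch, not part of the formal development on this site; it assumes NumPy, and the construction \(M = A^t A + I\) is just one convenient way to obtain a positive definite real symmetric matrix):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random real symmetric positive definite matrix M = A^t A + I.
d = 5
A = rng.standard_normal((d, d))
M = A.T @ A + np.eye(d)

# f(v) = v^t M v, the positive definite real quadratic form.
def f(v):
    return v @ M @ v

# The smallest eigenvalue of M; positive because M is positive definite.
lambda_m = np.linalg.eigvalsh(M).min()
assert lambda_m > 0

# Check ||v||^2 <= f(v) / lambda_m for random vectors v.
for _ in range(100):
    v = rng.standard_normal(d)
    assert v @ v <= f(v) / lambda_m + 1e-9
```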
3: Proof
There is an orthonormal matrix, \(M'\), such that \(M'^{-1} M M' = [\lambda_1, ..., \lambda_d]\), where \([\lambda_1, ..., \lambda_d]\) denotes the diagonal matrix whose diagonal entries are the eigenvalues \(\lambda_i\) of \(M\), by the proposition that any real symmetric matrix can be diagonalized by an orthonormal matrix. Let us define \(v' := M'^{-1} v\); so, \(v = M' v'\). Then, \(f (v) = v^t M v = (M' v')^t M (M' v') = v'^{t} M'^{t} M M' v' = v'^{t} M'^{-1} M M' v' = v'^{t} [\lambda_1, ..., \lambda_d] v' = \lambda_1 {v'^1}^2 + \lambda_2 {v'^2}^2 + ... + \lambda_d {v'^d}^2\), by the proposition that the transposition of any orthonormal matrix is the inverse of the original matrix.
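This diagonalization step can be illustrated numerically (a sketch assuming NumPy, whose `numpy.linalg.eigh` returns the eigenvalues of a real symmetric matrix together with an orthonormal matrix whose columns are eigenvectors):

```python
import numpy as np

rng = np.random.default_rng(1)

# A random real symmetric matrix M.
d = 4
B = rng.standard_normal((d, d))
M = B + B.T

# eigh returns the eigenvalues and an orthonormal matrix M' of eigenvectors.
eigenvalues, M_prime = np.linalg.eigh(M)

# The transposition of the orthonormal matrix M' is its inverse.
assert np.allclose(M_prime.T @ M_prime, np.eye(d))

# M'^{-1} M M' equals the diagonal matrix of the eigenvalues.
D = M_prime.T @ M @ M_prime
assert np.allclose(D, np.diag(eigenvalues))
```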
Let us denote the smallest eigenvalue as \(\lambda_m\), which inevitably satisfies \(0 \lt \lambda_m\), because \(M\) is positive definite. As \(\lambda_m \leq \lambda_i\) for each \(i\), \(\lambda_m \Vert v' \Vert^2 = \lambda_m ({v'^1}^2 + {v'^2}^2 + ... + {v'^d}^2) \leq \lambda_1 {v'^1}^2 + \lambda_2 {v'^2}^2 + ... + \lambda_d {v'^d}^2 = f (v)\). As \(M'\) is an orthonormal matrix, \(\Vert v' \Vert = \Vert M'^{-1} v \Vert = \Vert v \Vert\). So, \(\Vert v \Vert^2 \leq f (v) / \lambda_m\).