2023-10-29

399: For 'Independent Variable'-Value Pairs Data, Choosing Origin-Passing Approximating Line with Least Value Difference Squares Sum Equals Projecting Values Vector to Independent Variables Vector Line

<The previous article in this series | The table of contents of this series | The next article in this series>

A description/proof of that for 'independent variable'-value pairs data, choosing origin-passing approximating line with least value difference squares sum equals projecting values vector to independent variables vector line

Topics


About: vectors space

The table of contents of this article


Starting Context



Target Context


  • The reader will have a description and a proof of the proposition that for any 'independent variable'-value pairs data, choosing the origin-passing approximating line with the least value difference squares sum equals projecting the values vector to the independent variables vector line in the \(\mathbb{R}^n\) space.

Orientation


There is a list of definitions discussed so far in this site.

There is a list of propositions discussed so far in this site.


Main Body


1: Description


For any 'independent variable'-value pairs data, \((x_1, y_1), (x_2, y_2), ..., (x_n, y_n)\), choosing the origin-passing approximating line, \(y = s x\), such that the value difference squares sum, \(S := \sum_{i} (y_i - s x_i)^2\), is the least equals projecting the values vector, \((y_1, y_2, ..., y_n)\), to the independent variables vector line, \(t (x_1, x_2, ..., x_n)\), in the \(\mathbb{R}^n\) space, which means that the projection equals \(s (x_1, x_2, ..., x_n)\).
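As an illustration of the claim (not part of the proof), here is a minimal Python sketch, assuming NumPy and made-up sample data: it finds the least-squares slope of the origin-passing line numerically and compares it with the coefficient obtained by projecting the values vector onto the independent variables vector line.

```python
import numpy as np

# Hypothetical sample data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# Slope of the origin-passing line minimizing S = sum_i (y_i - s x_i)^2,
# via least squares with the single column x (no intercept column).
s_least_squares = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)[0][0]

# Coefficient t such that t * (x_1, ..., x_n) is the orthogonal projection
# of (y_1, ..., y_n) onto the line t * (x_1, ..., x_n).
t_projection = np.dot(y, x) / np.dot(x, x)

print(s_least_squares, t_projection)  # the two agree up to floating-point error
```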


2: Proof


Let us determine \(s\). \(S\) is the least where \(\frac{d S}{d s} = 0\), because \(\frac{d^2 S}{d s^2} = 2 \sum_{i} x_i^2 > 0\) (supposing that not all of \(x_1, x_2, ..., x_n\) are \(0\)). \(\frac{d S}{d s} = \sum_{i} (- 2 (y_i - s x_i) x_i) = 0\). \(\sum_{i} (y_i x_i) = s \sum_{i} x_i^2\). \(s = (\sum_{i} (y_i x_i)) (\sum_{i} x_i^2)^{-1}\).
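As a check of this closed form (a sketch with made-up data, not part of the proof), one can evaluate \(s = (\sum_{i} (y_i x_i)) (\sum_{i} x_i^2)^{-1}\) directly and confirm that perturbing \(s\) in either direction does not decrease \(S\).

```python
import numpy as np

# Hypothetical sample data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

def S(s: float) -> float:
    """Value difference squares sum for the origin-passing line y = s x."""
    return float(np.sum((y - s * x) ** 2))

# Closed form derived above: s = (sum_i y_i x_i) / (sum_i x_i^2).
s = np.sum(y * x) / np.sum(x ** 2)

# S does not decrease when s is perturbed in either direction.
eps = 1e-3
assert S(s) <= S(s + eps) and S(s) <= S(s - eps)
```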

Let us take the projection. The inner product of \((x_1, x_2, ..., x_n)\) and \((y_1, y_2, ..., y_n)\) is \(\sum_{i} (y_i x_i) = (\sum_{i} x_i^2)^{1/2} (\sum_{i} y_i^2)^{1/2} \cos \theta\) where \(\theta\) is the angle between the vectors, and the length of the projection is \(l = (\sum_{i} y_i^2)^{1/2} \cos \theta = (\sum_{i} y_i^2)^{1/2} \sum_{i} (y_i x_i) ((\sum_{i} x_i^2)^{1/2})^{-1} ((\sum_{i} y_i^2)^{1/2})^{-1} = \sum_{i} (y_i x_i) ((\sum_{i} x_i^2)^{1/2})^{-1}\).
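The following sketch (again with made-up data, not part of the proof) recomputes the projection length \(l\) both from the angle and from the simplified expression above, confirming numerically that the \((\sum_{i} y_i^2)^{1/2}\) factors cancel as claimed.

```python
import numpy as np

# Hypothetical sample data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

# cos(theta) from the inner product and the vector lengths.
cos_theta = np.dot(y, x) / (np.linalg.norm(x) * np.linalg.norm(y))

# l via the angle, and l via the simplified expression sum_i (y_i x_i) / |x|.
l_from_angle = np.linalg.norm(y) * cos_theta
l_simplified = np.dot(y, x) / np.linalg.norm(x)

assert np.isclose(l_from_angle, l_simplified)
```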

The projection is \(l ((\sum_{i} x_i^2)^{1/2})^{-1} (x_1, x_2, ..., x_n) = (\sum_{i} (y_i x_i)) ((\sum_{i} x_i^2)^{1/2})^{-1} ((\sum_{i} x_i^2)^{1/2})^{-1} (x_1, x_2, ..., x_n) = (\sum_{i} (y_i x_i)) (\sum_{i} x_i^2)^{-1} (x_1, x_2, ..., x_n) = s (x_1, x_2, ..., x_n)\).

So, the projection equals \(s (x_1, x_2, ..., x_n)\), which is what this proposition means.
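To close the loop numerically (a sketch with made-up data, not part of the proof), the projection vector can be computed as above and compared with \(s (x_1, x_2, ..., x_n)\); the condition \(\sum_{i} (y_i - s x_i) x_i = 0\) from the derivative step is also checked, as it says that the residual vector is orthogonal to the independent variables vector.

```python
import numpy as np

# Hypothetical sample data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

s = np.sum(y * x) / np.sum(x ** 2)

# Projection of y onto the line t * x, computed as in the proof.
projection = (np.dot(y, x) / np.dot(x, x)) * x

assert np.allclose(projection, s * x)
# dS/ds = 0 condition: the residual y - s x is orthogonal to x.
assert np.isclose(np.dot(y - s * x, x), 0.0)
```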


References


<The previous article in this series | The table of contents of this series | The next article in this series>