Newton's Method Example

Let's find the two coefficients of function (1) by curve fitting with Newton's method.

\(\large{y=ax^2e^{-bx}} \qquad (1)\)

Making Data Series

\(\large{a=} \)
\(\large{b=} \)
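The data series above can be generated programmatically. A minimal sketch in Python, assuming illustrative values a=2.0, b=1.5 and n=20 points (not the page's defaults):

```python
import numpy as np

# Hypothetical sketch: build a data series from y = a*x^2*exp(-b*x).
# a_true, b_true, n, and the x range are illustrative assumptions.
a_true, b_true = 2.0, 1.5
n = 20
x = np.linspace(0.1, 5.0, n)               # sample points x_i
y = a_true * x**2 * np.exp(-b_true * x)    # noise-free data series y_i
```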

Initial Value Setup (Coefficients and Step Sizes for Numerical Differentiation)

\(\large{a^{(0)}=} \)
\(\large{b^{(0)}=} \)
\(\large{\varDelta a=} \)
\(\large{\varDelta b=} \)

Execute Newton's Method

\(\large{a^{(k)}=} \)
\(\large{b^{(k)}=} \)
\(\large{(k=} \)
\(\large{)} \)


Explanation

This program minimizes the residual sum of squares, which is defined by (2).

\(\large{r\left(a,b \right)=\frac{1}{2}\displaystyle \sum_{i=1}^{n} \left(y_i - a^{(k)}x_i^2e^{-b^{(k)}x_i} \right)^2 } \qquad (2)\)
\(\small{y_i}\): Data series \(\small{\qquad n}\): Number of data points
\(\small{a^{(k)}, b^{(k)}}\): values of \(\small{a, b}\) at the \(\small{k}\)-th iteration
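The residual sum of squares (2) translates directly into code. A minimal sketch, assuming x and y are NumPy arrays holding the data series:

```python
import numpy as np

# Minimal sketch of (2): residual sum of squares for y = a*x^2*exp(-b*x).
def residual_sum(a, b, x, y):
    g = x**2 * np.exp(-b * x)          # model basis g(x), cf. (4)
    return 0.5 * np.sum((y - a * g)**2)
```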

The iterative update of the two-variable Newton's method is (3). At each iteration, \(\varDelta X \) is solved for. For details,
please see "Newton's method" on this site.

\(\large{ \begin{bmatrix} \frac{\partial^2 r}{\partial a^2} & \frac{\partial^2 r}{\partial a \partial b} \\ \frac{\partial^2 r}{\partial b \partial a} & \frac{\partial^2 r}{\partial b^2} \end{bmatrix} \varDelta X= - \begin{bmatrix} \frac{\partial r}{\partial a} \\ \frac{\partial r}{\partial b} \end{bmatrix} \qquad (3) }\)
\(\large{ \varDelta X= \begin{bmatrix} a^{(k+1)}-a^{(k)} \\ b^{(k+1)}-b^{(k)} \end{bmatrix} }\)
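Given the gradient and Hessian of r, one Newton update per (3) is a 2x2 linear solve. A sketch (H and grad are assumed to be computed elsewhere, e.g. from (4)-(9)):

```python
import numpy as np

# One Newton step of (3): solve H @ dX = -grad, then update (a, b).
# dX = [a^(k+1)-a^(k), b^(k+1)-b^(k)].
def newton_step(a, b, H, grad):
    dX = np.linalg.solve(H, -grad)
    return a + dX[0], b + dX[1]
```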


For the numerical-differentiation route, the partial derivatives are approximated from finite differences of (2), using the step
sizes \(\varDelta a\) and \(\varDelta b\). The analytical route, by contrast, needs no step sizes: it computes the partial derivatives
directly with (4)–(9).
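The finite-difference route can be sketched as follows, here only for the gradient (the second derivatives follow the same pattern). r is any callable r(a, b), such as (2); da and db play the role of Δa and Δb:

```python
import numpy as np

# Central-difference sketch of the numerical route: approximate the
# gradient of r(a, b) using step sizes da and db.
def num_gradient(r, a, b, da, db):
    dr_da = (r(a + da, b) - r(a - da, b)) / (2 * da)
    dr_db = (r(a, b + db) - r(a, b - db)) / (2 * db)
    return np.array([dr_da, dr_db])
```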

\(\large{g\left(x\right)=x^2 e^{-bx} } \qquad \left(x=x_i\right) \qquad(4)\)
\(\large{\frac{\partial r}{\partial a}=\displaystyle \sum_{i=1}^{n} -g\left(y_i - ag \right) } \qquad (5)\)
\(\large{\frac{\partial r}{\partial b}=\displaystyle \sum_{i=1}^{n} axg\left(y_i - ag \right) } \qquad (6)\)
\(\large{\frac{\partial^2 r}{\partial a^2}=\displaystyle \sum_{i=1}^{n} g^2 } \qquad (7)\)
\(\large{\frac{\partial^2 r}{\partial a \partial b} = \frac{\partial^2 r}{\partial b \partial a}=\displaystyle \sum_{i=1}^{n} xg\left(y_i - 2ag \right) } \qquad (8)\)
\(\large{\frac{\partial^2 r}{\partial b^2}=\displaystyle \sum_{i=1}^{n} -ax^2g \left(y_i - 2ag \right) } \qquad (9)\)
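Putting (4)-(9) together, the analytical route runs Newton iterations with exact derivatives. A sketch, assuming x and y are the data arrays and a fixed iteration count of 20 (an illustrative choice; a convergence test on \(\varDelta X\) would also work):

```python
import numpy as np

# Sketch of the analytical route: Newton iterations for (a, b) using the
# exact derivatives (5)-(9) of the residual sum of squares (2).
def newton_fit(x, y, a0, b0, iters=20):
    a, b = a0, b0
    for _ in range(iters):
        g = x**2 * np.exp(-b * x)                             # (4)
        res = y - a * g                                       # y_i - a*g
        grad = np.array([np.sum(-g * res),                    # (5)
                         np.sum(a * x * g * res)])            # (6)
        h_ab = np.sum(x * g * (y - 2 * a * g))                # (8)
        H = np.array([[np.sum(g**2), h_ab],                   # (7), (8)
                      [h_ab, np.sum(-a * x**2 * g * (y - 2 * a * g))]])  # (9)
        dX = np.linalg.solve(H, -grad)                        # solve (3)
        a, b = a + dX[0], b + dX[1]
    return a, b
```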

If the initial values are far from the solution, the iteration may fail to converge. In
that case, initial values obtained from a least-squares fit can be used.
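One common way to obtain such least-squares initial values (an assumption here; the page does not specify which linearization it uses) is to note that for positive data, \( \ln y = \ln a + 2\ln x - bx \), so fitting \( \ln(y/x^2) = \ln a - bx \) by ordinary linear least squares yields starting estimates:

```python
import numpy as np

# Hedged sketch: initial (a, b) from a linearized least-squares fit.
# Valid only for strictly positive x and y.
def initial_values(x, y):
    z = np.log(y / x**2)                         # ln(y/x^2) = ln(a) - b*x
    A = np.column_stack([np.ones_like(x), -x])   # design matrix [1, -x]
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    ln_a, b = coef
    return np.exp(ln_a), b
```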