# Newton’s Method¶

## Root-Finding Problems¶

Given a scalar equation \(f(x)=0\), where \(f:[a,b] \rightarrow {\mathbb R}\) is a continuous function, we want to compute its approximate solutions (i.e. roots of the function \(f\)).

## Iterative Methods¶

**Definition.** An iterative method is a mathematical procedure that
generates a sequence of improving approximate solutions to a problem,
starting from an initial approximation, often called an initial guess.

**Convergence:** Let \(r\) be the exact solution to a given root-finding
problem, i.e. \(f(r)=0\). Let \(x_0\) be the initial guess
and consider a sequence \(x_1, x_2, \dotsc, x_n, x_{n+1}, \dotsc\)
of improving approximate solutions generated by an iterative method. Consider the absolute
approximation error \(\varepsilon_{n} =
|r-x_n|\). The iterative method is said to be convergent for the
given initial guess, if its
corresponding sequence converges to \(r\), that is, if \(\lim_{n
\rightarrow \infty} \varepsilon_n =0\).

**Linear convergence:** An iterative method is said to be linearly
convergent if \(\lim_{n
\rightarrow \infty} \frac{\varepsilon_{n+1}}{\varepsilon_{n}} = S\)
for some constant \(0 < S < 1\).

**Quadratic convergence:** An iterative method is said to be quadratically
convergent if \(\lim_{n
\rightarrow \infty} \frac{\varepsilon_{n+1}}{\varepsilon_{n}^2} = M\)
for some finite constant \(M > 0\).

**Stopping criterion I:** A stopping criterion tells us when to terminate an iterative
algorithm. Given a small error tolerance \(TOL\), a common stopping
criterion is \(\varepsilon_n =
|r-x_n| \le TOL\). This will determine the number of iterations
\(n\) necessary to obtain an approximate solution \(x_n\) within the error tolerance. Note that in order to use this criterion we will need to know the exact root \(r\), which is often not available. The following criteria are hence more practical.

**Stopping criterion II:** In most practical root-finding problems we do not have
access to the exact root \(r\). In such cases we may consider the
quantity \((E_{\rm abs})_{n} = |x_{n+1}-x_n|\), which
approximates the absolute error \(\varepsilon_{n} =
|r-x_{n}|\). The corresponding stopping criterion is then \((E_{\rm abs})_{n} =
|x_{n+1}-x_n| \le TOL\). This is an acceptable criterion, since if
the sequence \(x_n\) converges, the difference between successive iterates must also tend to zero.

**Stopping criterion III:** When the iterates \(x_n\) are very small (or very large)
in magnitude, an absolute tolerance is not scale-appropriate, and we may instead use the approximate relative error in the stopping criterion: \((E_{\rm rel})_{n} =
|x_{n+1}-x_n| / |x_{n+1}| \le TOL\).

**Remark.** In addition to a stopping criterion, we also need to set
a limit on the maximum number of iterations, in case convergence fails.
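The stopping logic above can be combined into a single driver loop. The following is a minimal sketch, not a library routine; the function name `iterate` and the example fixed-point iteration \(x_{n+1} = \cos(x_n)\) are illustrative choices.

```python
import math

def iterate(step, x0, tol=1e-10, max_iter=100):
    """Iterate x_{n+1} = step(x_n) until the approximate relative
    error |x_{n+1} - x_n| / |x_{n+1}| falls below tol (criterion III),
    or until max_iter iterations have been performed."""
    x = x0
    for n in range(max_iter):
        x_new = step(x)
        if abs(x_new - x) <= tol * abs(x_new):  # stopping criterion III
            return x_new, n + 1
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")

# Example: the fixed-point iteration x_{n+1} = cos(x_n), which
# converges to the solution of cos(x) = x (about 0.739085).
root, n_steps = iterate(math.cos, x0=1.0)
```

Here the maximum-iteration cap turns a potentially infinite loop into a clean failure, as the remark above recommends.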

## Newton’s Method¶

Newton’s method is a simple iterative method for finding roots of functions. The basic idea behind the method is to approximate the function with the tangent line and then approximate the root of the function by the root of the tangent line.

**Algorithm:** Let \(x_0\) be an initial guess for a root of \(f\). For
\(n=0,1, 2, \dotsc\) iteratively compute \(x_{n+1}\) in terms of the just computed \(x_n\) by: \(x_{n+1} = x_n -f(x_n)/f'(x_n)\). Then for most functions and reasonable initial guesses, the sequence converges to a root of \(f\).
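A minimal Python sketch of this iteration (the function name `newton` and the use of stopping criterion II with an absolute tolerance are illustrative choices):

```python
def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        x_new = x - f(x) / fprime(x)
        if abs(x_new - x) <= tol:  # stopping criterion II (absolute)
            return x_new
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")

# Example: a root of f(x) = x^2 - 2, i.e. sqrt(2), from the guess x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```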

**Convergence:** Let \(f\) be a twice differentiable function with
a root \(r\), i.e. \(f(r)=0\). If \(f'(r) \neq 0\), then
for a reasonable initial guess (one that is sufficiently close to \(r\))
Newton’s method is quadratically converging to \(r\), with \(\lim_{n
\rightarrow \infty} \frac{\varepsilon_{n+1}}{\varepsilon_{n}^2} =
M=\frac{|f''(r)|}{2\,|f'(r)|} < \infty\).
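This behaviour can be observed numerically. The sketch below runs Newton’s method on \(f(x) = x^2 - 2\), whose root is \(r = \sqrt{2}\), and computes the ratios \(\varepsilon_{n+1}/\varepsilon_{n}^2\); for this particular \(f\) the limit is \(M = |f''(r)|/(2\,|f'(r)|) = 2/(4\sqrt{2}) = 1/(2\sqrt{2}) \approx 0.354\).

```python
import math

# Newton's method on f(x) = x^2 - 2 with exact root r = sqrt(2).
f = lambda x: x * x - 2
fprime = lambda x: 2 * x
r = math.sqrt(2)

x = 1.0
errors = [abs(r - x)]
for _ in range(4):
    x = x - f(x) / fprime(x)
    errors.append(abs(r - x))

# The ratios eps_{n+1} / eps_n^2 approach
# M = |f''(r)| / (2 |f'(r)|) = 1 / (2 sqrt(2)) ≈ 0.3536.
ratios = [errors[n + 1] / errors[n] ** 2 for n in range(len(errors) - 1)]
```

Only a few iterations are taken, since the errors shrink so fast that further ratios would be swamped by floating-point roundoff.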

**Remark.** The convergence of Newton’s method will be quadratic if:

- The initial guess is not too far from the root \(r\), or equivalently the initial error \(\varepsilon_0 = |r - x_0|\) is not too large.
- The derivative of \(f\) at \(x_n\) (which appears in the denominator) is not too close to \(0\). This means that the function \(f\) is not too flat at \(x_n\), otherwise the tangent line would be almost parallel to the \(x\)-axis and would intersect it far away. In particular we need \(f'(r) \neq 0\).
- The second derivative of \(f\) is not too big in absolute value near the root. A large second derivative means the function is very concave or very convex and hence turns away from the tangent line quickly.

## Modified Newton’s method¶

If \(r\) is a multiple root of the function \(f\) with multiplicity \(m\ge 2\), that is, if \(f(r) = f'(r) = \dotsc = f^{(m-1)}(r) = 0\) but \(f^{(m)}(r) \neq 0\), then Newton’s method will not converge quadratically. In this case Newton’s method will be linearly convergent, with \(\lim_{n \rightarrow \infty} \frac{\varepsilon_{n+1}}{\varepsilon_{n}} = S = \frac{m-1}{m} < 1\).

If the multiplicity \(m\) of the root is known in advance, it is possible to modify Newton’s method and recover quadratic convergence. In this case the modified Newton’s method reads \(x_{n+1} = x_n - m f(x_n)/f'(x_n)\).
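A small experiment illustrating the difference. The double root \(f(x) = (e^x - 1)^2\) (with \(m = 2\)) and the helper `run` with its `step_scale` parameter are illustrative choices, not part of any standard library.

```python
import math

# Double root at r = 0: f(x) = (exp(x) - 1)^2 has multiplicity m = 2.
f = lambda x: (math.exp(x) - 1) ** 2
fprime = lambda x: 2 * (math.exp(x) - 1) * math.exp(x)

def run(step_scale, x0, n_steps):
    """Iterate x_{n+1} = x_n - step_scale * f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(n_steps):
        x = x - step_scale * f(x) / fprime(x)
    return x

plain = run(1, x0=1.0, n_steps=20)    # plain Newton: error ratio ~ (m-1)/m = 1/2
modified = run(2, x0=1.0, n_steps=5)  # modified Newton (m = 2): quadratic
```

With twenty linear steps plain Newton still carries an error of roughly \(2^{-20}\) of the initial one, while five quadratic steps of the modified method drive the error far below that.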