Fixed-point Methods cont.
Approximating a solution \(p\) of the equation \(g(x) = x\) by iterating the function \(g\).
Start by choosing \(p_0\). Then, for \(n \ge 1\), define
\[p_n = g(p_{n-1})\]
If \(g\) is a continuous contraction on an interval \([a,b]\), and if you choose \(p_0 \in [a,b]\), then \(p_n\) will converge to \(p\).
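The iteration above can be sketched in a few lines of Python. The function name, stopping tolerance, and the example \(g(x) = \cos x\) (a contraction on \([0,1]\), since \(|g'(x)| = |\sin x| \le \sin 1 < 1\) there) are illustrative choices, not part of the notes:

```python
import math

def fixed_point_iteration(g, p0, tol=1e-10, max_iter=200):
    """Iterate p_n = g(p_{n-1}) until successive iterates agree to within tol."""
    p_prev = p0
    for _ in range(max_iter):
        p = g(p_prev)
        if abs(p - p_prev) < tol:
            return p
        p_prev = p
    raise RuntimeError("fixed-point iteration did not converge")

# g(x) = cos(x) is a contraction on [0, 1], so with p0 = 0.5 in [0, 1]
# the iterates converge to the unique fixed point p ≈ 0.739085.
p = fixed_point_iteration(math.cos, 0.5)
```

Note that the loop stops on the distance between successive iterates, a practical stand-in for the true error \(|p_n - p|\), which is not computable when \(p\) is unknown.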
Suppose \(g\) is a continuous contraction on \([a,b]\) with contraction constant \(0 \le K < 1\), and \(p_0 \in [a,b]\). Then for all \(n > 0\):

1. \(|p_n - p| \le K^n \max\{\,p_0 - a,\; b - p_0\,\}\)
2. \(|p_n - p| \le \dfrac{K^n}{1-K}\,|p_1 - p_0|\)

Inequality 1 means the convergence is (at least) linear: the error shrinks by a factor of at most \(K\) at each step.
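A quick numerical check of linear convergence, again using the illustrative example \(g(x) = \cos x\) (my own choice): for linear convergence the error ratios \(|p_{n+1} - p| / |p_n - p|\) should approach \(|g'(p)| = \sin p \approx 0.6736\).

```python
import math

p_star = 0.7390851332151607  # fixed point of cos, to machine precision
p = 0.5
ratios = []
prev_err = abs(p - p_star)
for _ in range(30):
    p = math.cos(p)
    err = abs(p - p_star)
    ratios.append(err / prev_err)  # |p_{n+1} - p| / |p_n - p|
    prev_err = err
# The ratios settle near |g'(p_star)| = sin(p_star) ≈ 0.6736,
# the asymptotic rate of linear convergence.
```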
Newton's method is a fixed-point method: applied to the equation \(f(x) = 0\), it iterates \(g(x) = x - \dfrac{f(x)}{f'(x)}\).
Suppose \(g \in \mathcal{C}^\alpha(I)\) for some interval \(I\) and some \(\alpha \ge 2\). Suppose \(p \in I\) is a solution of the equation \(g(x) = x\), and \[g'(p) = g''(p) = \cdots = g^{(\alpha - 1)}(p) = 0\] and \(g^{(\alpha)}(p) \neq 0\).
Then, if the initial guess \(p_0\) is close enough to \(p\), then the sequence defined recursively by \(p_n = g(p_{n-1})\) for \(n \ge 1\) converges to \(p\) with order of convergence \(\alpha\), and \[\lim_{n\to \infty} \frac{p_{n+1} - p}{(p_n - p)^\alpha} = \frac{g^{(\alpha)}(p)}{\alpha!}.\]
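To see this theorem at work with \(\alpha = 2\), take Newton's method for the illustrative problem \(f(x) = x^2 - 2\) (my own example): the iteration function is \(g(x) = x/2 + 1/x\), and at \(p = \sqrt{2}\) one checks \(g'(p) = 0\) while \(g''(p) = 2/p^3 = 1/\sqrt{2} \neq 0\). The theorem then predicts quadratic convergence with limit \(g''(p)/2! = 1/(2\sqrt{2}) \approx 0.35355\).

```python
import math

# Newton's method for f(x) = x^2 - 2, written as the fixed-point iteration
# g(x) = x - f(x)/f'(x) = x/2 + 1/x.  Since g'(sqrt(2)) = 0 and
# g''(sqrt(2)) = 1/sqrt(2) != 0, the theorem gives order alpha = 2.
g = lambda x: x / 2 + 1 / x
p_star = math.sqrt(2)

p = 1.5
for _ in range(3):
    p_next = g(p)
    # Quadratic-convergence ratio (p_{n+1} - p) / (p_n - p)^2
    ratio = (p_next - p_star) / (p - p_star) ** 2
    p = p_next
# ratio approaches g''(p)/2! = 1/(2*sqrt(2)) ≈ 0.35355
```

Only a few steps are taken because the error squares each iteration and quickly reaches the limits of double precision.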