Section 9.5 The relaxation parameter
When faced with an iterative method which moves either too slowly, undershooting the target (Figure 9.2.1), or too quickly, overshooting it (Figure 9.2.2), we can try to fix the issue by introducing a relaxation parameter. This is a fixed positive number \(\omega \gt 0\) by which we multiply the step size \(h\) obtained from Newton's method (or the secant method, etc.). That is, instead of iterating the function
\begin{equation*}
g(x) = x - \frac{f(x)}{f'(x)}\text{,}
\end{equation*}
we iterate the function
\begin{equation*}
g(x) = x - \omega\,\frac{f(x)}{f'(x)}\text{.}
\end{equation*}
If \(\omega \gt 1\text{,}\) this is over-relaxation, meant to make the process move faster. If \(\omega \lt 1\text{,}\) this is under-relaxation, meant to slow the process down so that it does not overshoot and run off in the wrong direction.
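As a concrete illustration, here is a minimal Python sketch of the relaxed iteration (the name relaxed_newton and the tolerance and iteration-cap parameters are illustrative choices, not taken from a particular library):

```python
def relaxed_newton(f, fprime, x0, omega=1.0, tol=1e-12, max_iter=100):
    """Iterate g(x) = x - omega * f(x)/f'(x) starting from x0.

    omega = 1 is plain Newton; omega > 1 over-relaxes,
    omega < 1 under-relaxes.  Returns (approximate root, iterations used).
    """
    x = x0
    for k in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:      # residual is small enough: accept x as the root
            return x, k
        h = fx / fprime(x)     # the step size h from Newton's method
        x = x - omega * h      # scale the step by the relaxation parameter
    return x, max_iter         # may not have converged within max_iter steps
```

Stopping on the residual \(|f(x)|\) rather than on the step size is a design choice here; it avoids dividing by \(f'(x)\) once the root has been hit exactly.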
How do we determine an appropriate value of \(\omega\text{?}\) Consider the case of a multiple root: \(f(x) = x^p\) with \(p > 1\text{.}\) Here
\begin{equation*}
g(x) = x - \omega\,\frac{x^p}{p x^{p-1}} = \left(1 - \frac{\omega}{p}\right)x\text{,}
\end{equation*}
so the best value to use is \(\omega = p\text{,}\) which makes \(g(x)=0\text{:}\) the solution is found in one step. So, for a double root, when the graph of \(f\) looks like a parabola near the root, we should use over-relaxation with \(\omega = 2\text{;}\) for triple roots, \(\omega = 3\text{,}\) and so on. Over-relaxation must be done very carefully: in these examples, any value of \(\omega\) other than the correct one fails to achieve the desired speed-up of convergence.
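For instance, a quick numerical check with the relaxed_newton sketch above (the starting point and tolerance are arbitrary) contrasts plain Newton with \(\omega = 2\) at the double root of \(f(x) = x^2\text{:}\)

```python
f  = lambda x: x**2      # double root at x = 0, so p = 2
df = lambda x: 2 * x

# Plain Newton (omega = 1) merely halves x at each step: linear convergence.
print(relaxed_newton(f, df, x0=1.0, omega=1.0))   # about 20 iterations
# Over-relaxation with omega = p = 2 lands on the root in a single step.
print(relaxed_newton(f, df, x0=1.0, omega=2.0))   # 1 iteration
```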
Similarly, a function shaped like \(f(x) = x^{1/3}\) may need under-relaxation; in this case the formula of the previous paragraph suggests \(\omega = 1/3\text{.}\) In general, under-relaxation is a less delicate issue than over-relaxation: seeing a failure to converge, one can simply try \(\omega = 0.1\text{,}\) and if that does not help, perhaps \(\omega = 0.01\text{.}\) The convergence, if it is achieved, will usually be slow (linear).
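Reusing the same relaxed_newton sketch on \(f(x) = x^{1/3}\) (computed as a real cube root via NumPy's np.cbrt, which handles negative iterates) shows all three behaviors: divergence without relaxation, the tailored \(\omega\text{,}\) and a generic small \(\omega\text{:}\)

```python
import numpy as np

f  = lambda x: np.cbrt(x)                    # real cube root; root at x = 0
df = lambda x: 1.0 / (3.0 * np.cbrt(x)**2)   # f'(x) = (1/3) x**(-2/3)

# Plain Newton sends x to -2x: the iterates oscillate and grow without bound.
print(relaxed_newton(f, df, x0=1.0, omega=1.0))   # stops at max_iter, far from 0
# Under-relaxation with omega = 1/3 sends x essentially straight to the root.
print(relaxed_newton(f, df, x0=1.0, omega=1/3))
# A generic small omega also converges, but only linearly, hence slowly.
print(relaxed_newton(f, df, x0=1.0, omega=0.1, max_iter=500))
```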