
Section 10.4 Examples and questions

These are additional examples for reviewing the topic we have covered. When reading each example, try to find your own solution before clicking “Answer”. There are also questions for reviewing the concepts of this section.

Example 10.4.1.

Consider a polynomial equation, for example \(x^3 + 2x^2 - 7x + 1 = 0\text{.}\) This is a scalar equation with one scalar variable, so it could be solved by the methods of previous chapters, including Newton's method in one variable. But then we would get only one root (depending on initial position), not all three. Rewrite this equation as a system of three equations for three roots \(x_1, x_2, x_3\text{,}\) using the fact that the polynomial factors as \((x-x_1)(x-x_2)(x-x_3)\text{.}\)

Solution

Expanding the product, we find

\begin{equation*} x^3 + 2x^2 - 7x + 1 = (x-x_1)(x-x_2)(x-x_3) = x^3 + Ax^2 + Bx + C \end{equation*}

where \(A = -(x_1+x_2+x_3)\text{,}\) \(B = x_1x_2 + x_1x_3 + x_2x_3\text{,}\) and \(C = -x_1x_2x_3\text{.}\) Equating the coefficients, we get the system

\begin{equation*} \begin{cases} x_1+x_2+x_3 \amp = -2 \\ x_1x_2 + x_1x_3 + x_2x_3 \amp = -7 \\ x_1x_2x_3 \amp = -1 \end{cases} \end{equation*}

If we can solve this system, the solution will consist of all three roots of the original equation.
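
These relations between roots and coefficients (Vieta's formulas) are easy to check numerically. The sketch below uses the built-in poly command, which returns the coefficients of the monic polynomial with given roots; the test roots here are arbitrary and chosen only for illustration:

r = [1.5; -2; 0.25];                   % any three test roots
poly(r)                                % coefficients of (x-r1)(x-r2)(x-r3)
[1, -(r(1)+r(2)+r(3)), r(1)*r(2)+r(1)*r(3)+r(2)*r(3), -r(1)*r(2)*r(3)]

The two displayed rows should agree, confirming the signs used above.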

Example 10.4.2.

Use the multivariate Newton method to solve the system from Example 10.4.1.

Solution

We have the vector function

\begin{equation*} \mathbf F(\mathbf x) = \begin{pmatrix} x_1+x_2+x_3 +2 \\ x_1x_2 + x_1x_3 + x_2x_3 + 7 \\ x_1x_2x_3 + 1 \end{pmatrix} \end{equation*}

and its Jacobian matrix is

\begin{equation*} J(\mathbf x) = \begin{pmatrix} 1 \amp 1 \amp 1 \\ x_2 + x_3 \amp x_1 + x_3 \amp x_1+x_2 \\ x_2 x_3 \amp x_1 x_3 \amp x_1 x_2 \end{pmatrix} \end{equation*}

Use the code from Example 10.2.3 with these functions:

% the vector function F(x) defined above
F = @(x) [x(1) + x(2) + x(3) + 2;
          x(1)*x(2) + x(1)*x(3) + x(2)*x(3) + 7;
          x(1)*x(2)*x(3) + 1];
% its Jacobian matrix J(x)
J = @(x) [1, 1, 1;
          x(2) + x(3), x(1) + x(3), x(1) + x(2);
          x(2)*x(3), x(1)*x(3), x(1)*x(2)];
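
If the code from Example 10.2.3 is not at hand, the iteration it performs can be sketched roughly as follows; the tolerance, iteration cap, and output format here are assumptions rather than the exact choices made in that example:

x = [1; 2; 3];                         % initial point; see the remark below
for k = 1:50                           % assumed cap on the number of steps
    dx = J(x) \ F(x);                  % solve J(x)*dx = F(x)
    x = x - dx;                        % Newton update
    if norm(dx) < 1e-10                % assumed stopping tolerance
        fprintf('Found a solution after %d steps:\n', k);
        disp(x')
        break
    end
end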

The initial point should have distinct coordinates, because at points with equal coordinates such as [0; 0; 0] or [1; 1; 1] the Jacobian matrix is singular (its columns are equal). For example, [1; 2; 3] works just fine.

Found a solution after 9 steps:
    1.7240    0.1497   -3.8737
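
The singularity at points with equal coordinates is easy to verify directly, since all three columns of the Jacobian coincide there. A quick check (just an illustration):

J([1; 1; 1])                           % all three columns are equal
det(J([1; 1; 1]))                      % returns 0: the matrix is singular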

Although an algebraic formula for the roots of a cubic (degree 3) equation exists, the symbolic form of the roots is often too complicated to be of any use. See what Wolfram Alpha shows as a solution of this equation.
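
For a purely numerical cross-check inside MATLAB or Octave, the built-in roots command computes all roots of a polynomial from its coefficient vector:

roots([1 2 -7 1])                      % coefficients of x^3 + 2x^2 - 7x + 1

The returned values should agree, up to ordering and rounding, with the roots found by Newton's method above.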

The above process, in a polished and simplified form, is known as the Durand-Kerner method for finding all polynomial roots at once.
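
As a rough sketch of that method (the starting values and tolerance below are common choices, not prescriptions from this text), each approximate root is repeatedly corrected using the other current approximations:

p = @(x) x.^3 + 2*x.^2 - 7*x + 1;      % the monic polynomial from Example 10.4.1
z = [0.4+0.9i; (0.4+0.9i)^2; (0.4+0.9i)^3];   % distinct complex starting values
for k = 1:100
    zold = z;
    for i = 1:3
        others = z([1:i-1, i+1:3]);    % the other two current approximations
        z(i) = z(i) - p(z(i)) / prod(z(i) - others);
    end
    if norm(z - zold) < 1e-12
        break
    end
end
disp(z.')                              % all three roots at once

Starting from distinct complex values keeps the denominators prod(z(i) - others) away from zero at the first step.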

If we apply the logic of Example 10.4.2 to the equation \(x^3 + 2x^2 - 7x + 5 = 0\) (so, just replace 1 by 5 in the formula for F), the method fails to converge. Why?

What goes wrong if we try to use Newton's method to solve the system

\begin{equation*} \begin{cases} \sin x \amp = a \\ \cos y \amp = b \\ \tan z \amp = c\end{cases} \end{equation*}

for some given values of \(a, b, c\text{?}\)

What prevents us from using Newton's method for systems where the number of equations is not equal to the number of unknowns? For example, the equation \((x-2)^2 + (y+1)^2 = 0\) has a unique solution: can we find it with Newton's method?