Minimize \(f(\mathbf x) = x_1^4 + \exp(3x_1+x_2)\) on the circle \(x_1^2 + x_2^2 = 1\) using the Lagrange multiplier method.
Solution.
The code is similar to that of Example 32.2.1, but with a more complicated gradient and Hessian. To avoid notational confusion, it uses
x for the vector \((x_1, x_2)\) and v for the vector \((x_1, x_2, \lambda)\text{.}\) The functions Fg and Fh below are the gradient and Hessian of the function
\begin{equation*}
F(x_1, x_2, \lambda) = x_1^4 + \exp(3x_1+x_2) - \lambda (x_1^2 + x_2^2 - 1)
\end{equation*}
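Written out, the stationarity conditions \(\nabla F = \mathbf 0\) are
\begin{align*}
4x_1^3 + 3\exp(3x_1+x_2) - 2\lambda x_1 \amp = 0\\
\exp(3x_1+x_2) - 2\lambda x_2 \amp = 0\\
-(x_1^2 + x_2^2 - 1) \amp = 0
\end{align*}
The first two equations express \(\nabla f = \lambda \nabla g\) for the constraint function \(g(\mathbf x) = x_1^2 + x_2^2 - 1\), and the third recovers the constraint itself.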
We equate
Fg to the zero vector and use Fh according to Newton's method.

f = @(x) x(1)^4 + exp(3*x(1)+x(2));
Fg = @(v) [4*v(1)^3 + 3*exp(3*v(1) + v(2)) - v(3)*(2*v(1)); ...
exp(3*v(1) + v(2)) - v(3)*(2*v(2)); ...
-(v(1)^2 + v(2)^2 - 1)];
Fh = @(v) [12*v(1)^2 + 9*exp(3*v(1) + v(2)) - v(3)*2, 3*exp(3*v(1) + v(2)), -(2*v(1)); ...
3*exp(3*v(1) + v(2)), exp(3*v(1) + v(2)) - v(3)*2, -(2*v(2)); ...
-2*v(1), -2*v(2), 0];
v = randn(3, 1);               % random starting point
max_tries = 10000;
for k = 1:max_tries
    dv = -Fh(v)\Fg(v);         % Newton step: solve Fh(v)*dv = -Fg(v)
    v = v + dv;
    if norm(dv) < 1e-6
        break
    end
end
x = v(1:2);
if k < max_tries
    fprintf('Found x = (%g, %g) with f(x) = %g after %d steps\n', x, f(x), k);
else
    disp('Failed to converge')
end
The method converges quickly, and the point it finds satisfies the constraint. But that point is often a maximum of \(f\) rather than a minimum: Newton's method locates critical points of \(F\) without distinguishing their type. As usual with Newton's method, the situation can be improved by first running a rough search algorithm, just to locate a good starting point.
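As an illustration of that fix, here is a sketch in Python with NumPy (the text's code is MATLAB) that seeds Newton's method with a coarse search over the circle. The 360-point grid and the least-squares estimate of the multiplier are choices made for this sketch, not part of the text's code.

```python
import numpy as np

def f(x1, x2):
    return x1**4 + np.exp(3*x1 + x2)

def Fg(v):
    # gradient of F(x1, x2, lam) = f(x1, x2) - lam*(x1^2 + x2^2 - 1)
    x1, x2, lam = v
    e = np.exp(3*x1 + x2)
    return np.array([4*x1**3 + 3*e - 2*lam*x1,
                     e - 2*lam*x2,
                     -(x1**2 + x2**2 - 1)])

def Fh(v):
    # Hessian of F
    x1, x2, lam = v
    e = np.exp(3*x1 + x2)
    return np.array([[12*x1**2 + 9*e - 2*lam, 3*e,       -2*x1],
                     [3*e,                    e - 2*lam, -2*x2],
                     [-2*x1,                  -2*x2,      0.0]])

# Rough search: parameterize the circle by angle, take the best sample.
t = np.linspace(0, 2*np.pi, 360, endpoint=False)
k = np.argmin(f(np.cos(t), np.sin(t)))
x1, x2 = np.cos(t[k]), np.sin(t[k])

# Multiplier estimate from grad f ~ lam * grad g, with grad g = (2*x1, 2*x2)
# and x1^2 + x2^2 = 1:
gf = np.array([4*x1**3 + 3*np.exp(3*x1 + x2), np.exp(3*x1 + x2)])
lam = gf @ np.array([x1, x2]) / 2

v = np.array([x1, x2, lam])
for step in range(50):
    dv = np.linalg.solve(Fh(v), -Fg(v))   # Newton step
    v = v + dv
    if np.linalg.norm(dv) < 1e-10:
        break

print("x =", v[:2], "f(x) =", f(v[0], v[1]))
```

Because the starting point is already close to the constrained minimum, Newton's method reliably converges to the minimum rather than the maximum.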
