Minimize Rosenbrock’s function (32.1.2) using Newton’s method with a random starting point
randn(2, 1). Plot the path of the search.
Solution.
The gradient of Rosenbrock’s function is (32.1.3) and its Hessian matrix is
\begin{equation}
\begin{pmatrix} 2 + 400 (x_1^2-x_2) + 800 x_1^2 & -400x_1 \\
-400 x_1 & 200 \end{pmatrix}\tag{32.2.1}
\end{equation}
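Both derivatives implemented in the code below follow directly from $f(x) = (x_1-1)^2 + 100(x_1^2-x_2)^2$. Writing out the gradient (32.1.3) explicitly,
\begin{equation}
\nabla f(x) = \begin{pmatrix} 2(x_1 - 1) + 400\, x_1 (x_1^2 - x_2) \\ -200\,(x_1^2 - x_2) \end{pmatrix},
\end{equation}
and differentiating each component once more gives the Hessian entries above; for example, $\partial_{x_1}\bigl(2(x_1-1) + 400\,x_1(x_1^2 - x_2)\bigr) = 2 + 400(x_1^2 - x_2) + 800 x_1^2$.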
f  = @(x) (x(1)-1).^2 + 100*(x(1).^2 - x(2)).^2;   % Rosenbrock's function (32.1.2)
fg = @(x) [2*(x(1)-1) + 400*(x(1).^2 - x(2))*x(1); ...
           -200*(x(1).^2 - x(2))];                 % gradient (32.1.3)
fh = @(x) [2 + 400*(x(1)^2-x(2)) + 800*x(1)^2, -400*x(1); ...
           -400*x(1), 200];                        % Hessian (32.2.1)
x = randn(2, 1);                                   % random starting point
max_tries = 10000;
path = zeros(2, max_tries);
for k = 1:max_tries
  path(:, k) = x;                  % record the current point before stepping
  dx = -fh(x)\fg(x);               % Newton step: solve fh(x)*dx = -fg(x)
  x = x + dx;
  if norm(dx) < 1e-6               % stop once the step is negligible
    break
  end
end
plot(path(1, 1:k), path(2, 1:k), '-+')
if k < max_tries
  fprintf('Found x = (%g, %g) with f(x) = %g after %d steps\n', x, f(x), k);
else
  disp('Failed to converge')
end
The process usually makes a few large, seemingly erratic jumps while far from the minimum, but once the iterates reach the neighborhood of $(1, 1)$ Newton's method converges quadratically, so only a handful of further steps are needed.
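For comparison, the same Newton iteration can be sketched in Python with NumPy. This is a sketch, not part of the original solution; it uses the classic deterministic starting point $(-1.2, 1)$ instead of a random one so that the run is reproducible:

```python
import numpy as np

# Rosenbrock's function, gradient, and Hessian -- same formulas as above.
f = lambda x: (x[0] - 1)**2 + 100*(x[0]**2 - x[1])**2
g = lambda x: np.array([2*(x[0] - 1) + 400*(x[0]**2 - x[1])*x[0],
                        -200*(x[0]**2 - x[1])])
H = lambda x: np.array([[2 + 400*(x[0]**2 - x[1]) + 800*x[0]**2, -400*x[0]],
                        [-400*x[0], 200]])

def newton(x, tol=1e-6, max_tries=10000):
    """Newton's method: solve H(x) dx = -g(x), step, stop when the step is tiny."""
    for k in range(max_tries):
        dx = np.linalg.solve(H(x), -g(x))
        x = x + dx
        if np.linalg.norm(dx) < tol:
            return x, k + 1
    raise RuntimeError('failed to converge')

# Classic starting point (an assumption of this sketch, replacing randn).
x_star, steps = newton(np.array([-1.2, 1.0]))
```

The backslash solve `fh(x)\fg(x)` in the MATLAB code corresponds to `np.linalg.solve(H(x), g(x))` here; both solve the linear system rather than forming the inverse Hessian explicitly.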
