Exercises 33.5 Homework

1.

Apply the code in Example 33.3.1 to the Easom function

\begin{equation*} f(x_1, x_2) = -\cos x_1 \cos x_2 \exp(-(x_1-\pi)^2-(x_2-\pi)^2) \end{equation*}

The global minimum is at \((\pi, \pi)\) with \(f(\pi, \pi)=-1\), but it is difficult to find because the function has many local minima and its landscape does not naturally lead toward the global minimum. If you run the code based on Example 33.3.1 five times (simply pressing F5 repeatedly), how many times does it converge to the global minimum \((\pi, \pi)\text{?}\)
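In MATLAB the Easom function can be written, for example, as an anonymous function; the vector-argument form below is an assumption about how the code in Example 33.3.1 expects to call \(f\), so adapt it if your code passes the coordinates separately.

```
% Easom function; x is a 2x1 column vector [x1; x2]
f = @(x) -cos(x(1)) * cos(x(2)) * exp(-(x(1) - pi)^2 - (x(2) - pi)^2);
```

A quick check: evaluating `f([pi; pi])` should return \(-1\), the global minimum value.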

To improve the situation, add a stochastic (random) step to the minimization process as an alternative to contraction. That is, whenever the algorithm in Example 33.3.1 is about to contract the triangle, try the following point first.

S = T(:, ind) + randn(2, 1);

If the value of \(f\) at S is smaller than the value at T(:, ind), then replace T(:, ind) with S. Otherwise, perform the contraction step.
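As a hedged sketch, the logic above might be inserted into the contraction branch of the loop as follows. The names `T` (the 2-by-3 matrix of triangle vertices), `ind` (the index of the worst vertex), and `f` come from the exercise statement; the surrounding loop and the exact contraction formula are assumed to be those of Example 33.3.1.

```
% Stochastic alternative to contraction: try a random trial point first
S = T(:, ind) + randn(2, 1);        % random perturbation of the worst vertex
if f(S) < f(T(:, ind))
    T(:, ind) = S;                  % accept the random step
else
    % otherwise perform the contraction step exactly as in Example 33.3.1
end
```

Because `randn` occasionally produces a step large enough to escape a local minimum, this modification makes the search subject to chance, which is exactly what the exercise asks you to observe over five runs.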

If you run the stochastic Nelder-Mead method five times (pressing F5 repeatedly), how many times does it converge to the global minimum?

Remark: The documentation of the SciPy optimize package provides an overview of many other optimization methods. There are a few bracket-based methods for functions of one variable (including golden section), several multivariate methods (such as Nelder-Mead), and, in addition, some methods designed for finding the global minimum. Most of the latter group are stochastic methods: they intentionally make the search process subject to chance.