Rosenbrock function

In mathematical optimization, the Rosenbrock function is a non-convex function, introduced by Howard H. Rosenbrock in 1960, which is used as a performance test problem for optimization algorithms.

The global minimum is inside a long, narrow, parabolic-shaped flat valley. To find the valley is trivial. To converge to the global minimum, however, is difficult.

The function is defined by

\[
f(x, y) = (a - x)^2 + b\,(y - x^2)^2.
\]

It has a global minimum at \((x, y) = (a, a^2)\), where \(f(x, y) = 0\). Usually the parameters are set such that \(a = 1\) and \(b = 100\). Only in the trivial case where \(a = 0\) is the function symmetric and the minimum at the origin.
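As a quick numerical check, here is a minimal Python sketch of the two-variable function with the usual parameters \(a = 1\) and \(b = 100\):

```python
def rosenbrock(x, y, a=1.0, b=100.0):
    """Two-variable Rosenbrock function (a - x)^2 + b*(y - x^2)^2."""
    return (a - x) ** 2 + b * (y - x ** 2) ** 2

print(rosenbrock(1.0, 1.0))  # 0.0 -- the global minimum at (a, a^2) = (1, 1)
print(rosenbrock(0.0, 0.0))  # 1.0 -- on the valley floor y = x^2, yet far from the minimum
```

The second call illustrates the valley property: at points on the parabola \(y = x^2\) only the small residual \((a - x)^2\) remains, so descending into the valley is easy while moving along it toward \((1, 1)\) is slow.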

Two multidimensional variants are commonly encountered.

One is the sum of \(N/2\) uncoupled 2D Rosenbrock problems, and is defined only for even \(N\):

\[
f(x_1, x_2, \ldots, x_N) = \sum_{i=1}^{N/2} \left[ 100\,\bigl(x_{2i-1}^2 - x_{2i}\bigr)^2 + (x_{2i-1} - 1)^2 \right].
\]

This variant has predictably simple solutions.
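A short NumPy sketch of this uncoupled variant (the function name is illustrative):

```python
import numpy as np

def rosenbrock_uncoupled(x):
    """Sum of N/2 independent 2D Rosenbrock problems; len(x) must be even."""
    x = np.asarray(x, dtype=float)
    assert x.size % 2 == 0, "this variant is defined only for even N"
    odd, even = x[0::2], x[1::2]  # x_{2i-1} and x_{2i} in the 1-based notation above
    return np.sum(100.0 * (odd ** 2 - even) ** 2 + (odd - 1.0) ** 2)

print(rosenbrock_uncoupled([1, 1, 1, 1]))  # 0.0 -- the global minimum at (1, ..., 1)
```

Because each coordinate pair is independent, the minimizer is simply the 2D minimizer repeated \(N/2\) times.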

A second, more involved variant,

\[
f(\mathbf{x}) = \sum_{i=1}^{N-1} \left[ 100\,\bigl(x_{i+1} - x_i^2\bigr)^2 + (1 - x_i)^2 \right],
\]

has exactly one minimum for \(N = 3\) (at \((1, 1, 1)\)) and exactly two minima for \(4 \le N \le 7\): the global minimum at \((1, 1, \ldots, 1)\) and a local minimum near \((-1, 1, \ldots, 1)\). This result is obtained by setting the gradient of the function equal to zero and noticing that the resulting equation is a rational function of \(x\). For small \(N\) the polynomials can be determined exactly and Sturm's theorem can be used to determine the number of real roots, while the roots can be bounded in the region of \(|x_i| < 2.4\).[5] For larger \(N\) this method breaks down due to the size of the coefficients involved.
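For small \(N\) this computation can be reproduced with a computer algebra system. The following SymPy sketch (an illustration of the approach, not the cited derivation) takes \(N = 3\), eliminates \(x_3\) and \(x_2\) from the gradient equations, and counts the real roots of the resulting polynomial in \(x_1\), mirroring the Sturm's-theorem argument:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)
f = 100*(x2 - x1**2)**2 + (1 - x1)**2 + 100*(x3 - x2**2)**2 + (1 - x2)**2

# df/dx3 = 0 gives x3 = x2**2; substituting it into df/dx2 = 0
# leaves an equation that is linear in x2.
g2 = sp.diff(f, x2).subs(x3, x2**2)
x2_of_x1 = sp.solve(g2, x2)[0]          # x2 = (100*x1**2 + 1)/101

# Substituting into df/dx1 = 0 yields a single rational equation in x1.
g1 = sp.diff(f, x1).subs(x2, x2_of_x1)
num, _ = sp.fraction(sp.together(g1))   # keep only the polynomial numerator
p = sp.Poly(sp.expand(num), x1)

# The real roots are known to lie in |x_i| < 2.4, so [-3, 3] brackets them all.
print(p.count_roots(-3, 3))  # 1 -> x1 = 1 is the only real stationary value
print(sp.sturm(p))           # the Sturm sequence itself
```

Back-substituting \(x_1 = 1\) gives \(x_2 = 1\) and \(x_3 = 1\), confirming the unique stationary point \((1, 1, 1)\) for \(N = 3\).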

Many of the stationary points of the function exhibit a regular pattern when plotted.[5] This structure can be exploited to locate them.

The Rosenbrock function can be efficiently optimized by adapting an appropriate coordinate system, without using any gradient information and without building local approximation models (in contrast to many derivative-free optimizers).

The following figure illustrates an example of 2-dimensional Rosenbrock function optimization by adaptive coordinate descent from starting point \(x_0 = (-3, -4)\).

Using the Nelder–Mead method from starting point \(x_0 = (-1, 1)\) with a regular initial simplex, a minimum is found with function value \(1.36 \cdot 10^{-10}\) after 185 function evaluations.

The figure below visualizes the evolution of the algorithm.
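For comparison, here is a minimal sketch of a similar experiment using SciPy's off-the-shelf Nelder–Mead implementation (SciPy builds its own initial simplex by default, so the exact evaluation count will differ from the figure):

```python
import numpy as np
from scipy.optimize import minimize, rosen  # rosen is SciPy's built-in Rosenbrock function

result = minimize(rosen, x0=np.array([-1.0, 1.0]), method='Nelder-Mead',
                  options={'xatol': 1e-10, 'fatol': 1e-10})
print(result.x)     # approximately (1, 1), the global minimum
print(result.fun)   # function value close to zero
print(result.nfev)  # number of function evaluations used
```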

Plot of the Rosenbrock function of two variables. Here \(a = 1\) and \(b = 100\), and the minimum value of zero is at \((1, 1)\).
Animation of Rosenbrock's function of three variables.[2]
Rosenbrock roots exhibiting hump structures
Nelder-Mead method applied to the Rosenbrock function