Pattern search (optimization)

Optimization attempts to find the best match (the solution that has the lowest error value) in a multidimensional analysis space of possibilities.[1]

An early and simple variant is attributed to Enrico Fermi and Nicholas Metropolis when they worked at the Los Alamos National Laboratory.

It is described by Davidon,[2] as follows: They varied one theoretical parameter at a time by steps of the same magnitude, and when no such increase or decrease in any one parameter further improved the fit to the experimental data, they halved the step size and repeated the process until the steps were deemed sufficiently small.
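
A minimal sketch of this one-parameter-at-a-time scheme might look like the following Python function; the objective function, starting point, and step sizes used here are illustrative assumptions, not details given in Davidon's account.

```python
def coordinate_search(f, x, step=1.0, min_step=1e-6):
    """Vary one parameter at a time by steps of the same magnitude; halve
    the step size when no single-parameter change improves the objective.

    f        -- objective function to minimize (any callable on a list of floats)
    x        -- list of parameter values (starting guess)
    step     -- initial step size, the same magnitude for every parameter
    min_step -- stop once the steps are deemed sufficiently small
    """
    best = f(x)
    while step > min_step:
        improved = False
        for i in range(len(x)):
            for delta in (+step, -step):
                trial = list(x)
                trial[i] += delta
                value = f(trial)
                if value < best:      # keep any change that improves the fit
                    x, best = trial, value
                    improved = True
        if not improved:              # no increase or decrease in any parameter helped:
            step /= 2.0               # halve the step size and repeat
    return x, best


# Hypothetical usage on a simple quadratic objective
if __name__ == "__main__":
    objective = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2
    print(coordinate_search(objective, [0.0, 0.0]))
```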

Convergence is a pattern search method proposed by Yu, who proved that it converges using the theory of positive bases.[3] Later, Torczon, Lagarias and co-authors[4][5] used positive-basis techniques to prove the convergence of another pattern-search method on specific classes of functions.

Outside of such classes, pattern search is a heuristic that can provide useful approximate solutions for some issues, but can fail on others.

Figure: Example of convergence of a direct search method on the Broyden function. At each iteration, the pattern either moves to the point which best minimizes its objective function, or shrinks in size if no point is better than the current point, until the desired accuracy has been achieved, or the algorithm reaches a predetermined number of iterations.
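
The move-or-shrink loop described in this caption can be sketched as follows; the coordinate-direction pattern, the shrink factor of one half, the tolerance, and the iteration limit are assumptions made for illustration rather than details fixed by the sources above.

```python
def pattern_search(f, x, step=1.0, tol=1e-6, max_iter=1000):
    """Minimize f by polling a pattern of points around the current point.

    The pattern used here is the set of coordinate directions (+/- step
    along each axis); other positive bases could be used instead.
    """
    best = f(x)
    for _ in range(max_iter):          # stop after a predetermined number of iterations
        if step < tol:                 # ...or once the desired accuracy has been reached
            break
        # Poll every point of the pattern around the current point.
        candidates = []
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = list(x)
                trial[i] += sign * step
                candidates.append((f(trial), trial))
        value, point = min(candidates, key=lambda c: c[0])
        if value < best:               # move the pattern to the best polled point
            x, best = point, value
        else:                          # no polled point is better: shrink the pattern
            step *= 0.5
    return x, best


# Hypothetical usage on a two-dimensional test function
if __name__ == "__main__":
    rosenbrock = lambda p: (1 - p[0]) ** 2 + 100 * (p[1] - p[0] ** 2) ** 2
    print(pattern_search(rosenbrock, [-1.2, 1.0], step=0.5, max_iter=5000))
```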