Optimization attempts to find the best match (the solution that has the lowest error value) in a multidimensional analysis space of possibilities.[1] An early and simple variant is attributed to Fermi and Metropolis when they worked at the Los Alamos National Laboratory.
It is described by Davidon[2] as follows: they varied one theoretical parameter at a time by steps of the same magnitude, and when no increase or decrease in any one parameter further improved the fit to the experimental data, they halved the step size and repeated the process until the steps were deemed sufficiently small.

A convergent pattern-search method was proposed by Yu, who proved that it converges using the theory of positive bases.[3] Later, Torczon, Lagarias and co-authors[4][5] used positive-basis techniques to prove the convergence of another pattern-search method on specific classes of functions.
Outside of such classes, pattern search is a heuristic that can provide useful approximate solutions for some problems, but can fail on others.
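The variant described above, which varies one parameter at a time and halves the step size once no single-parameter move improves the fit, can be sketched in a few lines. The following is a minimal illustration only; the function names, the quadratic objective, and the tolerances are assumptions for the example, not taken from the original Los Alamos work.

```python
def coordinate_search(f, x0, step=1.0, min_step=1e-6):
    """Vary one parameter at a time by steps of equal magnitude; when no
    single-parameter increase or decrease improves the objective, halve
    the step and repeat until the step is deemed sufficiently small."""
    x = list(x0)
    fx = f(x)
    while step > min_step:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):      # try increasing, then decreasing
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                if ft < fx:                  # keep any move that lowers the error
                    x, fx = trial, ft
                    improved = True
                    break
        if not improved:
            step /= 2.0                      # no move helped: halve the step size
    return x, fx


# Illustrative objective: a simple quadratic "error" surface (an assumption,
# not the experimental fit used at Los Alamos); the minimum is at (3, -1).
best, err = coordinate_search(lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2,
                              [0.0, 0.0])
print(best, err)
```

Because the search only compares objective values at trial points, no gradient information is required, which is the defining feature of pattern-search methods.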