Derivative-free optimization

Sometimes information about the derivative of the objective function f is unavailable, unreliable, or impractical to obtain. For example, f might be non-smooth, or time-consuming to evaluate, or in some way noisy, so that methods that rely on derivatives or approximate them via finite differences are of little use.

The problem of finding optimal points in such situations is referred to as derivative-free optimization, and algorithms that do not use derivatives or finite differences are called derivative-free algorithms.[1]

The problem to be solved is to numerically optimize an objective function f : A → ℝ for some set A (usually A ⊆ ℝⁿ), i.e., to find a point x₀ ∈ A such that f(x₀) ≤ f(x) for all x ∈ A (taking minimization without loss of generality).

When applicable, a common approach is to iteratively improve a parameter guess by local hill-climbing in the objective function landscape.
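As a toy illustration of such hill-climbing (a minimal sketch, not any particular published algorithm; the objective, step size, and iteration budget below are hypothetical):

```python
import random

def hill_climb(f, x0, step=0.1, max_iters=1000):
    """Iteratively improve a guess by accepting random local moves
    that decrease f (minimization). A toy sketch, not a robust solver."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iters):
        # Perturb one randomly chosen coordinate by +/- step.
        i = random.randrange(len(x))
        candidate = list(x)
        candidate[i] += random.choice((-step, step))
        fc = f(candidate)
        if fc < fx:  # keep the move only if it improves the objective
            x, fx = candidate, fc
    return x, fx

# Example: a smooth bowl with its minimum at (1, -2).
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
print(hill_climb(f, [0.0, 0.0]))
```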

Derivative-based optimization is efficient at finding local optima for continuous-domain, smooth, single-modal problems. However, such methods run into trouble when f is expensive to evaluate, or is non-smooth, or noisy, so that (numeric approximations of) derivatives do not provide useful information.
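To see why noise defeats finite differences: a forward difference with step h incurs an error of roughly σ/h from additive noise of size σ, which grows without bound as h shrinks. A small sketch with a hypothetical noisy objective:

```python
import random

def noisy_f(x, sigma=0.01):
    # Smooth underlying function x**2, observed with additive noise.
    return x * x + random.gauss(0.0, sigma)

def forward_difference(f, x, h):
    return (f(x + h) - f(x)) / h

# The true derivative of x**2 at x = 1 is 2, but the estimate's
# noise-induced error scales like sigma / h as h decreases.
for h in (1e-1, 1e-3, 1e-6):
    print(h, forward_difference(noisy_f, 1.0, h))
```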

In derivative-free optimization, various methods are employed to address these challenges using only function values of f, but no derivatives.
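One classical example is the Nelder–Mead simplex method, which queries only function values. A minimal usage sketch, assuming SciPy is available (the quadratic objective here is a made-up test function):

```python
import numpy as np
from scipy.optimize import minimize

# Black-box objective: the solver sees only function values.
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

result = minimize(f, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
print(result.x)  # approximately [1, -2], found without any derivatives
```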

Some of these methods can be proved to converge to optima, while others are rather metaheuristics, since these problems are in general more difficult to solve than convex optimization problems.

For the latter, the ambition is rather to efficiently find "good" parameter values, which can be near-optimal given enough resources, but optimality guarantees can typically not be given.
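Pure random search illustrates this trade-off: with a large enough evaluation budget it tends to find near-optimal points, yet it certifies nothing. A minimal sketch (the search bounds and budget are arbitrary):

```python
import random

def random_search(f, bounds, budget=10000):
    """Sample points uniformly within the bounds and keep the best.
    No optimality guarantee; quality improves with the budget."""
    best_x, best_f = None, float("inf")
    for _ in range(budget):
        x = [random.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
print(random_search(f, bounds=[(-5, 5), (-5, 5)]))
```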

One should keep in mind that the challenges are diverse, so that no single algorithm works well for all kinds of problems.