Natural evolution strategies (NES) are a family of numerical optimization algorithms for black box problems.
Similar in spirit to evolution strategies, they iteratively update the (continuous) parameters of a search distribution by following the natural gradient towards higher expected fitness.
From the samples, NES estimates a search gradient on the parameters towards higher expected fitness. NES then performs a gradient ascent step along the natural gradient, a second-order method which, unlike the plain gradient, renormalizes the update with respect to uncertainty. This step is crucial, since it prevents oscillations, premature convergence, and undesired effects stemming from a given parameterization.
All members of the NES family operate based on the same principles; they differ in the type of probability distribution and the gradient approximation method used.
Different search spaces require different search distributions; for example, in low dimensionality it can be highly beneficial to model the full covariance matrix.
In high dimensions, on the other hand, a more scalable alternative is to limit the covariance to the diagonal only.
In addition, highly multi-modal search spaces may benefit from more heavy-tailed distributions (such as Cauchy, as opposed to the Gaussian).
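The scaling trade-off between a full and a diagonal covariance can be made concrete with a small sketch (the dimensionality and variable names below are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 100  # illustrative search-space dimensionality

# Full-covariance Gaussian: O(d^2) distribution parameters; sampling
# requires a matrix factorization, which costs O(d^3) in general.
mu = np.zeros(d)
A = rng.standard_normal((d, d)) / np.sqrt(d)
cov_full = A @ A.T + np.eye(d)                  # a positive-definite covariance
x_full = rng.multivariate_normal(mu, cov_full)  # one search point

# Diagonal (separable) Gaussian: only O(d) parameters, O(d) sampling.
sigma = np.ones(d)
x_diag = mu + sigma * rng.standard_normal(d)

print(cov_full.size, sigma.size)  # 10000 vs 100 covariance parameters
```

The quadratic growth of the full covariance is what makes the diagonal restriction attractive in high dimensions.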
Let \theta denote the parameters of the search distribution \pi(x \mid \theta) and f(x) the fitness function evaluated at x. NES then pursues the objective of maximizing the expected fitness under the search distribution,

J(\theta) = \mathbb{E}_{\theta}[f(x)] = \int f(x)\, \pi(x \mid \theta)\, dx,

through gradient ascent. Using the log-likelihood trick, the gradient can be rewritten as

\nabla_{\theta} J(\theta) = \int f(x)\, \nabla_{\theta} \pi(x \mid \theta)\, dx = \int f(x)\, \nabla_{\theta} \log \pi(x \mid \theta)\, \pi(x \mid \theta)\, dx = \mathbb{E}_{\theta}\left[ f(x)\, \nabla_{\theta} \log \pi(x \mid \theta) \right].
In practice, it is possible to use the Monte Carlo approximation based on a finite number of \lambda samples:

\nabla_{\theta} J(\theta) \approx \frac{1}{\lambda} \sum_{k=1}^{\lambda} f(x_k)\, \nabla_{\theta} \log \pi(x_k \mid \theta).

Finally, the parameters of the search distribution can be updated iteratively through stochastic gradient ascent:

\theta \leftarrow \theta + \eta\, \nabla_{\theta} J(\theta),

where \eta is the learning rate. Instead of using the plain stochastic gradient for updates, NES follows the natural gradient, which has been shown to possess numerous advantages over the plain (vanilla) gradient: the gradient direction is independent of the chosen parameterization of the search distribution, and the update magnitudes are automatically adjusted based on uncertainty, which in turn speeds up convergence on plateaus and ridges. The NES update is therefore

\theta \leftarrow \theta + \eta\, F^{-1} \nabla_{\theta} J(\theta),

where F = \mathbb{E}_{\theta}\left[ \nabla_{\theta} \log \pi(x \mid \theta)\, \nabla_{\theta} \log \pi(x \mid \theta)^{\top} \right] is the Fisher information matrix of the given parameterization.
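The whole procedure can be sketched in a minimal implementation, assuming a Gaussian search distribution N(\mu, \sigma^2 I) with parameters \theta = (\mu, \log \sigma); the function and parameter names below are illustrative, not from any specific library:

```python
import numpy as np

def nes(f, mu0, sigma0, lam=50, eta=0.15, iters=300, seed=0):
    """Minimal NES sketch: Monte Carlo search gradient, sample-based
    Fisher estimate, and a natural-gradient ascent step."""
    rng = np.random.default_rng(seed)
    mu = np.asarray(mu0, dtype=float).copy()
    log_sigma = np.log(sigma0)
    d = mu.size
    for _ in range(iters):
        sigma = np.exp(log_sigma)
        z = rng.standard_normal((lam, d))           # standard normal draws
        x = mu + sigma * z                          # search points x_k
        fit = np.array([f(xk) for xk in x])         # fitness evaluations f(x_k)
        # log-derivatives grad_theta log pi(x_k | theta), theta = (mu, log sigma)
        g_mu = z / sigma                            # shape (lam, d)
        g_ls = (z ** 2).sum(axis=1) - d             # shape (lam,)
        G = np.hstack([g_mu, g_ls[:, None]])        # shape (lam, d + 1)
        grad = G.T @ fit / lam                      # Monte Carlo search gradient
        F = G.T @ G / lam + 1e-8 * np.eye(d + 1)    # Fisher estimate from samples
        step = eta * np.linalg.solve(F, grad)       # natural gradient F^{-1} grad
        mu += step[:d]
        log_sigma += step[d]
    return mu, np.exp(log_sigma)

# usage: maximize f(x) = -||x - 1||^2 in 3 dimensions; the optimum is x = (1, 1, 1)
mu, sigma = nes(lambda x: -np.sum((x - 1.0) ** 2), np.zeros(3), 1.0)
```

On this toy objective the mean drifts towards the optimum while \sigma shrinks, illustrating how the natural gradient adapts the step size to the remaining uncertainty.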
The Fisher matrix can sometimes be computed exactly; otherwise it is estimated from the samples, reusing the log-derivatives \nabla_{\theta} \log \pi(x_k \mid \theta).
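The sample-based estimate can be sanity-checked against a case where the Fisher matrix is known in closed form: for a one-dimensional Gaussian parameterized by (\mu, \log \sigma), it is diag(1/\sigma^2, 2). The setup below is an illustrative check, not part of the algorithm itself:

```python
import numpy as np

# Estimate F ≈ (1/lam) sum_k g_k g_k^T from the log-derivatives g_k of a
# 1-D Gaussian with parameters (mu, log sigma), and compare with the
# exact Fisher matrix diag(1/sigma^2, 2).
rng = np.random.default_rng(1)
sigma, lam = 2.0, 200_000
z = rng.standard_normal(lam)                   # x = mu + sigma * z
g = np.stack([z / sigma, z ** 2 - 1], axis=1)  # grad log pi w.r.t. (mu, log sigma)
F_hat = g.T @ g / lam
F_exact = np.diag([1 / sigma ** 2, 2.0])
print(np.round(F_hat, 2))
```

With enough samples the Monte Carlo estimate agrees with the closed form, which is why reusing the log-derivatives is a cheap and viable way to obtain F.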
NES utilizes rank-based fitness shaping in order to render the algorithm more robust and invariant under monotonically increasing transformations of the fitness function. For this purpose, the fitness of the population is transformed into a set of utility values u_1 \geq \dots \geq u_{\lambda}.
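A rank-based shaping scheme can be sketched as follows; the logarithmic weighting used here is one commonly cited choice, and the function name is illustrative:

```python
import numpy as np

def rank_utilities(fitness):
    """Sketch of rank-based fitness shaping: replace raw fitness values
    by fixed utilities that depend only on their rank in the population."""
    lam = len(fitness)
    ranks = np.empty(lam, dtype=int)
    ranks[np.argsort(fitness)[::-1]] = np.arange(lam)  # rank 0 = best
    # logarithmic weighting: only the better half receives positive weight
    raw = np.maximum(0.0, np.log(lam / 2 + 1) - np.log(ranks + 1))
    return raw / raw.sum() - 1.0 / lam                 # utilities sum to zero

# usage: utilities depend only on the ordering of the fitness values
u = rank_utilities(np.array([3.0, -1.0, 10.0, 2.0]))
```

Because the utilities depend only on ranks, applying any monotonically increasing transformation to the fitness (e.g. cubing it) leaves them unchanged, which is exactly the invariance property that fitness shaping provides.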