Minimax estimator

In statistical decision theory, where we are faced with the problem of estimating a deterministic parameter (vector) $\theta \in \Theta$ from observations $x \in X$, an estimator (estimation rule) $\delta^M$ is called minimax if its maximal risk is minimal among all estimators of $\theta$. In a sense this means that $\delta^M$ is an estimator which performs best in the worst possible case allowed in the problem.

Consider the problem of estimating a deterministic (not Bayesian) parameter $\theta \in \Theta$ from noisy or corrupt data $x \in X$, related through the conditional probability distribution $P(x \mid \theta)$. The goal is to find a "good" estimator $\delta(x)$ of $\theta$, one which minimizes some given risk function $R(\theta, \delta)$. Here the risk function is the expectation of a loss function $L(\theta, \delta)$:

$$R(\theta, \delta) = E\left[ L(\theta, \delta(x)) \right].$$

A popular example of a loss function[1] is the squared error loss $L(\theta, \delta) = \lVert \theta - \delta \rVert^2$, and the risk function for this loss is the mean squared error (MSE).

Unfortunately, in general, the risk cannot be minimized directly, since it depends on the unknown parameter $\theta$ itself (if we knew the actual value of $\theta$, we would not need to estimate it). Therefore additional criteria for finding an optimal estimator in some sense are required. One such criterion is the minimax criterion.

Definition: An estimator $\delta^M : X \rightarrow \Theta$ is called minimax with respect to a risk function $R(\theta, \delta)$ if it achieves the smallest maximum risk among all estimators, meaning it satisfies

$$\sup_{\theta \in \Theta} R(\theta, \delta^M) = \inf_{\delta} \sup_{\theta \in \Theta} R(\theta, \delta).$$

Logically, an estimator is minimax when it is the best in the worst case.

Continuing this logic, a minimax estimator should be a Bayes estimator with respect to a least favourable prior distribution of $\theta$. To demonstrate this notion, denote the average risk of the Bayes estimator $\delta_\pi$ with respect to a prior distribution $\pi$ as

$$r_\pi = \int R(\theta, \delta_\pi) \, d\pi(\theta),$$

then:

Definition: A prior distribution $\pi$ is called least favourable if for every other distribution $\pi'$ the average risk satisfies $r_\pi \geq r_{\pi'}$.

Theorem 1: If $r_\pi = \sup_\theta R(\theta, \delta_\pi)$, then: (i) $\delta_\pi$ is minimax; (ii) if $\delta_\pi$ is the unique Bayes estimator, it is also the unique minimax estimator; and (iii) $\pi$ is least favourable.

Corollary: If a Bayes estimator has constant risk, it is minimax. Note that this is not a necessary condition.

Example 1: Unfair coin[2][3]

Consider the problem of estimating the "success" rate of a binomial variable, $x \sim B(n, \theta)$. This may be viewed as estimating the rate at which an unfair coin falls on "heads" or "tails". In this case the Bayes estimator with respect to a Beta-distributed prior, $\theta \sim \text{Beta}(\sqrt{n}/2, \sqrt{n}/2)$, is

$$\delta^M = \frac{x + \sqrt{n}/2}{n + \sqrt{n}},$$

with constant Bayes risk

$$r = \frac{1}{4\left(1 + \sqrt{n}\right)^2}$$

and, according to the Corollary, it is minimax.
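The constant-risk property, and the worst-case improvement over maximum likelihood, can be checked numerically. Below is a minimal sketch (assuming NumPy and SciPy; the sample size $n = 25$ is an arbitrary illustrative choice) that computes the exact MSE of the ML estimator $x/n$ and of the minimax estimator above over a grid of $\theta$ values:

```python
import numpy as np
from scipy.stats import binom

n = 25                                    # number of tosses (illustrative choice)
thetas = np.linspace(0.001, 0.999, 999)   # grid over the parameter space
x = np.arange(n + 1)                      # all possible outcomes

def risk(estimates):
    """Exact MSE of an estimator, given its value at each outcome x = 0..n."""
    pmf = binom.pmf(x[None, :], n, thetas[:, None])   # P(x | theta)
    return (pmf * (estimates[None, :] - thetas[:, None]) ** 2).sum(axis=1)

delta_ml = x / n                                      # maximum likelihood
delta_mm = (x + np.sqrt(n) / 2) / (n + np.sqrt(n))    # minimax (Bayes vs. Beta prior)

print("worst-case risk, ML     :", risk(delta_ml).max())  # ~ 1/(4n) = 0.0100
print("worst-case risk, minimax:", risk(delta_mm).max())  # constant over theta
print("theoretical minimax risk:", 1 / (4 * (1 + np.sqrt(n)) ** 2))
```

The ML estimator has smaller risk near the endpoints $\theta \approx 0, 1$, but its worst-case risk $1/(4n) = 0.01$ exceeds the constant minimax risk $1/(4(1+\sqrt{n})^2) \approx 0.0069$.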

Example 2: Estimating the mean of a normal vector

Consider the problem of estimating the mean of a $d$-dimensional Gaussian random vector, $x \sim N(\theta, \sigma^2 I_d)$. The maximum likelihood (ML) estimator is simply $\delta_{ML} = x$, and its risk is constant, $R(\theta, \delta_{ML}) = d\sigma^2$. The ML estimator is not itself a Bayes estimator, so the Corollary above does not apply directly; however, it is the limit of a sequence of Bayes estimators, both with respect to uniform priors with increasing support and with respect to zero-mean normal priors $\pi_n \sim N(0, n\sigma^2 I_d)$ with increasing variance, and this suffices to establish that it is minimax. Minimaxity does not, however, imply admissibility. In fact in this example, the ML estimator is known to be inadmissible (not admissible) whenever $d > 2$; it is dominated by the James–Stein estimator. Though both estimators have the same risk $d\sigma^2$ as $\lVert \theta \rVert \to \infty$, and they are both minimax, the James–Stein estimator has smaller risk for any finite $\lVert \theta \rVert$. This fact is illustrated in the figure below.
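The dominance of the James–Stein estimator over ML is easy to reproduce by Monte Carlo simulation. The following sketch (with $d = 10$, $\sigma = 1$, and the standard shrinkage form $\delta_{JS} = (1 - (d-2)\sigma^2/\lVert x \rVert^2)\,x$; all numerical choices are illustrative) estimates both risks at several values of $\lVert \theta \rVert$:

```python
import numpy as np

rng = np.random.default_rng(0)
d, sigma, trials = 10, 1.0, 100_000      # illustrative choices

for norm_theta in [0.0, 2.0, 5.0, 20.0]:
    theta = np.zeros(d)
    theta[0] = norm_theta                # by symmetry, risk depends only on ||theta||
    x = theta + sigma * rng.standard_normal((trials, d))

    mse_ml = ((x - theta) ** 2).sum(axis=1).mean()       # ML: delta(x) = x

    # James-Stein: shrink x toward the origin
    shrink = 1 - (d - 2) * sigma**2 / (x**2).sum(axis=1, keepdims=True)
    mse_js = ((shrink * x - theta) ** 2).sum(axis=1).mean()

    print(f"||theta|| = {norm_theta:5.1f}:  ML {mse_ml:6.3f}   JS {mse_js:6.3f}")
```

The ML risk stays near $d\sigma^2 = 10$ for every $\theta$, while the James–Stein risk is markedly smaller near the origin and approaches $d\sigma^2$ only as $\lVert \theta \rVert$ grows.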

In general, it is difficult, and often even impossible, to determine the minimax estimator.

Nonetheless, in many cases, a minimax estimator has been determined.

Example 3: Bounded normal mean

When estimating the mean $\theta$ of a normal vector $x \sim N(\theta, \sigma^2 I_n)$ under the constraint $\lVert \theta \rVert^2 \leq M$, the Bayes estimator with respect to a prior which is uniformly distributed on the edge of the bounding sphere is known to be minimax whenever $M \leq n$; the analytical expression for this estimator involves Bessel functions.
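The closed-form minimax rule for this problem is somewhat involved, but the benefit of exploiting the bound is easy to see with a simpler, non-minimax device: projecting the ML estimate onto the constraint ball. The sketch below ($n = 3$, $\sigma = 1$, $M = 2$, all illustrative choices) estimates both risks at a point on the edge of the sphere; since projection onto a convex set containing $\theta$ can only shrink the error, the projected estimator does strictly better:

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma, M, trials = 3, 1.0, 2.0, 200_000   # illustrative choices

theta = np.zeros(n)
theta[0] = np.sqrt(M)                        # a point on the edge of the sphere
x = theta + sigma * rng.standard_normal((trials, n))

mse_ml = ((x - theta) ** 2).sum(axis=1).mean()   # plain ML ignores the bound

# Project the ML estimate onto the ball ||.||^2 <= M; projection onto a
# convex set containing theta can only decrease the estimation error.
norms = np.linalg.norm(x, axis=1, keepdims=True)
proj = np.where(norms <= np.sqrt(M), x, np.sqrt(M) * x / norms)
mse_proj = ((proj - theta) ** 2).sum(axis=1).mean()

print("risk at the boundary, ML       :", mse_ml)    # ~ n * sigma^2 = 3
print("risk at the boundary, projected:", mse_proj)  # strictly smaller
```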

The difficulty of determining the exact minimax estimator has motivated the study of estimators that are only approximately or asymptotically minimax. The design of the approximate minimax estimator is intimately related to the geometry, such as the metric entropy number, of $\Theta$.

Sometimes, a minimax estimator may take the form of a randomised decision rule. Consider, for example, a parameter space with just two elements, $\theta_1$ and $\theta_2$, so that each decision rule can be represented by a point in the plane: the x-coordinate is the risk when the parameter is $\theta_1$ and the y-coordinate is the risk when the parameter is $\theta_2$. The risk of a randomised rule is the corresponding convex combination of the risks of the rules it mixes, so the minimax rule may lie strictly between two deterministic rules, neither of which is minimax on its own.
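This is easy to make concrete. In the sketch below the two risk vectors are made-up numbers, chosen so that each deterministic rule is good under one parameter value and bad under the other; a grid search over the mixing probability finds the randomised rule with the smallest worst-case risk:

```python
import numpy as np

# Made-up risk vectors (R(theta_1, d), R(theta_2, d)) for two deterministic rules:
r1 = np.array([1.0, 4.0])    # d1: good under theta_1, bad under theta_2
r2 = np.array([3.0, 0.5])    # d2: the opposite

# A randomised rule uses d1 with probability p; its risk is linear in p.
p = np.linspace(0.0, 1.0, 10_001)
risk = p[:, None] * r1 + (1 - p[:, None]) * r2    # shape (len(p), 2)
worst = risk.max(axis=1)                          # worst-case risk of each mixture

best = worst.argmin()
print("minimax mixing probability p  ~", p[best])             # ~ 5/11
print("worst-case risk of the mixture:", worst[best])         # ~ 2.09
print("worst-case risks of d1, d2    :", r1.max(), r2.max())  # 4.0, 3.0
```

The optimal mixture equalises the two coordinates of the risk vector; its worst-case risk (about 2.09 here) is strictly below that of either deterministic rule (4.0 and 3.0).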

Minimax ideas also appear in robust estimator design, for example when a linear estimator is built from the correlation function of the signal to be estimated. If knowledge of this correlation function is not perfectly available, a popular minimax robust optimization approach[6] is to define a set characterizing the uncertainty about the correlation function, and then to pursue a minimax optimization over the uncertainty set and over the estimator, respectively.

Similar minimax optimizations can be pursued to make estimators robust to certain imprecisely known parameters.
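As a stylised instance of this uncertainty-set formulation, consider a scalar signal-plus-noise model in which only an interval for the signal variance is known. The sketch below (all numbers illustrative) grid-searches for the linear weight minimizing the worst-case MSE over the uncertainty set:

```python
import numpy as np

# Scalar toy model (all numbers illustrative): observe y = s + n, where the
# signal variance is only known to lie in an uncertainty set.
sigma_n2 = 1.0                            # known noise variance
uncertainty = np.linspace(0.5, 2.0, 50)   # candidate signal variances

# Linear estimators s_hat = w * y have MSE(w, v) = (1-w)^2 * v + w^2 * sigma_n2.
w = np.linspace(0.0, 1.0, 1001)
mse = (1 - w[:, None]) ** 2 * uncertainty[None, :] + w[:, None] ** 2 * sigma_n2

worst = mse.max(axis=1)                   # worst case over the uncertainty set
w_robust = w[worst.argmin()]              # minimax-robust weight

print("minimax-robust weight :", w_robust)
# Here the worst case is the largest candidate variance, so the robust weight
# matches the Wiener weight v / (v + sigma_n2) at v = 2.0:
print("Wiener weight at v=2.0:", 2.0 / (2.0 + sigma_n2))
```

In this simple model the worst case is always the largest candidate variance, so the robust weight coincides with the Wiener weight designed for that variance; with richer uncertainty sets the robust solution is generally more conservative than any single nominal design.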

For instance, a recent study dealing with such techniques in the area of signal processing can be found in [7].

In R. Fandom Noubiap and W. Seidel (2001), an algorithm for calculating a Gamma-minimax decision rule was developed for the case where Gamma is given by a finite number of generalized moment conditions.

Such a decision rule minimizes the maximum, over all distributions in Gamma, of the integral of the risk function.

Gamma-minimax decision rules are of interest in robustness studies in Bayesian statistics.
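As a much-simplified illustration of the Gamma-minimax idea (the algorithm of Fandom Noubiap and Seidel handles moment-constrained classes; here Gamma is just a finite set of priors on a two-point parameter space, reusing the toy risk vectors from the randomised-rule sketch above):

```python
import numpy as np

# Toy version: Gamma is a finite set of priors on {theta_1, theta_2} (the
# algorithm in the paper handles moment-constrained classes). Both priors
# happen to put most of their mass on theta_1.
gammas = np.array([0.1, 0.3])             # candidate values of P(theta_2)

# The same made-up risk vectors as in the randomised-rule sketch above:
r1 = np.array([1.0, 4.0])
r2 = np.array([3.0, 0.5])

p = np.linspace(0.0, 1.0, 10_001)         # randomised rules: use d1 w.p. p
risk = p[:, None] * r1 + (1 - p[:, None]) * r2

# Bayes risk of each rule under each prior in Gamma, then the worst prior.
bayes = (1 - gammas[None, :]) * risk[:, [0]] + gammas[None, :] * risk[:, [1]]
worst = bayes.max(axis=1)

best = worst.argmin()
print("Gamma-minimax mixing probability ~", p[best])   # 1.0: the pure rule d1
print("max Bayes risk over Gamma        :", worst[best])
```

Because both priors in this toy Gamma favour $\theta_1$, the Gamma-minimax rule degenerates to the deterministic rule $d_1$, whereas the unrestricted minimax rule found earlier was a strict randomisation; restricting the class of priors relaxes the worst case.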

Figure: MSE of the maximum likelihood estimator versus the James–Stein estimator.