Descent direction

In optimization, a descent direction is a vector $\mathbf{p} \in \mathbb{R}^n$ that points towards a local minimum $\mathbf{x}^*$ of an objective function $f : \mathbb{R}^n \to \mathbb{R}$.

Computing $\mathbf{x}^*$ by an iterative method, such as line search, defines a descent direction $\mathbf{p}_k \in \mathbb{R}^n$ at the $k$-th iterate to be any $\mathbf{p}_k$ such that $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$, where $\langle \cdot, \cdot \rangle$ denotes the inner product. The motivation for such an approach is that small steps along $\mathbf{p}_k$ guarantee that $f$ is reduced, by Taylor's theorem.
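Concretely, by Taylor's theorem, for a step size $\alpha > 0$,

$$f(\mathbf{x}_k + \alpha \mathbf{p}_k) = f(\mathbf{x}_k) + \alpha \langle \nabla f(\mathbf{x}_k), \mathbf{p}_k \rangle + o(\alpha),$$

so the condition $\langle \mathbf{p}_k, \nabla f(\mathbf{x}_k) \rangle < 0$ makes the first-order term negative, and hence $f(\mathbf{x}_k + \alpha \mathbf{p}_k) < f(\mathbf{x}_k)$ for all sufficiently small $\alpha$.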

Using this definition, the negative of a non-zero gradient is always a descent direction, as $\langle -\nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle = -\langle \nabla f(\mathbf{x}_k), \nabla f(\mathbf{x}_k) \rangle < 0$.
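As a quick numerical illustration of this condition, the following sketch checks $\langle \mathbf{p}, \nabla f(\mathbf{x}) \rangle < 0$ for the negative gradient of a simple quadratic; the objective $f(\mathbf{x}) = \mathbf{x}^\top \mathbf{x}$ and the helper name `is_descent_direction` are illustrative choices, not part of the definition.

```python
import numpy as np

def is_descent_direction(grad_f, x, p):
    """Check the defining condition <p, grad f(x)> < 0 at the point x."""
    return float(np.dot(p, grad_f(x))) < 0.0

# Illustrative objective f(x) = x^T x, whose gradient is 2x.
grad_f = lambda x: 2.0 * x

x = np.array([1.0, -2.0])
p = -grad_f(x)  # the negative gradient

print(is_descent_direction(grad_f, x, p))  # True: <-g, g> = -||g||^2 < 0
```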

Numerous methods exist to compute descent directions, all with differing merits, such as gradient descent or the conjugate gradient method.
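For instance, here is a minimal sketch of gradient descent paired with a simple backtracking line search; the step-halving rule, tolerances, and function names below are illustrative assumptions, not a canonical implementation.

```python
import numpy as np

def gradient_descent(f, grad_f, x0, tol=1e-8, max_iter=1000):
    """Minimize f by stepping along the descent direction
    p_k = -grad f(x_k), with a simple backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        p = -g  # descent direction: <p, g> = -||g||^2 < 0
        alpha = 1.0
        # Backtracking: halve the step until f decreases (bounded to avoid stalling).
        while f(x + alpha * p) >= f(x) and alpha > 1e-12:
            alpha *= 0.5
        x = x + alpha * p
    return x

# Minimize f(x) = (x1 - 3)^2 + (x2 + 1)^2, whose minimum is at (3, -1).
f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2
grad_f = lambda x: np.array([2 * (x[0] - 3), 2 * (x[1] + 1)])
print(gradient_descent(f, grad_f, [0.0, 0.0]))  # approx. [3., -1.]
```

Each step taken by this sketch reduces $f$ by construction, since the accepted step size satisfies $f(\mathbf{x}_k + \alpha \mathbf{p}_k) < f(\mathbf{x}_k)$.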

More generally, if $A$ is a positive definite matrix, then $\mathbf{p}_k = -A \nabla f(\mathbf{x}_k)$ is a descent direction at $\mathbf{x}_k$.[1] This generality is used in preconditioned gradient descent methods.
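As a sketch of the preconditioned case, assuming an illustrative diagonal preconditioner $A$ and a poorly scaled quadratic objective (both chosen here purely for demonstration):

```python
import numpy as np

# Preconditioned descent direction p_k = -A grad f(x_k), with A positive definite.
# Any positive definite A works, since <p, grad f> = -grad f^T A grad f < 0
# whenever the gradient is non-zero.
A = np.diag([1.0, 0.1])  # positive definite: positive diagonal entries

# Gradient of the ill-conditioned quadratic f(x) = x1^2 + 100 x2^2.
grad_f = lambda x: np.array([2 * x[0], 200 * x[1]])

x = np.array([1.0, 1.0])
g = grad_f(x)
p = -A @ g  # preconditioned direction

print(np.dot(p, g) < 0)  # True: p is a descent direction at x
```

Here the preconditioner rescales the steep coordinate, which is the usual motivation for choosing $A$ to approximate the inverse Hessian.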