In optimization, a gradient method is an algorithm for solving problems of the form min_{x ∈ ℝⁿ} f(x), where the search directions are defined by the gradient of the function at the current point.
Examples of gradient methods are gradient descent and the conjugate gradient method.
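The simplest gradient method, gradient descent, repeatedly steps against the gradient at the current point. A minimal sketch, assuming a user-supplied gradient function `grad_f` and a fixed step size `lr` (both names are illustrative, not from the original text):

```python
import numpy as np

def gradient_descent(grad_f, x0, lr=0.1, steps=100):
    """Minimize a differentiable function by stepping opposite its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        # Search direction is the negative gradient at the current point.
        x = x - lr * grad_f(x)
    return x

# Example: f(x) = (x0 - 3)^2 + (x1 + 1)^2 has gradient 2*(x - [3, -1])
# and a unique minimizer at [3, -1].
x_min = gradient_descent(lambda x: 2 * (x - np.array([3.0, -1.0])), [0.0, 0.0])
```

For this quadratic example the iterates contract toward the minimizer [3, -1] at a fixed rate; in general, convergence of gradient descent depends on the step size and the curvature of the objective.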