Gradient method

In optimization, a gradient method is an algorithm to solve problems of the form

$$\min_{x\in\mathbb R^n}\; f(x)$$
with the search directions defined by the gradient of the function at the current point. Examples of gradient methods are gradient descent and the conjugate gradient method.
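As an illustration, the simplest gradient method, gradient descent, iterates $x_{k+1} = x_k - \alpha \nabla f(x_k)$. The following is a minimal sketch, assuming a fixed step size and a user-supplied gradient function (the names `gradient_descent`, `grad`, and `step` are illustrative, not from any particular library):

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Minimize f by iterating x <- x - step * grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Example: f(x) = ||x - b||^2 has gradient 2*(x - b), so the
# iterates converge toward the minimizer b.
b = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: 2 * (x - b), x0=[0.0, 0.0])
```

Convergence of such a scheme depends on the step size: for a smooth convex $f$, a sufficiently small fixed step drives the iterates toward a minimizer, while too large a step can diverge.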