# Optimization Methods

## Steepest Descent

### Function of One Variable

Search for the minimum of a function $f(x)$ by taking steps proportional to the derivative $f'(x)$, in the direction opposite to it:

$$x_{n+1} = x_n - \gamma f'(x_n), \qquad \gamma > 0.$$
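The one-variable rule can be sketched as follows (a minimal illustration; the step size, tolerance, and example function are my own choices, not from the text):

```python
# One-variable steepest descent: iterate x <- x - gamma * f'(x).
def descend_1d(df, x0, gamma=0.1, tol=1e-10, max_iter=10000):
    x = x0
    for _ in range(max_iter):
        step = gamma * df(x)
        x -= step
        if abs(step) < tol:  # stop when the step becomes negligible
            break
    return x

# Example: f(x) = (x - 3)^2 has its minimum at x = 3, f'(x) = 2(x - 3).
xmin = descend_1d(lambda x: 2 * (x - 3), 0.0)
```

Note that convergence depends on the choice of $\gamma$: too large a step can overshoot and diverge, too small a step converges slowly.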

### Function of Several Variables

Here we just use the gradient instead of the derivative:

$$x_{n+1} = x_n - \gamma\,\nabla f(x_n).$$
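A sketch of the multivariable version, assuming the gradient is supplied as a callable (the quadratic example function is my own, not from the text):

```python
# Multivariable steepest descent: x <- x - gamma * grad(x).
def steepest_descent(grad, x0, gamma=0.1, tol=1e-8, max_iter=10000):
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        x_new = [xi - gamma * gi for xi, gi in zip(x, g)]
        # stop when the iterate barely moves
        if sum((a - b) ** 2 for a, b in zip(x, x_new)) ** 0.5 < tol:
            return x_new
        x = x_new
    return x

# Example: f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimum at (1, -3).
grad_f = lambda v: [2 * (v[0] - 1), 4 * (v[1] + 3)]
xmin = steepest_descent(grad_f, [0.0, 0.0])
```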

## Newton's Method

### Function of One Variable

In order to find the minimum of $f(x)$, solve the equation $f'(x) = 0$ using the first-order approximation:

$$f'(x + \Delta x) \approx f'(x) + f''(x)\,\Delta x$$

$$\Longrightarrow\quad f'(x) + f''(x)\,\Delta x = 0.$$

Therefore,

$$\Delta x = -\frac{f'(x)}{f''(x)}$$

$$\Longrightarrow\quad x_{n+1} = x_n - \frac{f'(x_n)}{f''(x_n)}.$$
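The iteration above can be sketched as follows (the example function and its derivatives are my own illustration):

```python
# Newton's method in one variable: x <- x - f'(x) / f''(x).
def newton_1d(df, d2f, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = x^4 - 3x^2 has a local minimum at x = sqrt(3/2),
# with f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6.
xmin = newton_1d(lambda x: 4 * x**3 - 6 * x,
                 lambda x: 12 * x**2 - 6,
                 2.0)
```

Note that the iteration only finds a stationary point; whether it is a minimum depends on $f''$ and on the starting point.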

### Function of Several Variables

The case is similar to the one above, only we use the gradient $\nabla f(x)$ instead of the derivative, and the Hessian $H(x)$, the matrix of second derivatives (a kind of "gradient of the gradient"), instead of the second derivative:

$$x_{n+1} = x_n - H(x_n)^{-1}\,\nabla f(x_n).$$
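A sketch of the multivariable Newton step, assuming NumPy is available (the quadratic test function is my own choice; for a quadratic, the method converges in a single step):

```python
import numpy as np

# Multivariable Newton's method: solve H(x) s = grad(x), then x <- x - s.
def newton_nd(grad, hess, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Example: f(x, y) = x^2 + x*y + 2*y^2 - 4*y,
# grad = (2x + y, x + 4y - 4), constant Hessian [[2, 1], [1, 4]].
grad = lambda v: np.array([2 * v[0] + v[1], v[0] + 4 * v[1] - 4])
hess = lambda v: np.array([[2.0, 1.0], [1.0, 4.0]])
xmin = newton_nd(grad, hess, [5.0, 5.0])
```

Solving the linear system $H s = \nabla f$ is preferable to explicitly inverting $H$, which is both slower and less numerically stable.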

## Gauss-Newton Method

Suppose there is a vector-valued function $r(x) = (r_1(x), \ldots, r_m(x))^T$ and we need to minimize the value of $\|r(x)\|^2 = \sum_i r_i(x)^2$.

Linearizing $r$, we get that for each $i$

$$r_i(x + \delta) \approx r_i(x) + \sum_j \frac{\partial r_i(x)}{\partial x_j}\,\delta_j,$$

which in vector form is:

$$r(x + \delta) \approx r(x) + J(x)\,\delta,$$

where $J(x)$ is the Jacobian matrix of $r$, with $J_{ij} = \partial r_i / \partial x_j$.

To minimize $\|r(x + \delta)\|^2$ we solve $\nabla_\delta \|r(x + \delta)\|^2 = 0$ using the obtained approximation:

$$J^T J\,\delta = -J^T r(x).$$

Expressing $\delta$ we get the step of the algorithm:

$$x_{n+1} = x_n - \bigl(J^T J\bigr)^{-1} J^T r(x_n).$$
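A sketch of the Gauss-Newton step for a small least-squares fit, assuming NumPy is available (the exponential model, the synthetic data, and the starting point are my own illustration, not from the text):

```python
import numpy as np

# Gauss-Newton: solve (J^T J) s = J^T r(x), then x <- x - s.
def gauss_newton(r, J, x0, tol=1e-10, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Jx, rx = J(x), r(x)
        step = np.linalg.solve(Jx.T @ Jx, Jx.T @ rx)
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Fit y = a * exp(b * t) to data generated exactly with a = 2, b = 0.5.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(0.5 * t)
r = lambda p: p[0] * np.exp(p[1] * t) - y                  # residual vector
J = lambda p: np.column_stack([np.exp(p[1] * t),           # dr/da
                               p[0] * t * np.exp(p[1] * t)])  # dr/db
params = gauss_newton(r, J, [1.8, 0.45])
```

The starting point matters: undamped Gauss-Newton can diverge from a poor initial guess, which is why practical solvers add damping (as in Levenberg-Marquardt).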

*© Konstantin Tretyakov*

Converted by *Mathematica*
March 13, 2003

Original file is available *here*.