## Monday, February 1, 2010

Now we turn to the minimization of a function $f(\mathbf{x})$ of $n$ variables, where $\mathbf{x} = (x_1, x_2, \dots, x_n)$ and the partial derivatives of $f$ are accessible.

Let $f(\mathbf{x})$ be a function of $\mathbf{x} = (x_1, x_2, \dots, x_n)$ such that $\partial f / \partial x_k$ exists for $k = 1, 2, \dots, n$. The gradient of $f$, denoted by $\nabla f(\mathbf{x})$, is the vector

(1) $\nabla f(\mathbf{x}) = \left( \dfrac{\partial f}{\partial x_1}, \dfrac{\partial f}{\partial x_2}, \dots, \dfrac{\partial f}{\partial x_n} \right)$.
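When analytic partial derivatives are tedious to write out, the gradient in (1) can be approximated componentwise by central differences. The sketch below is illustrative (the function names and step size are our own, not part of the text):

```python
import numpy as np

def grad(f, x, h=1e-6):
    """Approximate the gradient (1) by central differences,
    one coordinate direction at a time."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for k in range(x.size):
        e = np.zeros_like(x)
        e[k] = h
        g[k] = (f(x + e) - f(x - e)) / (2 * h)
    return g

# Example: f(x, y) = x**2 + 3*y**2 has gradient (2x, 6y).
f = lambda x: x[0]**2 + 3*x[1]**2
print(grad(f, [1.0, 2.0]))   # approximately [2., 12.]
```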


Recall that the gradient vector in (1) points locally in the direction of the greatest rate of increase of $f(\mathbf{x})$. Hence $-\nabla f(\mathbf{x})$ points locally in the direction of greatest decrease of $f(\mathbf{x})$. Start at the point $P_0$ and search along the line through $P_0$ in the direction $\mathbf{S}_0 = -\nabla f(P_0) / \lVert \nabla f(P_0) \rVert$. You will arrive at a point $P_1$, where a local minimum occurs when the point $P$ is constrained to lie on the line $P = P_0 + t\,\mathbf{S}_0$. Since partial derivatives are accessible, the minimization process can be executed using either the quadratic or cubic approximation method.
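The quadratic approximation method mentioned above can be sketched in a few lines: fit a parabola through three trial points on the search line and return the abscissa of its vertex as the estimate of the minimizer. The helper name and trial points below are illustrative:

```python
def quadratic_min(phi, t0, t1, t2):
    """One step of the quadratic approximation method: interpolate
    (t0, phi(t0)), (t1, phi(t1)), (t2, phi(t2)) with a parabola and
    return the abscissa of its vertex."""
    y0, y1, y2 = phi(t0), phi(t1), phi(t2)
    # Vertex of the interpolating parabola (derivative of the
    # Lagrange form set to zero).
    num = (t1 - t0)**2 * (y1 - y2) - (t1 - t2)**2 * (y1 - y0)
    den = (t1 - t0) * (y1 - y2) - (t1 - t2) * (y1 - y0)
    return t1 - 0.5 * num / den

# For a true parabola the vertex is found exactly in one step:
print(quadratic_min(lambda t: (t - 0.7)**2, 0.0, 0.5, 1.0))   # 0.7
```

For a non-quadratic $\Phi(t)$ the returned value is only an estimate, and the step is repeated with an updated bracket of three points.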

Next we compute $\nabla f(P_1)$ and move in the search direction $\mathbf{S}_1 = -\nabla f(P_1) / \lVert \nabla f(P_1) \rVert$. You will come to $P_2$, where a local minimum occurs when $P$ is constrained to lie on the line $P = P_1 + t\,\mathbf{S}_1$. Iteration will produce a sequence $\{P_k\}$ of points with the property

$f(P_0) > f(P_1) > \dots > f(P_k) > \dots$

If $\lim_{k \to \infty} P_k = P$, then $f(P)$ will be a local minimum of $f(\mathbf{x})$.

Suppose that $P_k$ has been obtained.

(i) Evaluate the gradient vector $\nabla f(P_k)$.

(ii) Compute the search direction $\mathbf{S}_k = -\nabla f(P_k) / \lVert \nabla f(P_k) \rVert$.

(iii) Perform a single-parameter minimization of $\Phi(t) = f(P_k + t\,\mathbf{S}_k)$ on the interval $[0, b]$, where $b$ is large.
This will produce a value $t = h_{\min}$ where a local minimum for $\Phi(t)$ occurs. The relation $\Phi(h_{\min}) = f(P_k + h_{\min}\,\mathbf{S}_k)$
shows that this is a minimum for $f(\mathbf{x})$ along the search line $P = P_k + t\,\mathbf{S}_k$.

(iv) Construct the next point $P_{k+1} = P_k + h_{\min}\,\mathbf{S}_k$.

(v) Perform the termination test for minimization, i.e.
are the function values $f(P_k)$ and $f(P_{k+1})$ sufficiently close and the distance $\lVert P_{k+1} - P_k \rVert$ small enough?

Repeat the process.
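Steps (i)-(v) above can be sketched as a single Python loop. This is a minimal illustration, not a polished implementation: the golden-section line search stands in for the quadratic or cubic approximation methods, and the names, interval bound $b = 10$, and tolerances are our own choices:

```python
import numpy as np

def steepest_descent(f, grad_f, p0, tol=1e-8, max_iter=200):
    """Gradient method: repeatedly minimize f along the direction of
    steepest descent until points and function values stop changing."""
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(p)                        # (i) gradient at P_k
        norm = np.linalg.norm(g)
        if norm == 0.0:                      # stationary point reached
            break
        s = -g / norm                        # (ii) search direction S_k
        # (iii) single-parameter minimization of Phi(t) = f(P_k + t S_k)
        # on [0, b] with b = 10 (assumed large enough for this example).
        t = golden_section(lambda t: f(p + t * s), 0.0, 10.0)
        p_next = p + t * s                   # (iv) next point P_{k+1}
        # (v) termination test on function values and distance
        if abs(f(p_next) - f(p)) < tol and np.linalg.norm(p_next - p) < tol:
            return p_next
        p = p_next                           # repeat the process
    return p

def golden_section(phi, a, b, tol=1e-10):
    """Bracketing line search used in step (iii)."""
    r = (np.sqrt(5) - 1) / 2
    c, d = b - r * (b - a), a + r * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):
            b, d = d, c
            c = b - r * (b - a)
        else:
            a, c = c, d
            d = a + r * (b - a)
    return (a + b) / 2

# Example: f(x, y) = (x - 1)**2 + 2*(y + 3)**2 has its minimum at (1, -3).
f = lambda p: (p[0] - 1)**2 + 2*(p[1] + 3)**2
gf = lambda p: np.array([2*(p[0] - 1), 4*(p[1] + 3)])
print(steepest_descent(f, gf, [0.0, 0.0]))   # approximately [1., -3.]
```

Even on this well-behaved quadratic the iterates zigzag toward the minimum, which is the characteristic (slow but reliable) behavior of the gradient method.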