Monday, February 1, 2010

Gradient and Newton's Methods

Now we turn to the minimization of a function $f(\mathbf{x})$ of $n$ variables, where $\mathbf{x} = (x_1, x_2, \ldots, x_n)$ and the partial derivatives of $f$ are accessible.

Steepest Descent or Gradient Method

Let $f(\mathbf{X})$ be a function of $\mathbf{X} = (x_1, x_2, \ldots, x_n)$ such that $\partial f / \partial x_i$ exists for $i = 1, 2, \ldots, n$. The gradient of $f(\mathbf{X})$, denoted by $\nabla f(\mathbf{X})$, is the vector

(1) $\nabla f(\mathbf{X}) = \left( \dfrac{\partial f}{\partial x_1}, \dfrac{\partial f}{\partial x_2}, \ldots, \dfrac{\partial f}{\partial x_n} \right)$.
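For instance, the gradient in (1) can be coded directly. A minimal Python sketch, using the sample function $f(x, y) = x^2 - 4x + y^2 - y - xy$ (our own choice for illustration, not taken from the text):

```python
import numpy as np

def f(X):
    # Sample objective (an assumed example): f(x, y) = x^2 - 4x + y^2 - y - xy
    x, y = X
    return x**2 - 4*x + y**2 - y - x*y

def grad_f(X):
    # Gradient from equation (1): the vector of partial derivatives
    x, y = X
    return np.array([2*x - 4 - y, 2*y - 1 - x])

P0 = np.array([0.0, 0.0])
print(grad_f(P0))   # [-4. -1.] -> direction of greatest *increase* at P0
```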


Recall that the gradient vector in (1) points locally in the direction of the greatest rate of increase of $f(\mathbf{X})$. Hence $-\nabla f(\mathbf{X})$ points locally in the direction of greatest decrease of $f(\mathbf{X})$. Start at the point $P_0$ and search along the line through $P_0$ in the direction $\mathbf{S}_0 = -\nabla f(P_0) / \|\nabla f(P_0)\|$. You will arrive at a point $P_1$, where a local minimum occurs when the point $P$ is constrained to lie on the line $P = P_0 + t\,\mathbf{S}_0$. Since partial derivatives are accessible, the minimization process can be executed using either the quadratic or cubic approximation method.
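A sketch of this first step in Python, reusing the sample f and grad_f above. SciPy's bounded scalar minimizer stands in here for the quadratic or cubic approximation method, and the search bound b = 10 is an assumption:

```python
from scipy.optimize import minimize_scalar
import numpy as np

P0 = np.array([0.0, 0.0])
g = grad_f(P0)
S0 = -g / np.linalg.norm(g)            # unit vector of greatest decrease

# Minimize phi(t) = f(P0 + t*S0) over t in [0, b]
phi = lambda t: f(P0 + t * S0)
t_min = minimize_scalar(phi, bounds=(0.0, 10.0), method='bounded').x
P1 = P0 + t_min * S0                   # local minimum along the search line
```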

Next we compute $\nabla f(P_1)$ and move in the search direction $\mathbf{S}_1 = -\nabla f(P_1) / \|\nabla f(P_1)\|$. You will come to $P_2$, where a local minimum occurs when $P$ is constrained to lie on the line $P = P_1 + t\,\mathbf{S}_1$. Iteration will produce a sequence $\{P_k\}$ of points with the property

$$f(P_0) > f(P_1) > f(P_2) > \cdots > f(P_k) > \cdots$$

If $\lim_{k \to \infty} P_k = P$, then $f(P)$ will be a local minimum of $f(\mathbf{X})$.

Outline of the Gradient Method

Suppose that $P_k$ has been obtained. (A Python sketch of the full iteration appears after this outline.)

(i) Evaluate the gradient vector $\nabla f(P_k)$.

(ii) Compute the search direction $\mathbf{S}_k = -\nabla f(P_k) / \|\nabla f(P_k)\|$.

(iii) Perform a single-parameter minimization of $\Phi(t) = f(P_k + t\,\mathbf{S}_k)$ on the interval $[0, b]$, where $b$ is large. This will produce a value $t = h_{\min}$ where a local minimum for $\Phi(t)$ occurs. The relation $\Phi(h_{\min}) = f(P_k + h_{\min}\mathbf{S}_k)$ shows that this is a minimum for $f(\mathbf{X})$ along the search line $P = P_k + t\,\mathbf{S}_k$.

(iv) Construct the next point $P_{k+1} = P_k + h_{\min}\mathbf{S}_k$.

(v) Perform the termination test for minimization, i.e. are the function values $f(P_k)$ and $f(P_{k+1})$ sufficiently close and the distance $\|P_{k+1} - P_k\|$ small enough?

Repeat the process.
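The outline translates directly into code. Below is a minimal Python sketch of steps (i) through (v), again reusing the sample f and grad_f from above; the bound b, the tolerance tol, and the iteration cap max_iter are assumed parameters, not prescribed by the text:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def gradient_method(f, grad_f, P0, b=10.0, tol=1e-8, max_iter=200):
    """Steepest descent following steps (i)-(v) of the outline.

    b, tol, and max_iter are assumed parameters, not from the text.
    """
    P = np.asarray(P0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(P)                              # (i) gradient at P_k
        norm_g = np.linalg.norm(g)
        if norm_g == 0.0:                          # stationary point: stop
            break
        S = -g / norm_g                            # (ii) search direction
        # (iii) single-parameter minimization of Phi(t) = f(P_k + t*S_k)
        t_min = minimize_scalar(lambda t: f(P + t * S),
                                bounds=(0.0, b), method='bounded').x
        P_next = P + t_min * S                     # (iv) next point P_{k+1}
        # (v) termination test on function values and distance
        if (abs(f(P_next) - f(P)) < tol
                and np.linalg.norm(P_next - P) < tol):
            P = P_next
            break
        P = P_next                                 # repeat the process
    return P

P_min = gradient_method(f, grad_f, [0.0, 0.0])
print(P_min, f(P_min))
```

For the sample quadratic used earlier, the iterates should approach the true minimizer $(3, 2)$, where $f(3, 2) = -7$.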

