Now we turn to the minimization of a function $f(x_1, x_2, \dots, x_n)$ of $n$ variables, where the partial derivatives of $f$ are accessible. The Newton search method will turn out to have a familiar form. For illustration purposes we emphasize the two-dimensional case, when $n = 2$. The extension to $n$ dimensions is discussed in the hyperlink.
Assume that $f_1(x,y)$ and $f_2(x,y)$ are functions of two variables; their Jacobian matrix is
$$J(x,y) = \begin{pmatrix} \dfrac{\partial f_1}{\partial x} & \dfrac{\partial f_1}{\partial y} \\[2mm] \dfrac{\partial f_2}{\partial x} & \dfrac{\partial f_2}{\partial y} \end{pmatrix}.$$
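When closed-form derivatives are inconvenient, the Jacobian can be approximated numerically. The following sketch (not part of the original text; it assumes NumPy and a hypothetical step size `h`) uses central differences for each entry:

```python
import numpy as np

def jacobian(funcs, x, y, h=1e-6):
    """Approximate the 2x2 Jacobian of (f1, f2) at (x, y)
    using central differences; h is an illustrative step size."""
    J = np.empty((2, 2))
    for i, f in enumerate(funcs):
        J[i, 0] = (f(x + h, y) - f(x - h, y)) / (2 * h)  # d f_i / dx
        J[i, 1] = (f(x, y + h) - f(x, y - h)) / (2 * h)  # d f_i / dy
    return J

# Example: f1 = x^2 + y, f2 = x*y, so J = [[2x, 1], [y, x]]
f1 = lambda x, y: x**2 + y
f2 = lambda x, y: x * y
print(jacobian((f1, f2), 1.0, 2.0))
```

At $(x,y) = (1,2)$ the exact Jacobian is $\begin{pmatrix} 2 & 1 \\ 2 & 1 \end{pmatrix}$, which the central-difference approximation matches to several digits.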
Assume that $f(x,y)$ is a function of two variables and has partial derivatives up to order two. The Hessian matrix is defined as follows:
$$H(x,y) = \begin{pmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x\,\partial y} \\[2mm] \dfrac{\partial^2 f}{\partial y\,\partial x} & \dfrac{\partial^2 f}{\partial y^2} \end{pmatrix}.$$
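The Hessian can likewise be approximated with second-order central differences. This is a minimal sketch, not from the original text; the step size `h` and the test function are illustrative assumptions:

```python
import numpy as np

def hessian(f, x, y, h=1e-4):
    """Approximate the 2x2 Hessian of f at (x, y) with central
    second differences; h is a heuristic step size."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    # Mixed partial via the standard four-point cross stencil
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return np.array([[fxx, fxy], [fxy, fyy]])

# Example: f(x, y) = x^3 + x*y^2, so H = [[6x, 2y], [2y, 2x]]
# and H(1, 2) = [[6, 4], [4, 2]]
f = lambda x, y: x**3 + x * y**2
print(hessian(f, 1.0, 2.0))
```

Note that the sketch fills both off-diagonal entries with the same value `fxy`, anticipating the symmetry result stated below.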
Lemma 1. For $f(x,y)$ the Hessian matrix $H(x,y)$ is the Jacobian matrix for the two functions $f_1(x,y) = \dfrac{\partial f}{\partial x}$ and $f_2(x,y) = \dfrac{\partial f}{\partial y}$, i.e.
$$H(x,y) = \begin{pmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x\,\partial y} \\[2mm] \dfrac{\partial^2 f}{\partial y\,\partial x} & \dfrac{\partial^2 f}{\partial y^2} \end{pmatrix}.$$
Lemma 2. If the second-order partial derivatives of $f(x,y)$ are continuous, then the Hessian matrix is symmetric.
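The symmetry claimed in Lemma 2 (equality of the mixed partials $f_{xy}$ and $f_{yx}$) can be checked numerically. The sketch below is not from the original text; the function, the evaluation point, and the step size are illustrative choices:

```python
import math

def mixed_partials(f, x, y, h=1e-4):
    """Approximate f_xy and f_yx at (x, y) by nesting central
    differences in the two possible orders; h is illustrative."""
    fx = lambda u, v: (f(u + h, v) - f(u - h, v)) / (2 * h)  # f_x
    fy = lambda u, v: (f(u, v + h) - f(u, v - h)) / (2 * h)  # f_y
    fxy = (fx(x, y + h) - fx(x, y - h)) / (2 * h)  # d/dy of f_x
    fyx = (fy(x + h, y) - fy(x - h, y)) / (2 * h)  # d/dx of f_y
    return fxy, fyx

# f(x, y) = e^x sin(y) has continuous second partials, and
# f_xy = f_yx = e^x cos(y), so the two estimates should agree.
f = lambda x, y: math.exp(x) * math.sin(y)
fxy, fyx = mixed_partials(f, 0.5, 1.0)
print(abs(fxy - fyx))
```

Both orders of differentiation sample the same four-point cross stencil, so the two estimates agree to within floating-point roundoff, consistent with the lemma.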