Quiz Chapter 09.04: Multidimensional Gradient Search Method

MULTIPLE CHOICE TEST
MULTIDIMENSIONAL GRADIENT SEARCH METHOD
OPTIMIZATION

Pick the most appropriate answer.

1. Which of the following statements is incorrect?

(A) Direct search methods are useful when the optimization function is not differentiable.
(B) The gradient of f(x,y) is a vector pointing in the direction of the steepest slope at that point.
(C) The Hessian is the Jacobian matrix of second-order partial derivatives of a function.
(D) The second derivative of the optimization function is used to determine if we have reached an optimal point.

2. An initial estimate of an optimal solution is given to be used in conjunction with the steepest ascent method to determine the maximum of the function. Which of the following statements is correct?

(A) The function to be optimized must be differentiable.
(B) If the initial estimate is different from the optimal solution, then the magnitude of the gradient is nonzero.
(C) As more iterations are performed, the function values of the solutions at the end of each subsequent iteration must be increasing.
(D) All three statements are correct.

3. What are the gradient and the determinant of the Hessian of the function f(x,y) = x^{2}y^{2} at its global optimum?

(A) \nabla f = 0i + 0j and |H| > 0
(B) \nabla f = 0i + 0j and |H| = 0
(C) \nabla f = 1i + 1j and |H| \leq 0
(D) \nabla f = 1i + 1j and |H| = 0

4. Determine the gradient of the function x^{2} - 2y^{2} - 4y + 6 at the point (0, 0).

(A) \nabla f = 2i - 4j
(B) \nabla f = 0i - 4j
(C) \nabla f = 0i + 0j
(D) \nabla f = -4i - 4j

5. Determine the determinant of the Hessian of the function x^{2} - 2y^{2} - 4y + 6 at the point (0, 0).

(A) 2
(B) -4
(C) 0
(D) -8

6. Determine the minimum of the function f(x,y) = x^{2} + y^{2}. Use the point (2, 1) as the initial estimate of the optimal solution. Conduct one iteration.

(A) (2, 1)
(B) (-6, -3)
(C) (0, 0)
(D) (1, -1)
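
Questions 3 through 5 come down to computing a gradient and a Hessian determinant at a point. For readers who want to check their hand arithmetic, the following is a minimal verification sketch (an addition for checking work, not part of the original quiz; the choice of SymPy is an assumption of convenience):

```python
# Verification sketch (not part of the quiz): checks the arithmetic
# in questions 3-5 symbolically with SymPy.
import sympy as sp

x, y = sp.symbols('x y')

# Question 3: f(x, y) = x^2 * y^2, whose global minimum value 0
# occurs where x = 0 or y = 0, including the origin.
f3 = x**2 * y**2
grad3 = sp.Matrix([sp.diff(f3, x), sp.diff(f3, y)])
print(grad3.subs({x: 0, y: 0}).T)                          # Matrix([[0, 0]])  -> grad f = 0i + 0j
print(sp.hessian(f3, (x, y)).det().subs({x: 0, y: 0}))     # 0                 -> |H| = 0

# Questions 4 and 5: f(x, y) = x^2 - 2y^2 - 4y + 6 at (0, 0).
f45 = x**2 - 2*y**2 - 4*y + 6
grad45 = sp.Matrix([sp.diff(f45, x), sp.diff(f45, y)])
print(grad45.subs({x: 0, y: 0}).T)                         # Matrix([[0, -4]]) -> grad f = 0i - 4j
print(sp.hessian(f45, (x, y)).det())                       # -8 (the Hessian here is constant)
```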
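
Question 6 is a single gradient-search step toward a minimum: at (2, 1) the gradient of f is \nabla f = 4i + 2j, so the search moves along the negative gradient direction, and the step size follows from minimizing f along that line. Here is a minimal sketch of that one iteration, again assuming SymPy (not part of the original quiz):

```python
# One steepest-descent iteration for question 6 (verification sketch).
import sympy as sp

x, y, h = sp.symbols('x y h')

f = x**2 + y**2
x0, y0 = 2, 1

# Descent direction: the negative gradient evaluated at the current estimate.
dx = -sp.diff(f, x).subs({x: x0, y: y0})   # -4
dy = -sp.diff(f, y).subs({x: x0, y: y0})   # -2

# Choose the step size h that minimizes f along the line (x0 + h*dx, y0 + h*dy).
g = f.subs({x: x0 + h*dx, y: y0 + h*dy})
h_opt = sp.solve(sp.diff(g, h), h)[0]      # h = 1/2

print((x0 + h_opt*dx, y0 + h_opt*dy))      # (0, 0)
```

Because the contours of f(x,y) = x^{2} + y^{2} are circles centered on the minimum, the gradient at any point aims straight at the origin, so a single exact line search lands on (0, 0).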