Q1. Which of the following statements is incorrect?
(A) Direct search methods are useful when the optimization function is not differentiable.
(B) The gradient of f(x,y) is a vector pointing in the direction of the steepest slope at that point.
(C) The Hessian is the Jacobian matrix of second-order partial derivatives of a function.
(D) The second derivative of the optimization function is used to determine if we have reached an optimal point.

Q2. An initial estimate of an optimal solution is given to be used in conjunction with the steepest ascent method to determine the maximum of the function. Which of the following statements is correct?
(A) The function to be optimized must be differentiable.
(B) If the initial estimate is different from the optimal solution, then the magnitude of the gradient is nonzero.
(C) As more iterations are performed, the function values of the solutions at the end of each subsequent iteration must be increasing.
(D) All 3 statements are correct.

Q3. What are the gradient and the determinant of the Hessian of the function f(x,y) = x²y² at its global optimum?
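As a check on Q3, the gradient and Hessian of f(x,y) = x²y² can be written out by hand and evaluated at the origin (taking the global optimum to be at (0, 0), where f attains its minimum value of 0). This is an illustrative sketch, not part of the quiz:

```python
def f(x, y):
    return x**2 * y**2

# Analytic gradient of f: (df/dx, df/dy) = (2xy^2, 2x^2y)
def grad(x, y):
    return (2 * x * y**2, 2 * x**2 * y)

# Analytic Hessian entries: fxx = 2y^2, fyy = 2x^2, fxy = 4xy,
# so det(H) = fxx * fyy - fxy^2
def hessian_det(x, y):
    fxx, fyy, fxy = 2 * y**2, 2 * x**2, 4 * x * y
    return fxx * fyy - fxy**2

# At the optimum (0, 0) both first partials vanish and det(H) is 0,
# i.e. the second-derivative test is inconclusive there.
print(grad(0.0, 0.0))         # → (0.0, 0.0)
print(hessian_det(0.0, 0.0))  # → 0.0
```

Note that a zero Hessian determinant means the second-derivative test alone cannot classify the point; here f(x,y) ≥ 0 everywhere confirms the minimum directly.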
Q4. Determine the gradient of the function
Q5. Determine the determinant of the Hessian of the function
(A) 2
(B) -4
(C) 0
(D) -8
Q6. Determine the minimum of the function
(A) (2,1)
(B) (-6,-3)
(C) (0,0)
(D) (1,-1)
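The minimum-finding idea behind Q6 (and the iterative method in Q2) can be sketched with a simple steepest-descent loop. The function below, f(x,y) = (x-2)² + (y-3)² with minimum at (2, 3), is a hypothetical stand-in since the quiz's actual function is not shown here:

```python
# Hypothetical example function: f(x, y) = (x - 2)^2 + (y - 3)^2,
# whose unique minimum is at (2, 3). Not the quiz's function.
def grad_f(x, y):
    return (2 * (x - 2), 2 * (y - 3))

# Steepest descent with a fixed step size: repeatedly move
# opposite the gradient, the direction of steepest decrease.
def steepest_descent(x, y, step=0.1, iters=200):
    for _ in range(iters):
        gx, gy = grad_f(x, y)
        x, y = x - step * gx, y - step * gy
    return x, y

x, y = steepest_descent(0.0, 0.0)
print(round(x, 6), round(y, 6))  # → 2.0 3.0
```

At the minimum the gradient vanishes, which is why checking candidate points against the first-order condition (and then the Hessian) answers questions like Q6 analytically.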
Copyright: University of South Florida, 4202 E Fowler Ave, Tampa, FL 33620-5350. All Rights Reserved. Questions, suggestions or comments, contact kaw@eng.usf.edu. This material is based upon work supported by the National Science Foundation under Grant#