— Many deterministic algorithms for constrained optimization require the first-order derivatives, or gradient vectors, of the objective and constraint functions to determine the next feasible direction along which the search should progress. Although some methods, such as sequential quadratic programming (SQP), also require the second-order derivatives, or Hessian matrices, these can be approximated from first-order information, making the gradients central to deterministic algorithms for solving constrained optimization problems. In this paper, two ways of obtaining the gradients are compared within the framework of a simple memetic algorithm (MA) that combines a genetic algorithm (GA) with SQP. Although finite-difference gradients are simple and straightforward to compute, a faster convergence rate can be achieved when analytical gradients are available. The savings in the number of function evaluations as well as the am...
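As a minimal illustration of the two ways of supplying gradients to an SQP-type local solver (a sketch only, not the memetic algorithm described in the paper), the following uses SciPy's SLSQP method on an assumed objective, constraint, and starting point chosen purely for demonstration; when the jac argument is omitted, the solver falls back to finite-difference gradients, whereas supplying it passes the analytical gradient.

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Illustrative objective (Rosenbrock-like); not taken from the paper
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def objective_grad(x):
    # Analytical gradient of the illustrative objective above
    return np.array([
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
        200.0 * (x[1] - x[0]**2),
    ])

constraint = {
    "type": "ineq",                      # inequality constraint g(x) >= 0
    "fun": lambda x: 1.5 - x[0] - x[1],  # illustrative linear constraint
    "jac": lambda x: np.array([-1.0, -1.0]),
}

x0 = np.array([-1.2, 1.0])  # illustrative starting point

# Finite-difference gradients: jac omitted, so SLSQP approximates the gradient
# numerically, spending extra objective evaluations at every iteration.
res_fd = minimize(objective, x0, method="SLSQP", constraints=[constraint])

# Analytical gradients: jac supplied explicitly, typically reducing the number
# of function evaluations required to converge.
res_an = minimize(objective, x0, method="SLSQP",
                  jac=objective_grad, constraints=[constraint])

print("finite-difference:", res_fd.nfev, "function evaluations")
print("analytical:       ", res_an.nfev, "function evaluations")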