This paper reviews different gradient-based search schemes and sources of gradient information, comparing their availability, precision, and computational complexity, and explores the benefits of using gradient information within a memetic framework for continuous parameter optimization, an approach labeled here as Memetic Gradient Search. In particular, we consider a quasi-Newton method with analytical gradients, the same method with finite-difference gradients, and simultaneous perturbation stochastic approximation (SPSA) as the local search procedures. An empirical study of the impact of gradient information shows that Memetic Gradient Search outperforms the traditional genetic algorithm (GA) and that precise analytical gradients bring considerable benefit to gradient-based local search (LS) schemes. Although gradient-based searches can become trapped in local optima, the memetic gradient searches still converged faster than the conventional GA.
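As a minimal illustration of the three gradient sources named above, the Python sketch below contrasts a quasi-Newton (BFGS) local search driven by an analytical gradient, the same search driven by finite-difference gradients, and a basic SPSA loop. The sphere test function, gain constants, and iteration counts are illustrative assumptions and are not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def sphere(x):
    """Smooth test function with a known analytical gradient."""
    return float(np.sum(x ** 2))

def sphere_grad(x):
    """Analytical gradient of the sphere function."""
    return 2.0 * x

x0 = np.array([3.0, -2.0, 1.5])

# Quasi-Newton (BFGS) local search with the analytical gradient.
res_exact = minimize(sphere, x0, jac=sphere_grad, method="BFGS")

# Quasi-Newton (BFGS) with finite differencing: omitting `jac`
# makes SciPy approximate the gradient numerically, at the cost
# of extra function evaluations per step.
res_fd = minimize(sphere, x0, method="BFGS")

def spsa(f, x, iters=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """SPSA: perturb all coordinates simultaneously along a random
    Rademacher direction, so each gradient estimate costs only two
    function evaluations regardless of problem dimension."""
    rng = np.random.default_rng(seed)
    x = x.astype(float).copy()
    for k in range(iters):
        ak = a / (k + 1) ** alpha              # decaying step-size gain
        ck = c / (k + 1) ** gamma              # decaying perturbation gain
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        # Two-sided difference along delta; since delta_i = +/-1,
        # dividing by delta_i equals multiplying by delta_i.
        ghat = (f(x + ck * delta) - f(x - ck * delta)) / (2.0 * ck) * delta
        x -= ak * ghat
    return x

x_spsa = spsa(sphere, x0)
```

In a memetic setting, any of these three routines would refine individuals produced by the GA; the trade-off sketched here (exact gradients converge fastest, SPSA is cheapest per step in high dimensions) mirrors the comparison reported in the paper.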