Multi-objective evolutionary algorithms (MOEAs) have obtained promising results on a variety of numerical multi-objective optimization problems. Their combination with gradient-based local search operators has, however, been studied only to a limited extent. In the single-objective case it is known that the additional use of gradient information can be beneficial. In this paper we provide an analytical parametric description of the set of all non-dominated (i.e. most promising) directions in which a solution can be moved such that its objectives either improve or remain the same. Moreover, the parameters describing this set can be computed efficiently using only the gradients of the individual objectives. We use this result to hybridize an existing MOEA with a local search operator that moves a solution in a randomly chosen non-dominated improving direction. We test the resulting algorithm on a few well-known benchmark problems and compare the results with those of the same MOEA without this gradient-based local search operator.
Peter A. N. Bosman, Edwin D. de Jong
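
To make the general idea concrete, the sketch below illustrates, for the bi-objective case, how a candidate improving direction can be formed from a convex combination of the negated, normalized gradients of the individual objectives, and how a local search step might only be accepted if neither objective worsens. This is a minimal illustration under simplifying assumptions, not the paper's exact parametric description of the full set of non-dominated improving directions; all function names and the acceptance rule are hypothetical.

```python
import numpy as np

def improving_direction(grad_f0, grad_f1, rng=None):
    """Sample a candidate improving direction for two objectives by taking a
    random convex combination of the negated, normalized gradients.
    (Illustrative simplification of the idea described in the abstract.)"""
    rng = np.random.default_rng() if rng is None else rng
    beta = rng.uniform()                               # random mixing weight in [0, 1]
    g0 = grad_f0 / (np.linalg.norm(grad_f0) + 1e-12)   # normalize each gradient
    g1 = grad_f1 / (np.linalg.norm(grad_f1) + 1e-12)
    d = -(beta * g0 + (1.0 - beta) * g1)               # combined descent direction
    norm = np.linalg.norm(d)
    return d / norm if norm > 1e-12 else d             # near-zero close to local Pareto optimality

def local_search_step(x, f, grads, step=1e-2, rng=None):
    """Move x along a sampled direction and keep the move only if no objective
    gets worse (a hypothetical acceptance rule for illustration)."""
    d = improving_direction(*grads(x), rng=rng)
    x_new = x + step * d
    old, new = f(x), f(x_new)
    return x_new if all(n <= o for n, o in zip(new, old)) else x

# Toy usage: two quadratic objectives with closed-form gradients.
f  = lambda x: (float(np.sum((x - 1.0) ** 2)), float(np.sum((x + 1.0) ** 2)))
gr = lambda x: (2.0 * (x - 1.0), 2.0 * (x + 1.0))
x = np.array([0.3, -0.7])
for _ in range(100):
    x = local_search_step(x, f, gr, step=0.05)
```

In a hybrid MOEA such a step would be applied to selected population members, with the random choice of the mixing weight spreading local search effort over different trade-offs between the objectives.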