It is known that in real-valued Single-Objective (SO) optimization with Gaussian Estimation-of-Distribution Algorithms (EDAs), it is important to take into account how distribution parameters change over subsequent generations to prevent inefficient convergence as a result of overfitting, especially if dependencies are modelled. We illustrate that in Multi-Objective (MO) optimization the risk of overfitting is even greater, and that it is further increased if clustered variation is used, a technique often employed in Multi-Objective EDAs (MOEDAs) in the form of mixture modelling via clustering selected solutions in objective space. We point out that a technique previously used in EDAs to remove the risk of overfitting in SO optimization, the anticipated mean shift (AMS), can also be used in MO optimization if clusters in subsequent generations are registered. We propose to compute this registration explicitly. Although this is computationally more intensive than existing approaches, it increases the effectiveness of AMS. We further propose a new clustering technique to improve mixture modelling in EDAs by 1) allowing clusters to overlap substantially and 2) assigning each cluster the same number of solutions. This allows any existing EDA to be transformed into a mixture-based version in a straightforward manner. Finally, we point out the benefit of injecting solutions obtained from running equal-capacity SO optimizers synchronously in parallel, and we investigate experimentally, using 9 well-known benchmark problems, the advantages of each of these techniques.
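To make the idea of per-cluster AMS with cluster registration concrete, the following is a minimal Python sketch. It is not the paper's exact procedure: the registration rule (matching each current cluster to the previous-generation cluster with the nearest mean), the helper names register_clusters and apply_ams, and the constants delta_ams and ams_fraction are illustrative assumptions.

import numpy as np

def register_clusters(prev_means, curr_means):
    # Match each current cluster to the previous-generation cluster
    # whose mean is closest to it (illustrative registration rule).
    mapping = {}
    for i, m in enumerate(curr_means):
        dists = np.linalg.norm(prev_means - m, axis=1)
        mapping[i] = int(np.argmin(dists))
    return mapping

def apply_ams(offspring, curr_mean, prev_mean, delta_ams=2.0, ams_fraction=0.5):
    # Shift a fraction of the sampled offspring along the anticipated
    # mean shift of their registered cluster.
    shift = delta_ams * (curr_mean - prev_mean)
    offspring = offspring.copy()
    n_shift = int(ams_fraction * len(offspring))
    offspring[:n_shift] += shift
    return offspring

# Example: two clusters whose order changed between generations are
# registered first, so each applies the mean shift of its own predecessor.
rng = np.random.default_rng(0)
prev_means = np.array([[0.0, 0.0], [5.0, 5.0]])
curr_means = np.array([[5.2, 4.9], [0.3, 0.1]])
mapping = register_clusters(prev_means, curr_means)
for i, mu in enumerate(curr_means):
    samples = rng.multivariate_normal(mu, 0.1 * np.eye(2), size=10)
    samples = apply_ams(samples, mu, prev_means[mapping[i]])

Without the registration step, a cluster could be shifted along the mean difference of an unrelated cluster from the previous generation, which is precisely what explicit registration is meant to avoid.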
Publisher: ACM Press
Editor: J. Branke
Conference: Genetic and Evolutionary Computation Conference
Research theme: Intelligent and autonomous systems

Bosman, P. (2010). The anticipated mean shift and cluster registration in mixture-based EDAs for multi-objective optimization. In J. Branke (Ed.), Proceedings of the ACM Annual Genetic and Evolutionary Computation Conference 2010 (pp. 351–358). ACM Press.