Estimation-of-Distribution Algorithms (EDAs) are often praised for their ability to optimize a broad class of problems. EDA applications are, however, still limited. A frequently heard criticism is that a large population size is required and that distribution estimation takes long. Here we look at possibilities for improvement in these areas. We first discuss the use of a memory to aggregate information over multiple generations and thereby reduce the population size. Our approach, empirical risk minimization to perform non-linear regression of memory parameters, may well generalize to other EDAs. We design such a memory for a Gaussian EDA and observe smaller population-size requirements and fewer evaluations used. We also speed up the selection of Bayesian factorizations for Gaussian EDAs by sorting the entries in the covariance matrix. Finally, we discuss parameter-free Gaussian EDAs for real-valued single-objective optimization. We propose not only to increase the population size in subsequent runs, but also to divide it over parallel runs across the search space. On some multimodal problems this yields improvements.
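To make the abstract's two main ingredients concrete, the sketch below shows a minimal maximum-likelihood Gaussian EDA together with a restart scheme that, in the spirit described above, grows the total population budget in subsequent rounds and divides it over parallel runs. All function names, the benchmark, and the schedule parameters (doubling budget, one extra parallel run per round) are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def sphere(x):
    # Simple unimodal benchmark: f(x) = sum(x_i^2), minimum at the origin.
    return float(np.sum(x * x))

def gaussian_eda(f, dim, pop_size, rng, generations=50, tau=0.3):
    """Minimal Gaussian EDA: sample a population, keep the best tau
    fraction, refit a maximum-likelihood Gaussian, and repeat."""
    mean = rng.uniform(-5.0, 5.0, dim)
    cov = 25.0 * np.eye(dim)  # generous initial variance covering the init range
    best_x, best_f = None, np.inf
    for _ in range(generations):
        pop = rng.multivariate_normal(mean, cov, size=pop_size)
        fitness = np.apply_along_axis(f, 1, pop)
        order = np.argsort(fitness)
        if fitness[order[0]] < best_f:
            best_f = float(fitness[order[0]])
            best_x = pop[order[0]]
        selected = pop[order[: max(2, int(tau * pop_size))]]
        mean = selected.mean(axis=0)
        # Small ridge keeps the refitted covariance positive definite.
        cov = np.cov(selected, rowvar=False) + 1e-10 * np.eye(dim)
    return best_x, best_f

def restart_scheme(f, dim, rounds=4, base_pop=20, seed=0):
    """Hypothetical parameter-free-style schedule: each round doubles the
    total population budget and splits it over one more parallel run,
    each started from an independent random initialization."""
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    total_pop, n_parallel = base_pop, 1
    for _ in range(rounds):
        per_run = max(4, total_pop // n_parallel)
        for _ in range(n_parallel):
            x, fx = gaussian_eda(f, dim, per_run, rng)
            if fx < best_f:
                best_x, best_f = x, fx
        total_pop *= 2
        n_parallel += 1
    return best_x, best_f
```

The parallel runs matter on multimodal landscapes: independent initializations give each round several chances to start in a different basin of attraction, rather than spending the whole enlarged budget on one run.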
ACM Press
G. Raidl
Genetic and Evolutionary Computation Conference
Intelligent and autonomous systems

Bosman, P. (2009). On Empirical Memory Design, Faster Selection of Bayesian Factorizations and Parameter-Free Gaussian EDAs. In G. Raidl (Ed.), Proceedings of ACM Annual Genetic and Evolutionary Computation Conference 2009 (pp. 389–396). ACM Press.