The Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA) is a recently introduced model-based EA that has been shown to be capable of outperforming state-of-the-art alternative EAs in terms of scalability when solving discrete optimization problems. One of the key aspects of GOMEA's success is a variation operator that is designed to extensively exploit linkage models by effectively combining partial solutions. Here, we bring the strengths of GOMEA to Genetic Programming (GP), introducing GP-GOMEA. Assuming that little problem-specific knowledge is available, and in an effort to design easy-to-use EAs, GP-GOMEA requires no parameter specification. On a set of well-known benchmark problems, we find that GP-GOMEA outperforms standard GP while being on par with more recently introduced, state-of-the-art EAs. We furthermore introduce Input-space Entropy-based Building-block Learning (IEBL), a novel approach to identifying and encapsulating relevant building blocks (subroutines) into new terminals and functions. On problems with an inherent degree of modularity, IEBL can contribute to compact solution representations, providing a large potential for knock-on effects in performance. On the difficult, but highly modular Even Parity problem, GP-GOMEA+IEBL obtains excellent scalability, solving the 14-bit instance in less than 1 hour.
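As context for the variation operator mentioned above, the following is a minimal sketch of gene-pool optimal mixing (GOM) for fixed-length discrete genotypes; GP-GOMEA applies the same principle to nodes of GP trees. The function name, the list-of-index-lists linkage representation, and the maximization convention are illustrative assumptions, not the paper's exact implementation.

```python
import random

def gom_variation(solution, fitness, population, linkage_sets):
    """One pass of gene-pool optimal mixing over a discrete solution.

    For each linkage set (a group of variable indices modeled as
    interdependent), the corresponding values are copied from a randomly
    chosen donor in the population; the change is kept only if the
    fitness (to be maximized here) does not worsen.
    """
    offspring = list(solution)
    best_fitness = fitness(offspring)
    for linkage_set in linkage_sets:
        donor = random.choice(population)
        backup = [offspring[i] for i in linkage_set]
        for i in linkage_set:
            offspring[i] = donor[i]
        new_fitness = fitness(offspring)
        if new_fitness >= best_fitness:
            best_fitness = new_fitness  # accept the partial solution
        else:
            for i, v in zip(linkage_set, backup):
                offspring[i] = v  # revert: donation made things worse
    return offspring, best_fitness
```

Because every donation is accepted only if fitness does not decrease, the offspring is never worse than the parent, which is what allows GOM to mix partial solutions aggressively. For example, on OneMax (`fitness = sum`) with univariate linkage sets `[[0], [1], ...]`, a single pass already pulls in every 1-bit found in the donors it samples.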

Genetic and Evolutionary Computation Conference
Centrum Wiskunde & Informatica (CWI), Amsterdam, The Netherlands

Virgolin, M., Witteveen, C., Alderliesten, T., & Bosman, P. (2017). Scalable genetic programming by gene-pool optimal mixing and input-space entropy-based building-block learning. In GECCO 2017 - Proceedings of the 2017 Genetic and Evolutionary Computation Conference (pp. 1041–1048). doi:10.1145/3071178.3071287