Bayesian networks (BNs) are probabilistic graphical models that are widely used for knowledge representation and decision-making tasks, especially in the presence of uncertainty. Finding or learning the structure of BNs from data is an NP-hard problem. Evolutionary algorithms (EAs) have been extensively used to automate the learning process. In this paper, we consider the use of the Gene-pool Optimal Mixing Evolutionary Algorithm (GOMEA). GOMEA is a relatively new type of EA that belongs to the class of model-based EAs. The model used in GOMEA captures the dependency structure between problem variables, so as to improve the efficiency and effectiveness of variation. This paper shows that the excellent performance of GOMEA transfers from well-known academic benchmark problems to the specific case of learning BNs from data, owing to its model-building capabilities and the possibility of computing partial evaluations in this setting. On commonly used datasets of varying size, we find that GOMEA outperforms standard algorithms such as Order-Based Search (OBS), as well as other EAs, such as Genetic Algorithms (GAs) and Estimation-of-Distribution Algorithms (EDAs), even when these are augmented with efficient local search techniques.
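To illustrate the variation step the abstract refers to, the following is a minimal sketch of gene-pool optimal mixing as described in the GOMEA literature: for each linkage subset in a family-of-subsets (FOS) model, genes are copied from a random donor and the change is kept only if fitness does not worsen. The OneMax fitness, the univariate FOS, and all function names here are illustrative stand-ins, not the paper's actual BN-scoring setup (where the same accept/revert scheme would permit cheap partial evaluations).

```python
import random

def onemax(x):
    # Stand-in fitness (maximization); the paper instead scores BN structures.
    return sum(x)

def gom_step(parent, population, fos, fitness):
    """One gene-pool optimal mixing pass over a single solution.

    For each linkage subset in the FOS, copy the corresponding genes
    from a randomly chosen donor; keep the change only if fitness does
    not decrease, otherwise revert. In a real implementation, only the
    affected part of the fitness would be re-evaluated (a partial
    evaluation), which is what makes GOMEA efficient here.
    """
    offspring = list(parent)
    f = fitness(offspring)
    for subset in fos:
        donor = random.choice(population)
        backup = [offspring[i] for i in subset]
        for i in subset:
            offspring[i] = donor[i]
        new_f = fitness(offspring)
        if new_f >= f:
            f = new_f  # accept (equal fitness is also accepted)
        else:
            for i, v in zip(subset, backup):
                offspring[i] = v  # revert the copied genes
    return offspring, f

# Usage: univariate FOS (each variable its own subset) on a 10-bit problem.
random.seed(0)
n = 10
population = [[random.randint(0, 1) for _ in range(n)] for _ in range(8)]
fos = [[i] for i in range(n)]
child, score = gom_step(population[0], population, fos, onemax)
```

Because every change is accepted only if fitness does not decrease, the resulting solution is never worse than the parent; richer FOS models (e.g. learned linkage trees) group dependent variables into the same subset so they are transferred together.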

Additional Metadata
Keywords Bayesian networks, Evolutionary algorithms, Structure learning
Persistent URL dx.doi.org/10.1145/3205455.3205502
Conference Genetic and Evolutionary Computation Conference
Citation
Orphanou, K., Thierens, D., & Bosman, P. A. N. (2018). Learning Bayesian network structures with GOMEA. In GECCO 2018 - Proceedings of the 2018 Genetic and Evolutionary Computation Conference (pp. 1007–1014). doi:10.1145/3205455.3205502