Mixed-integer optimization considers problems with both discrete and continuous variables. The ability to learn and process problem structure can be of paramount importance for optimization, particularly when faced with black-box optimization (BBO) problems, where no structural knowledge is available a priori. For such cases, model-based Evolutionary Algorithms (EAs) have been very successful in the fields of discrete and continuous optimization. In this paper, we present a model-based EA which integrates techniques from the discrete and continuous domains in order to tackle mixed-integer problems. We furthermore introduce novel mechanisms to learn and exploit mixed-variable dependencies. Previous approaches learned dependencies explicitly only within either the discrete or the continuous domain. The potential usefulness of addressing mixed dependencies directly is assessed by empirically analyzing algorithm performance on a selection of mixed-integer problems with different types of variable interactions. We find substantially improved, scalable performance on problems that exhibit mixed dependencies.
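To illustrate the general idea of learning dependencies across discrete and continuous variables from a population, the sketch below estimates pairwise dependency strengths with a histogram-based mutual-information measure. This is only a minimal illustration of the concept, not the mechanism proposed in the paper; the function names (`mutual_information`, `mixed_dependency_matrix`) and the binning choice are assumptions made for this example.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram-based mutual information estimate between two variables.
    Continuous variables are discretized into equal-width bins."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def mixed_dependency_matrix(pop_discrete, pop_continuous, bins=8):
    """Pairwise dependency strengths over all (discrete + continuous) variables,
    estimated from the current population of an EA. Illustrative only."""
    pop = np.hstack([pop_discrete.astype(float), pop_continuous])
    n_vars = pop.shape[1]
    dep = np.zeros((n_vars, n_vars))
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            dep[i, j] = dep[j, i] = mutual_information(pop[:, i], pop[:, j], bins)
    return dep

# Example: a small population with 2 discrete and 2 continuous variables,
# where continuous variable 0 is made to depend on discrete variable 0.
rng = np.random.default_rng(0)
d = rng.integers(0, 2, size=(200, 2))      # discrete part
c = rng.normal(size=(200, 2))              # continuous part
c[:, 0] += 3.0 * d[:, 0]                   # inject a mixed dependency
print(np.round(mixed_dependency_matrix(d, c), 2))
```

A model-based EA could use such a dependency matrix to group strongly interacting variables and vary them jointly, rather than treating the discrete and continuous parts of a solution independently.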

https://doi.org/10.1109/CEC.2016.7744347
IEEE Congress on Evolutionary Computation
Centrum Wiskunde & Informatica, Amsterdam (CWI), The Netherlands

Sadowski, K., Bosman, P., & Thierens, D. (2016). Learning and exploiting mixed variable dependencies with a model-based EA. In 2016 IEEE Congress on Evolutionary Computation, CEC 2016 (pp. 4382–4389). doi:10.1109/CEC.2016.7744347