Linear scaling with and within semantic backpropagation-based genetic programming for symbolic regression
Semantic Backpropagation (SB) is a recent technique that promotes effective variation in tree-based genetic programming. The basic idea of SB is to provide information on what output is desirable for a specified tree node, by propagating the desired root-node output back to the specified node using inversions of the functions encountered along the way. Variation operators then replace the subtree located at the specified node with a tree whose output is closest to the desired output, found by searching in a pre-computed library. In this paper, we propose two contributions to enhance SB specifically for symbolic regression, by incorporating the principles of Keijzer's Linear Scaling (LS). In particular, we show how SB can be used in synergy with the scaled mean squared error, and how LS can be adopted within library search. We test our adaptations using the well-known variation operator Random Desired Operator (RDO), comparing it to its baseline implementation and to traditional crossover and mutation. Our experimental results on real-world datasets show that SB enhanced with LS substantially improves the performance of RDO, resulting in the best overall performance among all tested GP algorithms.
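Keijzer's Linear Scaling has a closed-form solution: for model outputs y and targets t, the intercept a and slope b that minimize the mean squared error of a + b·y are b = cov(t, y)/var(y) and a = mean(t) − b·mean(y). A minimal sketch of computing the scaled mean squared error mentioned above (the function names are illustrative, not from the paper):

```python
import numpy as np

def linear_scaling(y, t):
    """Compute the optimal intercept a and slope b (Keijzer's Linear
    Scaling) that minimize the MSE between a + b*y and the targets t."""
    y = np.asarray(y, dtype=float)
    t = np.asarray(t, dtype=float)
    y_res = y - y.mean()
    denom = np.sum(y_res ** 2)
    if denom == 0.0:
        # Constant model output: the slope is undefined; fall back to
        # predicting the mean of the targets.
        return t.mean(), 0.0
    b = np.sum((t - t.mean()) * y_res) / denom  # cov(t, y) / var(y)
    a = t.mean() - b * y.mean()
    return a, b

def scaled_mse(y, t):
    """MSE after optimally scaling the model outputs to the targets."""
    a, b = linear_scaling(y, t)
    return float(np.mean((a + b * np.asarray(y, dtype=float) - t) ** 2))
```

Because a and b are recomputed per individual, the scaled MSE judges a tree by the shape of its output rather than its absolute offset and magnitude, which is what makes it attractive to combine with SB's desired outputs.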
Keywords: Genetic programming, Linear scaling, Semantic backpropagation
Conference: Genetic and Evolutionary Computation Conference
Virgolin, M., Alderliesten, T., & Bosman, P.A.N. (2019). Linear scaling with and within semantic backpropagation-based genetic programming for symbolic regression. In Proceedings of the 2019 Genetic and Evolutionary Computation Conference (pp. 1084–1092). doi:10.1145/3321707.3321758