2022-07-08
Evolutionary neural cascade search across supernetworks
Publication
To achieve excellent performance with modern neural networks, having the right network architecture is important. Neural Architecture Search (NAS) concerns the automatic discovery of task-specific network architectures. Modern NAS approaches leverage supernetworks whose subnetworks encode candidate neural network architectures. These subnetworks can be trained simultaneously, removing the need to train each network from scratch and thereby increasing the efficiency of NAS. A recent method called Neural Architecture Transfer (NAT) further improves the efficiency of NAS for computer vision tasks by using a multi-objective evolutionary algorithm to find high-quality subnetworks of a supernetwork pretrained on ImageNet. Building upon NAT, we introduce ENCAS: Evolutionary Neural Cascade Search. ENCAS can search over multiple pretrained supernetworks to produce a trade-off front of cascades of different neural network architectures, maximizing accuracy while minimizing FLOPs count. We test ENCAS on common computer vision benchmarks (CIFAR-10, CIFAR-100, ImageNet) and achieve Pareto dominance over previous state-of-the-art NAS models up to 1.5 GFLOPs. Additionally, applying ENCAS to a pool of 518 publicly available ImageNet classifiers leads to Pareto dominance in all computation regimes and increases the maximum accuracy from 88.6% to 89.0%, accompanied by an 18% decrease in computational effort, from 362 to 296 GFLOPs.
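The cascades referred to in the abstract follow the common confidence-threshold pattern: a cheap network answers easy inputs, and only inputs it is unsure about are passed on to larger networks, which is how a cascade can save FLOPs on average. The sketch below illustrates this inference scheme only; it is not the authors' implementation, and the `models`, `thresholds`, and max-softmax deferral criterion are illustrative assumptions.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cascade_predict(x, models, thresholds):
    """Run networks from cheapest to most expensive; stop as soon as one is
    confident enough. `models` are callables mapping an input to class logits;
    `thresholds` holds one confidence cutoff per model except the last model,
    which always answers."""
    for model, cutoff in zip(models[:-1], thresholds):
        probs = softmax(model(x))
        if probs.max() >= cutoff:
            # Confident early exit: the FLOPs of the larger models are saved.
            return int(probs.argmax())
    return int(softmax(models[-1](x)).argmax())

# Toy usage: two stand-in "models" as random linear classifiers.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    small = lambda x: x @ rng.normal(size=(8, 10))
    large = lambda x: x @ rng.normal(size=(8, 10))
    x = rng.normal(size=8)
    print(cascade_predict(x, [small, large], thresholds=[0.6]))
```

In ENCAS, the composition of such cascades is itself the object of the evolutionary search, with accuracy and FLOPs count as the two competing objectives, yielding the trade-off front described above.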
Additional Metadata | |
---|---|
DOI | doi.org/10.1145/3512290.3528749 |
Conference | 2022 Genetic and Evolutionary Computation Conference, GECCO 2022 |
Organisation | Centrum Wiskunde & Informatica, Amsterdam (CWI), The Netherlands |
Citation | Chebykin, A., Alderliesten, T., & Bosman, P. (2022). Evolutionary neural cascade search across supernetworks. In GECCO 2022 - Proceedings of the 2022 Genetic and Evolutionary Computation Conference (pp. 1038–1047). doi:10.1145/3512290.3528749 |