Neural Architecture Search (NAS), i.e., the automation of neural network design, has gained much popularity in recent years, with increasingly complex search algorithms being proposed. Yet, solid comparisons with simple baselines are often missing. At the same time, recent retrospective studies have found many new algorithms to be no better than random search (RS). In this work we consider the use of a simple Local Search (LS) algorithm for NAS. In particular, we adopt a multi-objective NAS formulation, with network accuracy and network complexity as the two objectives, as understanding the trade-off between these objectives is arguably among the most interesting aspects of NAS. The proposed LS algorithm is compared with RS and two evolutionary algorithms (EAs), as EAs are often heralded as ideal for multi-objective optimization. To promote reproducibility, we create and release two benchmark datasets, named MacroNAS-C10 and MacroNAS-C100, containing 200K saved network evaluations for two established image classification tasks, CIFAR-10 and CIFAR-100. Our benchmarks are designed to complement existing benchmarks, especially in that they are better suited for multi-objective search. We additionally consider a version of the problem with a much larger architecture space. While we show that the considered algorithms explore the search space in fundamentally different ways, we also find that LS substantially outperforms RS and even performs nearly as well as state-of-the-art EAs. We believe this provides strong evidence that LS is truly a competitive baseline for NAS, against which new NAS algorithms should be benchmarked.
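For intuition (this sketch is not taken from the paper itself), the code below shows what such a simple LS baseline can look like: a first-improvement local search over a fixed-length categorical encoding of an architecture. The encoding length, the number of options per variable, and the evaluate() function are hypothetical placeholders; in the paper's setting, evaluate() would query saved benchmark evaluations (e.g., the MacroNAS tables), and a multi-objective variant would compare candidates via Pareto dominance rather than a scalar score.

import math
import random

random.seed(0)

NUM_VARS = 14      # number of architecture choices (hypothetical)
NUM_OPTIONS = 3    # options per choice, e.g., identity / conv / bottleneck

def evaluate(arch):
    # Deterministic toy objective standing in for a benchmark lookup.
    # A real multi-objective LS would return (accuracy, complexity)
    # and compare candidates by Pareto dominance instead.
    return sum(math.sin(7 * i + 3 * v) for i, v in enumerate(arch))

def local_search(arch):
    best = evaluate(arch)
    improved = True
    while improved:
        improved = False
        # Visit variables in a random order, as is common for LS.
        for i in random.sample(range(NUM_VARS), NUM_VARS):
            for option in range(NUM_OPTIONS):
                if option == arch[i]:
                    continue
                candidate = arch.copy()
                candidate[i] = option
                score = evaluate(candidate)
                if score > best:  # first improvement: accept and continue
                    arch, best = candidate, score
                    improved = True
    return arch, best

start = [random.randrange(NUM_OPTIONS) for _ in range(NUM_VARS)]
arch, score = local_search(start)
print(arch, round(score, 3))

Restarting this loop from new random initializations until the evaluation budget is exhausted yields a multi-start LS, which is the kind of simple baseline the paper argues new NAS algorithms should be compared against.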

doi.org/10.1007/978-3-030-72062-9_37
Lecture Notes in Computer Science (Theoretical Computer Science and General Issues)
International Conference on Evolutionary Multi-Criterion Optimization
Centrum Wiskunde & Informatica, Amsterdam (CWI), The Netherlands

Den Ottelander, T., Dushatskiy, A., Virgolin, M., & Bosman, P. (2021). Local Search is a Remarkably Strong Baseline for Neural Architecture Search. In Evolutionary Multi-Criterion Optimization (pp. 465–479). doi:10.1007/978-3-030-72062-9_37