Towards Modular Neural Architecture Search

Author(s)
Lars Boecking, Patrick Philipp and Cedric Kulbach
Venue
ICLR Workshop on Neural Architecture Search
Year
2020
Abstract
In this work we present Modular Neural Architecture Search (ModNAS), which aims at reducing the complexity of the underlying search space by fostering the reuse of successful neural cells. We define a new modularized search space that enables efficient search by strongly restricting candidates to predefined building blocks, as well as transferability to novel, unseen tasks. We present a preliminary evaluation of ModNAS on CIFAR-10, CIFAR-100 and Fashion-MNIST based on modules from the NAS-Bench-101 benchmark, where we alternate between random and pre-ranked retrieval based on documented CIFAR-10 accuracies. The results are promising in that we retrieve competitive architectures in 6 GPU hours, which highlights the potential of sophisticated ranking approaches for modules in our framework.
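The two retrieval strategies compared in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the module identifiers and accuracy values are hypothetical stand-ins for NAS-Bench-101 cells and their documented CIFAR-10 accuracies.

```python
import random

# Hypothetical module records: (module_id, documented CIFAR-10 accuracy).
# In ModNAS these would come from the NAS-Bench-101 benchmark.
modules = [
    ("cell_a", 0.901),
    ("cell_b", 0.874),
    ("cell_c", 0.932),
    ("cell_d", 0.918),
]

def retrieve_random(modules, k, seed=0):
    """Random retrieval: sample k candidate cells uniformly."""
    rng = random.Random(seed)
    return rng.sample(modules, k)

def retrieve_pre_ranked(modules, k):
    """Pre-ranked retrieval: pick the k cells with the highest
    documented accuracy on the reference task (here CIFAR-10)."""
    return sorted(modules, key=lambda m: m[1], reverse=True)[:k]

top = retrieve_pre_ranked(modules, 2)
print([m[0] for m in top])  # prints ['cell_c', 'cell_d']
```

Alternating between the two strategies trades exploration (random retrieval) against exploitation of documented performance (pre-ranked retrieval).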
Online Sources
https://drive.google.com/file/d/1SLBnJe_X9erS3E0PytxK1ztPAEV37611/view
Research focus
Big Data and Service Science
Download .bib
Published by
Lars Böcking