Publication Details

Neural Architecture Search and Hardware Accelerator Co-Search: A Survey

SEKANINA Lukáš. Neural Architecture Search and Hardware Accelerator Co-Search: A Survey. IEEE Access, vol. 9, no. 9, 2021, pp. 151337-151362. ISSN 2169-3536. Available from: https://ieeexplore.ieee.org/document/9606893
Czech title
Souběžné hledání architektur neuronových sítí a jejich hardwarových akcelerátorů: Přehled metod (in English: Concurrent Search for Neural Network Architectures and Their Hardware Accelerators: An Overview of Methods)
Type
journal article
Language
english
Authors
Sekanina Lukáš
URL
https://ieeexplore.ieee.org/document/9606893
Keywords

Artificial neural networks, Accelerator architectures, Design optimization, Optimization methods, Machine learning, Image classification, Computer aided engineering, Approximation methods, Evolutionary computation, Digital circuits

Abstract

Deep neural networks (DNNs) now dominate the most challenging applications of machine learning. As DNNs can have complex architectures with millions of trainable parameters (the so-called weights), their design and training are difficult even for highly qualified experts. In order to reduce human effort, neural architecture search (NAS) methods have been developed to automate the entire design process. NAS methods typically combine searching the space of candidate architectures with optimizing (learning) the weights using a gradient method. In this paper, we survey the key elements of NAS methods that -- to various extents -- consider hardware implementation of the resulting DNNs. We classify these methods into three major classes: single-objective NAS (no hardware is considered), hardware-aware NAS (the DNN is optimized for a particular hardware platform), and NAS with hardware co-optimization (hardware is directly co-optimized with the DNN as part of NAS). Compared to previous surveys, we emphasize the multi-objective design approach that must be adopted in NAS and focus on co-design algorithms developed for concurrent optimization of DNN architectures and hardware platforms. As most research in this area deals with NAS for image classification using convolutional neural networks, we follow this trajectory in our paper. After reading the paper, the reader should understand why and how NAS and hardware co-optimization are currently used to build cutting-edge implementations of DNNs.
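The following Python snippet is a minimal, hypothetical sketch of the hardware-aware NAS loop the abstract describes: candidate architectures are sampled from a search space, each is scored on task accuracy and an estimated hardware cost (here, latency), and a multi-objective criterion picks the best trade-off. All names (SEARCH_SPACE, estimate_latency_ms, the penalty weights) and both scoring functions are illustrative assumptions for exposition, not the paper's algorithm.

# Hypothetical illustration of hardware-aware NAS via random search.
# In a real NAS loop, estimate_accuracy would train/validate the
# candidate DNN (learning weights with a gradient method), and the
# latency model would target a specific accelerator.
import random

SEARCH_SPACE = {
    "depth": [4, 8, 12],    # number of convolutional blocks
    "width": [16, 32, 64],  # channels per block
    "kernel": [3, 5],       # convolution kernel size
}

def sample_architecture():
    """Draw one candidate architecture from the search space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def estimate_accuracy(arch):
    """Placeholder for training and validating the candidate DNN;
    returns a stand-in accuracy value for this sketch."""
    return 0.5 + 0.4 * random.random()

def estimate_latency_ms(arch):
    """Placeholder hardware cost model (e.g., a latency lookup table
    or analytical model of the target platform)."""
    return arch["depth"] * arch["width"] * arch["kernel"] * 0.01

def scalarized_score(arch, latency_budget_ms=20.0, penalty=0.05):
    """One common multi-objective treatment: penalize candidates that
    exceed a hardware budget instead of optimizing accuracy alone."""
    overrun = max(0.0, estimate_latency_ms(arch) - latency_budget_ms)
    return estimate_accuracy(arch) - penalty * overrun

best = max((sample_architecture() for _ in range(100)), key=scalarized_score)
print("best candidate:", best)

Scalarization is only one option; the survey's emphasis on multi-objective design also covers approaches that keep accuracy and hardware cost as separate objectives and search for a Pareto front of trade-offs.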

Published
2021
Pages
151337-151362
Journal
IEEE Access, vol. 9, no. 9, ISSN 2169-3536
Publisher
Institute of Electrical and Electronics Engineers
DOI
10.1109/ACCESS.2021.3126685
UT WoS
000719556200001
EID Scopus
BibTeX
@ARTICLE{FITPUB12620,
   author = "Luk\'{a}\v{s} Sekanina",
   title = "Neural Architecture Search and Hardware Accelerator Co-Search: A Survey",
   pages = "151337--151362",
   journal = "IEEE Access",
   volume = 9,
   number = 9,
   year = 2021,
   ISSN = "2169-3536",
   doi = "10.1109/ACCESS.2021.3126685",
   language = "english",
   url = "https://www.fit.vut.cz/research/publication/12620"
}