Efficient and Generalizable Neural Architecture Search for Visual Recognition


Date

2021


Repository Usage Stats

148 views, 323 downloads

Abstract

Neural Architecture Search (NAS) can achieve accuracy superior to that of human-designed neural networks because of its automated design process and search techniques. While automatically designed architectures can reach new state-of-the-art performance with less manual crafting effort, three obstacles hinder building the next generation of NAS algorithms: (1) the search space is constrained, which limits representation ability; (2) searching a large search space is time-consuming, which slows down the model crafting process; and (3) inference with complicated neural architectures is slow, which limits deployability on different devices. To improve the search space, previous NAS works rely on existing block motifs; specifically, prior search spaces seek the best combination of MobileNetV2 blocks without exploring sophisticated cell connections. To accelerate the search process, a more accurate description of neural architectures is necessary. To deploy neural architectures on hardware, better adaptability is required. This dissertation proposes ScaleNAS to expand the search space so that it adapts to multiple vision-based tasks. The dissertation then shows that NASGEM improves neural architecture representation ability to accelerate searching. Finally, we show how to integrate neural architecture search with structural pruning and mixed-precision quantization to further improve hardware deployment.


Citation


Cheng, Hsin-Pai (2021). Efficient and Generalizable Neural Architecture Search for Visual Recognition. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/23808.


Except where otherwise noted, student scholarship that was shared on DukeSpace after 2009 is made available to the public under a Creative Commons Attribution / Non-commercial / No derivatives (CC-BY-NC-ND) license. All rights in student work shared on DukeSpace before 2009 remain with the author and/or their designee, whose permission may be required for reuse.