Improving the Efficiency and Performance of Neural Architecture Search for the Modern Machine Learning Landscape
Date
2025
Authors
Sridhar, Arjun
Abstract
Neural Architecture Search (NAS) has emerged as a cornerstone in the design and optimization of neural networks, enabling the development of task-specific architectures with state-of-the-art performance. However, the increasing complexity of search spaces and computational cost constraints pose significant challenges to the efficiency and effectiveness of NAS. This thesis addresses these challenges through three key contributions that advance the field of NAS in diverse application domains.
First, we propose LISSNAS, an automated algorithm that leverages structural and performance locality to shrink large, complex search spaces into diverse, compact subspaces, facilitating efficient exploration and exploitation. LISSNAS demonstrates state-of-the-art performance, achieving a Top-1 accuracy of 77.6% on ImageNet under mobile constraints and excelling in architectural diversity and search efficiency across various datasets and search spaces.
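To make the locality idea concrete, here is a minimal sketch of locality-based search-space shrinking under illustrative assumptions of our own: architectures are encoded as tuples of operations, structural locality is Hamming distance, and a cheap proxy score stands in for trained accuracy. The names (arch_distance, shrink_space) and the probe-and-keep heuristic are hypothetical, not the actual LISSNAS procedure.

```python
# A hedged sketch of locality-based search-space shrinking, in the
# spirit of (but not identical to) LISSNAS as summarized above.
import random
from typing import Callable

def arch_distance(a: tuple, b: tuple) -> int:
    """Structural locality: Hamming distance between op encodings."""
    return sum(x != y for x, y in zip(a, b))

def shrink_space(space: list[tuple],
                 proxy_score: Callable[[tuple], float],
                 radius: int = 2,
                 n_probes: int = 32,
                 keep: int = 4) -> list[list[tuple]]:
    """Probe a few architectures, then keep compact neighborhoods
    around the best probes (performance locality), yielding several
    diverse, compact subspaces instead of one huge space."""
    probes = random.sample(space, min(n_probes, len(space)))
    probes.sort(key=proxy_score, reverse=True)
    return [[a for a in space if arch_distance(a, center) <= radius]
            for center in probes[:keep]]
```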
Second, we introduce SOAP-NAS, a novel NAS framework tailored for routability optimization in electronic design automation (EDA). SOAP-NAS overcomes the challenges of noisy training objectives and high variance in routability prediction tasks through innovative data augmentation and a hybrid one-shot and predictor-based NAS approach. This method achieves a 40% improvement in ROC-AUC performance for design rule checking (DRC) hotspot detection, with an ROC-AUC of 0.9802 and a query time of only 0.461 ms.
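As a rough illustration of why predictor-based querying is fast, the sketch below ranks encoded architectures with a single forward pass of a small surrogate network, which is what makes millisecond-scale query times plausible. The MLP shape, the encoding, and the function names are assumptions for illustration, not SOAP-NAS's actual predictor.

```python
# Generic surrogate-predictor sketch for the predictor-based half of a
# hybrid one-shot + predictor NAS flow; trained on (encoding, score)
# pairs measured via a one-shot supernet, then queried cheaply.
import torch
import torch.nn as nn

class ArchPredictor(nn.Module):
    def __init__(self, enc_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(enc_dim, 64), nn.ReLU(),
            nn.Linear(64, 1))  # predicted fitness (e.g., ROC-AUC)

    def forward(self, enc: torch.Tensor) -> torch.Tensor:
        return self.net(enc).squeeze(-1)

def rank_candidates(pred: ArchPredictor, encs: torch.Tensor) -> torch.Tensor:
    """Rank candidate architectures by predicted fitness; one forward
    pass per batch of candidates, so each query is milliseconds."""
    with torch.no_grad():
        return torch.argsort(pred(encs), descending=True)
```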
Lastly, we develop a low-cost fine-grained NAS algorithm that projects the search problem into a lower-dimensional space by predicting accuracy differences between similar networks. This paradigm shift reduces the computational complexity from exponential to linear with respect to the search space size, enabling efficient exploration of fine-grained search spaces. Extensive experimentation on NAS benchmarks demonstrates significant improvements in performance and sample efficiency compared to existing approaches.
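The following toy sketch isolates the accuracy-difference idea: a delta predictor scores single-op edits, and a neighbor's accuracy is estimated by chaining predicted deltas from one measured anchor. The intuition behind the complexity reduction is that the predictor only has to model local edits, whose count grows linearly with the number of searchable positions, rather than whole architectures, whose count grows exponentially. All names here are hypothetical placeholders, not the dissertation's actual algorithm.

```python
# Toy sketch of delta-based accuracy estimation. `delta_model` stands
# in for a hypothetical regressor trained on measured accuracy
# differences for single-op edits.
def estimate_accuracy(anchor_acc: float,
                      edits: list[tuple[int, str, str]],
                      delta_model) -> float:
    """Estimate a nearby architecture's accuracy by summing the
    predicted accuracy change for each (position, old_op, new_op)
    edit away from a measured anchor architecture."""
    return anchor_acc + sum(delta_model(pos, old, new)
                            for pos, old, new in edits)

# Example with a stand-in delta model and two edits from the anchor.
toy_delta = lambda pos, old, new: 0.3 if new == "conv3x3" else -0.2
print(estimate_accuracy(74.1, [(0, "skip", "conv3x3"),
                               (2, "conv3x3", "maxpool")], toy_delta))
# -> 74.1 + 0.3 - 0.2 = 74.2
```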
Collectively, this thesis provides a comprehensive framework for advancing NAS methodologies, enabling efficient search in large, complex spaces, improving task-specific model performance, and expanding the applicability of NAS to critical domains such as EDA.
Citation
Sridhar, Arjun (2025). Improving the Efficiency and Performance of Neural Architecture Search for the Modern Machine Learning Landscape. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/32667.