TY - JOUR
T1 - Exploiting Operation Importance for Differentiable Neural Architecture Search
AU - Zhou, Yuan
AU - Xie, Xukai
AU - Kung, Sun Yuan
N1 - Funding Information:
This work was supported in part by the National Natural Science Foundation of China under Grant U2006211 and in part by the National Key Research and Development Project of China under Grant 2020YFC1523204.
Publisher Copyright:
© 2012 IEEE.
PY - 2022/11/1
Y1 - 2022/11/1
N2 - Recently, differentiable neural architecture search (NAS) methods have made significant progress in reducing the computational cost of NAS. Existing methods search for the best architecture by choosing candidate operations with higher architecture weights. However, architecture weights cannot accurately reflect the importance of each operation; that is, the operation with the highest weight might not yield the best performance. To circumvent this deficiency, we propose a novel indicator that can fully represent the operation importance and, thus, serve as an effective metric to guide the model search. Based on this indicator, we further develop a NAS scheme for 'exploiting operation importance for effective NAS' (EoiNAS). More precisely, we propose a high-order Markov chain-based strategy to slim the search space to further improve search efficiency and accuracy. To evaluate the effectiveness of the proposed EoiNAS, we applied our method to two tasks: image classification and semantic segmentation. Extensive experiments on both tasks provided strong evidence that our method is capable of discovering high-performance architectures while guaranteeing the requisite efficiency during searching.
KW - High-order Markov chain
KW - image classification
KW - neural architecture search (NAS)
KW - semantic segmentation
UR - http://www.scopus.com/inward/record.url?scp=85107203136&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85107203136&partnerID=8YFLogxK
DO - 10.1109/TNNLS.2021.3072950
M3 - Article
C2 - 33999825
AN - SCOPUS:85107203136
SN - 2162-237X
VL - 33
SP - 6235
EP - 6248
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 11
ER -