Recently, differentiable neural architecture search (NAS) methods have made significant progress in reducing the computational cost of NAS. Existing methods search for the best architecture by selecting the candidate operations with the highest architecture weights. However, architecture weights do not accurately reflect the importance of each operation; that is, the operation with the highest weight may not lead to the best performance. To circumvent this deficiency, we propose a novel indicator that fully represents the importance of each operation and thus serves as an effective metric to guide the search. Based on this indicator, we further develop a NAS scheme for "exploiting operation importance for effective NAS" (EoiNAS). More precisely, we propose a high-order Markov chain-based strategy to slim the search space, which further improves search efficiency and accuracy. To evaluate the effectiveness of the proposed EoiNAS, we applied our method to two tasks: image classification and semantic segmentation. Extensive experiments on both tasks provide strong evidence that our method discovers high-performance architectures while guaranteeing the requisite efficiency during the search.
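The selection rule the abstract critiques can be illustrated with a minimal sketch: in standard differentiable NAS (e.g., DARTS-style methods), each edge of the supernet carries learned architecture parameters, and the final architecture keeps, per edge, the candidate operation whose softmax weight is highest. All names below (the operation list, the `select_ops` helper, the example values) are hypothetical illustrations, not code from the paper.

```python
import numpy as np

# Hypothetical candidate operation set for a supernet edge.
CANDIDATE_OPS = ["skip_connect", "sep_conv_3x3", "max_pool_3x3", "zero"]

def select_ops(alpha):
    """Pick the highest-weight operation per edge.

    alpha: array of shape (num_edges, num_ops) holding the learned
    architecture parameters. A softmax turns them into per-edge weights;
    argmax then selects one operation per edge -- the practice the paper
    argues can be misleading, since the largest weight need not mark the
    operation that yields the best final accuracy.
    """
    exp = np.exp(alpha - alpha.max(axis=1, keepdims=True))  # stable softmax
    weights = exp / exp.sum(axis=1, keepdims=True)
    return [CANDIDATE_OPS[i] for i in weights.argmax(axis=1)]

# Two example edges with made-up parameter values.
alpha = np.array([[0.1, 1.2, 0.3, -0.5],
                  [0.9, 0.2, 0.8, 0.1]])
print(select_ops(alpha))  # one operation name per edge
```

Because the softmax is monotonic, the selection reduces to an argmax over the raw parameters; the paper's indicator is proposed precisely because this weight ranking can disagree with the ranking by actual performance.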
|Original language||English (US)|
|Journal||IEEE Transactions on Neural Networks and Learning Systems|
|State||Accepted/In press - 2021|
All Science Journal Classification (ASJC) codes
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence
Keywords
- Computer architecture
- High-order Markov chain
- Image classification
- Image segmentation
- Markov processes
- Neural architecture search (NAS)
- Semantic segmentation
- Task analysis