TY - GEN
T1 - INVITED: Efficient Synthesis of Compact Deep Neural Networks
T2 - 57th ACM/IEEE Design Automation Conference, DAC 2020
AU - Xia, Wenhan
AU - Yin, Hongxu
AU - Jha, Niraj K.
N1 - Funding Information:
This work was supported by NSF under Grant No. CNS-1907381.
Publisher Copyright:
© 2020 IEEE.
PY - 2020/7
Y1 - 2020/7
AB - Deep neural networks (DNNs) have been deployed in myriad machine learning applications. However, advances in their accuracy are often achieved with increasingly complex and deep network architectures. These large, deep models are often unsuitable for real-world applications, due to their massive computational cost, high memory bandwidth requirements, and long latency. For example, autonomous driving requires fast inference on Internet-of-Things (IoT) edge devices operating under run-time energy and memory storage constraints. In such cases, compact DNNs can facilitate deployment due to their reduced energy consumption, memory requirements, and inference latency. Long short-term memories (LSTMs) are a type of recurrent neural network that has also found widespread use in sequential data modeling; LSTMs face a similar model size vs. accuracy trade-off. In this paper, we review major approaches for automatically synthesizing compact, yet accurate, DNN/LSTM models suitable for real-world applications. We also outline some challenges and future areas of exploration.
KW - Convolutional neural network
KW - Deep learning
KW - Grow-and-prune synthesis paradigm
KW - Long short-term memory
KW - Machine learning
KW - Model compression
UR - http://www.scopus.com/inward/record.url?scp=85093960716&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85093960716&partnerID=8YFLogxK
U2 - 10.1109/DAC18072.2020.9218529
DO - 10.1109/DAC18072.2020.9218529
M3 - Conference contribution
AN - SCOPUS:85093960716
T3 - Proceedings - Design Automation Conference
BT - 2020 57th ACM/IEEE Design Automation Conference, DAC 2020
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 20 July 2020 through 24 July 2020
ER -