Abstract
Pattern-recognition algorithms from machine learning play a prominent role in embedded sensing systems, where they derive inferences from sensor data. Such systems often face severe energy constraints, especially when dealing with high-dimensional data such as images. This study focuses on reducing computational energy by exploiting transfer learning and energy-efficient dataflow accelerators. We show that convolutional autoencoders open up several opportunities to reduce computational energy without a significant loss in inference performance when multiple task categories are targeted. We validate our approach through a multi-task case study on a set of images, each labeled with four task categories: gender, smile, glasses, and pose. To minimize inference time and computational energy, a convolutional autoencoder is used to learn a generalized representation of the images. Three scenarios are analyzed: transferring layers from convolutional autoencoders, transferring layers from convolutional neural networks trained on different tasks, and no layer transfer. We show that when the convolutional layers and one fully-connected layer are transferred from a convolutional autoencoder, computational energy is reduced by 6.58× while performance improves by 1.98, 1.88, 4.11, and 1.47 percent for gender, smile, glasses, and pose inference, respectively, compared to the no-transfer method when the number of training samples is small.
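The transfer scheme described in the abstract could look roughly like the following PyTorch sketch: a convolutional autoencoder is pretrained to reconstruct images, and its convolutional layers plus one fully-connected layer are then reused by lightweight per-task classifiers. This is a minimal illustration, not the paper's implementation; the layer sizes, the 32×32 grayscale inputs, and the class names `ConvAutoencoder` and `TaskClassifier` are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    """Convolutional autoencoder that learns a shared image representation."""
    def __init__(self):
        super().__init__()
        # Encoder: convolutional layers followed by one fully-connected layer.
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 32x32 -> 16x16
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
        )
        self.fc_enc = nn.Linear(32 * 8 * 8, 128)
        # Decoder mirrors the encoder to reconstruct the input image.
        self.fc_dec = nn.Linear(128, 32 * 8 * 8)
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1), nn.Sigmoid(),
        )

    def encode(self, x):
        return self.fc_enc(self.conv(x).flatten(1))

    def forward(self, x):
        y = self.fc_dec(self.encode(x)).view(-1, 32, 8, 8)
        return self.deconv(y)

class TaskClassifier(nn.Module):
    """Per-task head built on the layers transferred from the autoencoder."""
    def __init__(self, autoencoder, num_classes=2, freeze=True):
        super().__init__()
        # Transfer the convolutional layers and the first fully-connected layer.
        self.conv = autoencoder.conv
        self.fc_enc = autoencoder.fc_enc
        if freeze:
            for p in list(self.conv.parameters()) + list(self.fc_enc.parameters()):
                p.requires_grad = False
        self.head = nn.Linear(128, num_classes)  # task-specific output layer

    def forward(self, x):
        return self.head(self.fc_enc(self.conv(x).flatten(1)))

# Usage: pretrain the autoencoder on the images (reconstruction loss), then
# build one classifier per task (gender, smile, glasses, pose) from the
# shared layers and train only the small task-specific heads.
ae = ConvAutoencoder()
x = torch.randn(4, 1, 32, 32)      # dummy batch of 32x32 grayscale images
recon = ae(x)                      # reconstruction target during pretraining
gender_net = TaskClassifier(ae, num_classes=2)
pose_net = TaskClassifier(ae, num_classes=2)
logits = gender_net(x)
```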
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1045-1057 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Emerging Topics in Computing |
| Volume | 10 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2022 |
All Science Journal Classification (ASJC) codes
- Computer Science (miscellaneous)
- Information Systems
- Human-Computer Interaction
- Computer Science Applications
Keywords
- Convolutional neural networks
- energy reduction
- machine learning
- multi-task images
- transfer learning