TY - JOUR
T1 - A Novel Multi-Stage Training Approach for Human Activity Recognition from Multimodal Wearable Sensor Data Using Deep Neural Network
AU - Mahmud, Tanvir
AU - Sazzad Sayyed, A. Q. M.
AU - Fattah, Shaikh Anowarul
AU - Kung, Sun Yuan
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/1/15
Y1 - 2021/1/15
AB - Deep neural networks are an effective choice for automatically recognizing human actions from data collected by various wearable sensors, since they automate feature extraction by relying entirely on data. However, noise in time series data, combined with complex inter-modal relationships among sensors, complicates this process. In this article, we propose a novel multi-stage training approach that increases diversity in the feature extraction process, enabling accurate recognition of actions by combining varieties of features extracted from diverse perspectives. Initially, instead of a single type of transformation, numerous transformations are applied to the time series data to obtain varied representations of the features encoded in the raw data. An efficient deep CNN architecture is proposed that can be trained individually to extract features from each transformed space. These CNN feature extractors are then merged into an optimal architecture that is fine-tuned to optimize the diversified extracted features through a combined training stage or multiple sequential training stages. This approach offers the opportunity to explore the features encoded in raw sensor data using multifarious observation windows, with broad scope for efficient feature selection before final convergence. Extensive experiments on three publicly available datasets show consistently outstanding performance, with average five-fold cross-validation accuracies of 99.29% on the UCI HAR database, 99.02% on the USC HAR database, and 97.21% on the SKODA database, outperforming other state-of-the-art approaches.
KW - CNN
KW - Sensor data processing
KW - activity recognition
KW - feature learning
KW - multi-stage training
UR - http://www.scopus.com/inward/record.url?scp=85098194597&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85098194597&partnerID=8YFLogxK
DO - 10.1109/JSEN.2020.3015781
M3 - Article
AN - SCOPUS:85098194597
VL - 21
SP - 1715
EP - 1726
JO - IEEE Sensors Journal
JF - IEEE Sensors Journal
SN - 1530-437X
IS - 2
M1 - 9164933
ER -