TY - JOUR
T1 - Learning Mixtures of Linear Dynamical Systems
AU - Chen, Yanxi
AU - Poor, H. Vincent
N1 - Funding Information:
Y. Chen is supported in part by the ARO grant W911NF-20-1-0097, the NSF grants CCF-1907661 and IIS-1900140, and the AFOSR grant FA9550-19-1-0030. H. V. Poor is supported in part by the NSF under Grant CCF-1908308. We would like to thank Yuxin Chen and Gen Li for numerous helpful discussions.
Publisher Copyright:
Copyright © 2022 by the author(s)
PY - 2022
Y1 - 2022
N2 - We study the problem of learning a mixture of multiple linear dynamical systems (LDSs) from unlabeled short sample trajectories, each generated by one of the LDS models. Despite the wide applicability of mixture models for time-series data, learning algorithms that come with end-to-end performance guarantees are largely absent from existing literature. There are multiple sources of technical challenges, including but not limited to (1) the presence of latent variables (i.e. the unknown labels of trajectories); (2) the possibility that the sample trajectories might have lengths much smaller than the dimension d of the LDS models; and (3) the complicated temporal dependence inherent to time-series data. To tackle these challenges, we develop a two-stage meta-algorithm, which is guaranteed to efficiently recover each ground-truth LDS model up to error Õ(√(d/T)), where T is the total sample size. We validate our theoretical studies with numerical experiments, confirming the efficacy of the proposed algorithm.
AB - We study the problem of learning a mixture of multiple linear dynamical systems (LDSs) from unlabeled short sample trajectories, each generated by one of the LDS models. Despite the wide applicability of mixture models for time-series data, learning algorithms that come with end-to-end performance guarantees are largely absent from existing literature. There are multiple sources of technical challenges, including but not limited to (1) the presence of latent variables (i.e. the unknown labels of trajectories); (2) the possibility that the sample trajectories might have lengths much smaller than the dimension d of the LDS models; and (3) the complicated temporal dependence inherent to time-series data. To tackle these challenges, we develop a two-stage meta-algorithm, which is guaranteed to efficiently recover each ground-truth LDS model up to error Õ(√(d/T)), where T is the total sample size. We validate our theoretical studies with numerical experiments, confirming the efficacy of the proposed algorithm.
UR - http://www.scopus.com/inward/record.url?scp=85163137388&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85163137388&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85163137388
SN - 2640-3498
VL - 162
SP - 3507
EP - 3557
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 39th International Conference on Machine Learning, ICML 2022
Y2 - 17 July 2022 through 23 July 2022
ER -