TY - GEN
T1 - Nonparametric spherical topic modeling with word embeddings
AU - Batmanghelich, Nematollah Kayhan
AU - Saeedi, Ardavan
AU - Narasimhan, Karthik R.
AU - Gershman, Samuel J.
N1 - Publisher Copyright:
© 2016 Association for Computational Linguistics.
PY - 2016
Y1 - 2016
N2 - Traditional topic models do not account for semantic regularities in language. Recent distributional representations of words exhibit semantic consistency over directional metrics such as cosine similarity. However, neither categorical nor Gaussian observational distributions used in existing topic models are appropriate to leverage such correlations. In this paper, we propose to use the von Mises-Fisher distribution to model the density of words over a unit sphere. Such a representation is well-suited for directional data. We use a Hierarchical Dirichlet Process for our base topic model and propose an efficient inference algorithm based on Stochastic Variational Inference. This model enables us to naturally exploit the semantic structures of word embeddings while flexibly discovering the number of topics. Experiments demonstrate that our method outperforms competitive approaches in terms of topic coherence on two different text corpora while offering efficient inference.
AB - Traditional topic models do not account for semantic regularities in language. Recent distributional representations of words exhibit semantic consistency over directional metrics such as cosine similarity. However, neither categorical nor Gaussian observational distributions used in existing topic models are appropriate to leverage such correlations. In this paper, we propose to use the von Mises-Fisher distribution to model the density of words over a unit sphere. Such a representation is well-suited for directional data. We use a Hierarchical Dirichlet Process for our base topic model and propose an efficient inference algorithm based on Stochastic Variational Inference. This model enables us to naturally exploit the semantic structures of word embeddings while flexibly discovering the number of topics. Experiments demonstrate that our method outperforms competitive approaches in terms of topic coherence on two different text corpora while offering efficient inference.
UR - http://www.scopus.com/inward/record.url?scp=85016629898&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85016629898&partnerID=8YFLogxK
U2 - 10.18653/v1/p16-2087
DO - 10.18653/v1/p16-2087
M3 - Conference contribution
AN - SCOPUS:85016629898
T3 - 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Short Papers
SP - 537
EP - 542
BT - 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Short Papers
PB - Association for Computational Linguistics (ACL)
T2 - 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016
Y2 - 7 August 2016 through 12 August 2016
ER -