Topics in semantic representation

Thomas L. Griffiths, Mark Steyvers, Joshua B. Tenenbaum

Research output: Contribution to journal › Article › peer-review

815 Scopus citations


Processing language requires the retrieval of concepts from memory in response to an ongoing stream of information. This retrieval is facilitated if one can infer the gist of a sentence, conversation, or document and use that gist to predict related concepts and disambiguate words. This article analyzes the abstract computational problem underlying the extraction and use of gist, formulating this problem as a rational statistical inference. This leads to a novel approach to semantic representation in which word meanings are represented in terms of a set of probabilistic topics. The topic model performs well in predicting word association and the effects of semantic association and ambiguity on a variety of language-processing and memory tasks. It also provides a foundation for developing more richly structured statistical models of language, as the generative process assumed in the topic model can easily be extended to incorporate other kinds of semantic and syntactic structure. PsycINFO Database Record (c) 2007 APA, all rights reserved.
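The abstract describes a generative view of gist: a document's topic mixture is sampled first, then each word is drawn from a topic chosen according to that mixture. As a minimal sketch of this kind of generative process (not the paper's actual model or parameters; the vocabulary, topics, and probabilities below are invented toy values for illustration):

```python
import random

random.seed(0)

# Two hypothetical topics: each is a probability distribution over words.
# A word like "bank" appears in both, so its meaning is disambiguated
# by the document's inferred topic mixture (its "gist").
vocab = ["money", "bank", "loan", "river", "stream", "water"]
topics = [
    {"money": 0.4, "bank": 0.3, "loan": 0.3},                   # "finance" topic
    {"river": 0.4, "stream": 0.3, "water": 0.2, "bank": 0.1},   # "nature" topic
]

def generate_document(topic_weights, n_words):
    """Generative process assumed by topic models: for each word,
    sample a topic from the document's topic mixture, then sample
    the word from that topic's distribution over the vocabulary."""
    doc = []
    for _ in range(n_words):
        topic = random.choices(topics, weights=topic_weights)[0]
        words = list(topic)
        probs = list(topic.values())
        doc.append(random.choices(words, weights=probs)[0])
    return doc

# A document whose gist mixes finance (70%) and nature (30%):
print(generate_document([0.7, 0.3], 10))
```

Inverting this process by statistical inference (recovering the topic mixture from observed words) is what lets such a model predict related concepts and resolve ambiguous words.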

Original language: English (US)
Pages (from-to): 211-244
Number of pages: 34
Journal: Psychological Review
Issue number: 2
State: Published - Apr 2007
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • General Psychology

Keywords

  • Bayesian models
  • Computational models
  • Probabilistic models
  • Semantic memory
  • Semantic representation


