Integrating topics and syntax

Thomas L. Griffiths, Mark Steyvers, David M. Blei, Joshua B. Tenenbaum

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

277 Scopus citations

Abstract

Statistical approaches to language learning typically focus on either short-range syntactic dependencies or long-range semantic dependencies between words. We present a generative model that uses both kinds of dependencies, and can be used to simultaneously find syntactic classes and semantic topics despite having no representation of syntax or semantics beyond statistical dependency. On tasks like part-of-speech tagging and document classification, this model is competitive with models that exclusively use short- and long-range dependencies, respectively.
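The abstract describes a composite generative model: a hidden Markov model over word classes captures short-range syntactic dependencies, while a designated semantic class emits words from document-level topic distributions, capturing long-range dependencies. The following is a minimal sketch of that generative process; all sizes, parameter values, and the choice of class 0 as the semantic class are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

V = 8   # vocabulary size (assumed, for illustration)
C = 3   # number of HMM classes; class 0 plays the semantic role
T = 2   # number of topics

# HMM transition matrix over classes (each row is a distribution)
pi = rng.dirichlet(np.ones(C), size=C)
# Word-emission distributions for the purely syntactic classes 1..C-1
phi_class = rng.dirichlet(np.ones(V), size=C)
# Topic-word distributions used by the semantic class
phi_topic = rng.dirichlet(np.ones(V), size=T)

def generate_document(n_words, alpha=1.0):
    """Sample one document from the composite model."""
    theta = rng.dirichlet(alpha * np.ones(T))  # document's topic mixture
    words, c = [], 0
    for _ in range(n_words):
        c = rng.choice(C, p=pi[c])             # short-range: HMM transition
        if c == 0:                             # semantic class: topic emission
            z = rng.choice(T, p=theta)         # long-range: document topics
            w = rng.choice(V, p=phi_topic[z])
        else:                                  # syntactic class emission
            w = rng.choice(V, p=phi_class[c])
        words.append(int(w))
    return words

doc = generate_document(20)
```

Because `theta` is shared across the whole document while class transitions depend only on the previous word's class, words emitted by the semantic class reflect document-wide content and the remaining classes reflect local sequential structure.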

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 17 - Proceedings of the 2004 Conference, NIPS 2004
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262195348, 9780262195348
State: Published - 2005
Externally published: Yes
Event: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004 - Vancouver, BC, Canada
Duration: Dec 13, 2004 - Dec 16, 2004

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Other

Other: 18th Annual Conference on Neural Information Processing Systems, NIPS 2004
Country/Territory: Canada
City: Vancouver, BC
Period: 12/13/04 - 12/16/04

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
