Learning new facts from knowledge bases with neural tensor networks and semantic word vectors

Danqi Chen, Richard Socher, Christopher D. Manning, Andrew Y. Ng

Research output: Contribution to conference › Paper › peer-review

Abstract

Knowledge bases provide applications with the benefit of easily accessible, systematic relational knowledge, but in practice they often suffer from incompleteness and a lack of knowledge of new entities and relations. Much work has focused on building or extending them by finding patterns in large unannotated text corpora. In contrast, here we mainly aim to complete a knowledge base by predicting additional true relationships between entities, based on generalizations that can be discerned in the given knowledge base. We introduce a neural tensor network (NTN) model which predicts new relationship entries that can be added to the database. This model can be improved by initializing entity representations with word vectors learned in an unsupervised fashion from text; when this is done, existing relations can even be queried for entities that were not present in the database. Our model generalizes and outperforms existing models for this problem, and can classify unseen relationships in WordNet with an accuracy of 75.8%.
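For context, an NTN scores a candidate triple (e1, R, e2) with a relation-specific bilinear tensor followed by a nonlinearity. The sketch below is a minimal NumPy illustration of one common parametrization of that scoring function; the names, dimensions, and exact form (e.g., whether an additional standard feed-forward term is included) are assumptions for illustration, not the paper's verified implementation.

```python
# Minimal sketch of a neural tensor network (NTN) scoring function,
# assuming the form g(e1, R, e2) = u_R^T tanh(e1^T W_R^{[1:k]} e2).
# All names and dimensions here are illustrative assumptions.
import numpy as np

def ntn_score(e1, e2, W_R, u_R):
    """Score the plausibility that relation R holds between entities e1 and e2.

    e1, e2 : (d,) entity vectors (e.g., initialized from unsupervised word vectors)
    W_R    : (k, d, d) relation-specific tensor with k bilinear slices
    u_R    : (k,) weights combining the k slice activations into one score
    """
    # Compute e1^T W_R^{[i]} e2 for each slice i, giving one scalar per slice.
    bilinear = np.einsum('i,kij,j->k', e1, W_R, e2)
    return u_R @ np.tanh(bilinear)

# Illustrative usage with small random parameters.
rng = np.random.default_rng(0)
d, k = 50, 4
e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
W_R = 0.01 * rng.standard_normal((k, d, d))
u_R = rng.standard_normal(k)
print(ntn_score(e1, e2, W_R, u_R))  # higher score = relation judged more plausible
```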

Original language: English (US)
State: Published - Jan 1 2013
Externally published: Yes
Event: 1st International Conference on Learning Representations, ICLR 2013 - Scottsdale, United States
Duration: May 2 2013 – May 4 2013

Conference

Conference: 1st International Conference on Learning Representations, ICLR 2013
Country/Territory: United States
City: Scottsdale
Period: 5/2/13 – 5/4/13

All Science Journal Classification (ASJC) codes

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
