Representing Text for Joint Embedding of Text and Knowledge Bases

  • Kristina Toutanova,
  • Danqi Chen,
  • Patrick Pantel,
  • Pallavi Choudhury,
  • Michael Gamon

Empirical Methods in Natural Language Processing (EMNLP)

Published by ACL - Association for Computational Linguistics

Models that learn to represent textual and knowledge base relations in the same continuous latent space are able to perform joint inferences between the two kinds of relations and obtain high accuracy on knowledge base completion (Riedel et al., 2013). In this paper we propose a model that captures the compositional structure of textual relations, and jointly optimizes entity, knowledge base, and textual relation representations. The proposed model significantly improves performance over a model that does not share parameters among textual relations with common sub-structure.
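To make the shared-latent-space idea concrete, the following is a minimal sketch, not the paper's implementation: entity and KB-relation vectors live in the same space as textual-relation vectors that are *composed* from word embeddings, and both relation types are scored against entity pairs with the same bilinear (DistMult-style) function. The entity names, dimensionality, and the mean-pooling composition are hypothetical placeholders standing in for a more expressive compositional encoder; parameter sharing across textual relations arises because patterns with common words reuse the same word vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 50  # embedding dimensionality (hypothetical)

# Entity and KB-relation embeddings; in the full model these are
# learned jointly with the textual-relation parameters below.
entity_emb = {"barack_obama": rng.normal(size=DIM),
              "hawaii": rng.normal(size=DIM)}
kb_rel_emb = {"place_of_birth": rng.normal(size=DIM)}

# Word embeddings used to *compose* textual-relation representations,
# so textual relations with common sub-structure share parameters.
word_emb = {w: rng.normal(size=DIM) for w in ["was", "born", "in"]}

def compose_textual_relation(tokens):
    """Compose a relation vector from the words of a textual pattern.
    Mean pooling is a stand-in for a richer compositional encoder;
    the point is that word parameters are shared across patterns."""
    return np.mean([word_emb[t] for t in tokens], axis=0)

def score(subj, rel_vec, obj):
    """Bilinear (DistMult-style) score: e_s^T diag(r) e_o."""
    return float(np.sum(entity_emb[subj] * rel_vec * entity_emb[obj]))

# A KB triple and a textual pattern scored in the same latent space.
kb_score = score("barack_obama", kb_rel_emb["place_of_birth"], "hawaii")
text_score = score("barack_obama",
                   compose_textual_relation(["was", "born", "in"]),
                   "hawaii")
print(kb_score, text_score)
```

In training, all three parameter sets (entities, KB relations, word vectors feeding the composition) would be optimized jointly against observed KB and textual triples, which is what lets evidence from text transfer to knowledge base completion.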