Learning from Explicit and Implicit Supervision Jointly For Algebra Word Problems

  • Shyam Upadhyay ,
  • Ming-Wei Chang ,
  • Kai-Wei Chang ,
  • Wen-tau Yih

EMNLP 2016

Published by Association for Computational Linguistics

Automatically solving algebra word problems has raised considerable interest recently. Existing state-of-the-art approaches mainly rely on learning from human annotated equations. In this paper, we demonstrate that it is possible to efficiently mine algebra problems and their numerical solutions with little to no manual effort. To leverage the mined dataset, we propose a novel structured-output learning algorithm that aims to learn from both explicit (e.g., equations) and implicit (e.g., solutions) supervision signals jointly. Enabled by this new algorithm, our model gains 4.6% absolute improvement in accuracy on the ALG514 benchmark compared to the one without using implicit supervision. The final model also outperforms the current state-of-the-art approach by 3%.
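The core idea of the abstract — jointly exploiting fully annotated equations (explicit supervision) and solution-only examples (implicit supervision) — can be illustrated with a toy structured-perceptron-style sketch. This is not the authors' algorithm or code; the candidate generator, feature dictionaries, and examples below are simplified assumptions for illustration only.

```python
# Hedged sketch (NOT the paper's implementation): joint learning from
# explicit supervision (gold equation known) and implicit supervision
# (only the numeric solution known), perceptron-style.

def score(w, feats):
    # Linear score of a candidate's feature dictionary under weights w.
    return sum(w.get(f, 0.0) * v for f, v in feats.items())

def update(w, good, bad, lr=1.0):
    # Promote features of a correct candidate, demote the wrong prediction.
    for f, v in good.items():
        w[f] = w.get(f, 0.0) + lr * v
    for f, v in bad.items():
        w[f] = w.get(f, 0.0) - lr * v

def train(examples, candidates, epochs=5):
    """examples: list of (problem_id, gold_equation_or_None, gold_solution).
    candidates[problem_id]: list of (equation, solution, features).
    When gold_equation is None, any candidate whose solution matches the
    gold solution counts as correct (the implicit signal)."""
    w = {}
    for _ in range(epochs):
        for pid, gold_eq, gold_sol in examples:
            cands = candidates[pid]
            pred = max(cands, key=lambda c: score(w, c[2]))
            if gold_eq is not None:          # explicit supervision
                correct = [c for c in cands if c[0] == gold_eq]
            else:                            # implicit supervision
                correct = [c for c in cands if c[1] == gold_sol]
            if not correct or pred in correct:
                continue
            best_correct = max(correct, key=lambda c: score(w, c[2]))
            update(w, best_correct[2], pred[2])
    return w
```

In this toy setting, a solution-only (implicit) example still produces a useful update whenever the top-scoring candidate's numeric answer disagrees with the mined solution, which mirrors the abstract's claim that implicit signals can supplement equation annotations.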

Related Tools

Learning from Explicit and Implicit Supervision Jointly For Algebra Word Problems

September 20, 2016

This is a public release of the dataset accompanying the paper "Learning from Explicit and Implicit Supervision Jointly For Algebra Word Problems," which appeared at EMNLP 2016. This set contains only the implicitly supervised examples, i.e., the word problems with noisy, partially annotated solutions.