Annotating Derivations: A New Evaluation Strategy and Dataset for Algebra Word Problems
- Shyam Upadhyay
- Ming-Wei Chang
We propose a new evaluation for automatic solvers for algebra word problems, which can identify mistakes that existing evaluations overlook. Our proposal is to evaluate such solvers using derivations, which reflect how an equation system was constructed from the word problem. To accomplish this, we develop an algorithm for checking the equivalence between two derivations, and show how derivation annotations can be semi-automatically added to existing datasets. To make our experiments more comprehensive, we include derivation annotations for DRAW-1K, a new dataset containing 1000 general algebra word problems. In our experiments, we find that the annotated derivations enable a more accurate evaluation of automatic solvers than previously used metrics. We release derivation annotations for over 2300 algebra word problems for future evaluations.
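To make the derivation-based evaluation concrete, here is a minimal sketch of an equivalence check, assuming a derivation is a pair of a template (a set of equation strings over coefficient slots such as `a`, `b` and unknown slots such as `m`, `n`) and an alignment mapping each coefficient slot to the number it was filled with. This representation and the brute-force permutation matching are illustrative assumptions, not the paper's actual algorithm.

```python
# Sketch: two derivations are equivalent if some renaming of coefficient
# slots makes the templates identical (as equation sets) and makes the
# alignments assign the same numbers to corresponding slots.
from itertools import permutations

def rename(equation, renaming):
    # Token-wise slot renaming; assumes equations are whitespace-tokenized.
    return " ".join(renaming.get(tok, tok) for tok in equation.split())

def equivalent(deriv1, deriv2):
    (tmpl1, align1), (tmpl2, align2) = deriv1, deriv2
    slots1, slots2 = sorted(align1), sorted(align2)
    if len(tmpl1) != len(tmpl2) or len(slots1) != len(slots2):
        return False
    # Try every renaming of slots in deriv1 onto slots in deriv2.
    for perm in permutations(slots2):
        renaming = dict(zip(slots1, perm))
        renamed = {rename(eq, renaming) for eq in tmpl1}
        if renamed == set(tmpl2) and all(
            align1[s] == align2[renaming[s]] for s in slots1
        ):
            return True
    return False

# The same system written with its slots named in a different order:
d1 = ({"a * m + b * n = c"}, {"a": 2, "b": 3, "c": 12})
d2 = ({"b * m + a * n = c"}, {"b": 2, "a": 3, "c": 12})
print(equivalent(d1, d2))  # True: the a<->b renaming reconciles them
```

Note that this sketch compares equations as strings, so it treats, e.g., `a * m + b * n` and `b * n + a * m` as different; a full checker would compare canonicalized equation forms instead.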
Related Tools
Diverse Algebra Word Problem Dataset with Derivation Annotations
May 25, 2016
This dataset provides training and testing examples for solving algebra word problems automatically. It consists of over 2000 algebra word problems, each annotated with the full derivation (template + alignments) of the relevant equations from the problem text. Please refer to the paper for details. Templates annotated across the datasets were globally reconciled. The cross-validation and train-test splits used in the papers are also provided (thanks to the respective authors for sharing them).
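As an illustration of what a derivation annotation pairs together, the snippet below builds one hypothetical record; the field names and layout are assumptions for this sketch, not the dataset's actual schema (see the release's documentation for that).

```python
# Hypothetical shape of one annotated problem: the raw text, the template
# with coefficient slots, the alignment of slots to evidence in the text,
# and the instantiated equation system the template derives.
import json

record = {
    "text": "The sum of two numbers is 12. One number is twice the other. "
            "Find the numbers.",
    "template": ["m + n = a", "m = b * n"],    # equations with slots a, b
    "alignment": {"a": "12", "b": "twice"},    # slot -> span in the text
    "equations": ["m + n = 12", "m = 2 * n"],  # instantiated system
    "solution": {"m": 8, "n": 4},
}
print(json.dumps(record, indent=2))
```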