RNNLM – Recurrent Neural Network Language Modeling Toolkit

  • Tomas Mikolov,
  • Stefan Kombrink,
  • Anoop Deoras,
  • Lukas Burget,
  • Jan Honza Cernocky

IEEE Automatic Speech Recognition and Understanding Workshop

We present a freely available open-source toolkit for training recurrent neural network based language models. It can easily be used to improve existing speech recognition and machine translation systems, and it can serve as a baseline for future research on advanced language modeling techniques. In the paper, we discuss optimal parameter selection and the different modes of functionality. The toolkit, example scripts and basic setups are freely available at http://rnnlm.sourceforge.net/.
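
As a rough illustration of how the toolkit is typically driven, the sketch below shows a training run followed by a test-set evaluation from the command line. It is a minimal sketch, assuming the flag names used in the example scripts distributed at rnnlm.sourceforge.net (-train, -valid, -rnnlm, -hidden, -class, -bptt, -test); the file names train.txt, valid.txt and test.txt, and all numeric values, are placeholders chosen only for illustration. Consult the toolkit's own documentation for the authoritative options.

    # Train a model on train.txt, using valid.txt for early stopping
    # (illustrative settings: 100 hidden neurons, 100 output classes,
    #  truncated BPTT over 4 time steps)
    ./rnnlm -train train.txt -valid valid.txt -rnnlm model \
            -hidden 100 -class 100 -bptt 4

    # Evaluate the trained model on held-out data (reports log-probability
    # and perplexity of test.txt)
    ./rnnlm -rnnlm model -test test.txt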