Harnessing Pre-Trained Neural Networks with Rules for Formality Style Transfer

  • Yunli Wang,
  • Yu Wu,
  • Lili Mou,
  • Zhoujun Li,
  • WenHan Chao

2019 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Published by Association for Computational Linguistics


Formality text style transfer plays an important role in various NLP applications, such as non-native speaker assistance and child education. Early studies normalized informal sentences with rules, before statistical and neural models became the prevailing methods in the field. While a rule-based system is still a common preprocessing step for formality style transfer in the neural era, it can introduce noise if the rules are applied naively, e.g., as plain data preprocessing. To mitigate this problem, we study how to harness rules in a state-of-the-art neural network that is typically pretrained on massive corpora. We propose three fine-tuning methods in this paper and achieve a new state of the art on benchmark datasets.
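To illustrate the rule-based normalization the abstract refers to, here is a minimal sketch of such a preprocessing step. The rules below are hypothetical examples of informal-to-formal rewrites, not the authors' actual rule set:

```python
import re

# Illustrative rewrite rules (assumed for this sketch, not taken
# from the paper): each maps an informal pattern to a formal form.
RULES = [
    (re.compile(r"\bu\b"), "you"),
    (re.compile(r"\br\b"), "are"),
    (re.compile(r"\bgonna\b"), "going to"),
    (re.compile(r"(\w)\1{2,}"), r"\1"),  # collapse letter repeats: "soooo" -> "so"
]

def normalize(sentence: str) -> str:
    """Apply each rewrite rule in order to an informal sentence."""
    for pattern, replacement in RULES:
        sentence = pattern.sub(replacement, sentence)
    return sentence

print(normalize("u r soooo gonna love this"))
# -> "you are so going to love this"
```

Applying such rules blindly before neural training is exactly the naive usage the paper argues can introduce noise, e.g., when a rule fires in a context where the informal token was correct.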