In this paper, we propose Neural Phrase-based Machine Translation (NPMT). Our method explicitly models the phrase structures in output sequences through Sleep-WAke Networks (SWAN), a recently proposed segmentation-based sequence modeling method. To alleviate SWAN's monotonic alignment requirement, we introduce a new layer that performs (soft) local reordering of input sequences. Our experiments show that NPMT achieves state-of-the-art results on the IWSLT 2014 German-English translation task without using any attention mechanisms. We also observe that our method produces meaningful phrases in the output language.
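To illustrate the idea of (soft) local reordering mentioned above, here is a minimal NumPy sketch. It is not the paper's exact parameterization: the window size `tau`, the gate weights `W`, and the sigmoid-gated mixing are illustrative assumptions. The intent is only to show how each position can emit a gated combination of its neighbors, so that nearby tokens can effectively swap positions in a differentiable way.

```python
import numpy as np

def soft_local_reorder(x, W, tau=1):
    """Hypothetical soft local reordering layer: each output position t
    mixes its 2*tau+1 neighboring input vectors with learned sigmoid
    gates, allowing nearby tokens to be reordered softly.

    x: (T, d) input embeddings; W: (2*tau+1, (2*tau+1)*d) gate weights.
    """
    T, d = x.shape
    win = 2 * tau + 1
    # zero-pad so every position has a full window of neighbors
    xp = np.pad(x, ((tau, tau), (0, 0)))
    out = np.zeros_like(x)
    for t in range(T):
        window = xp[t:t + win]                     # (win, d) neighbors
        ctx = window.reshape(-1)                   # concatenated window
        gates = 1.0 / (1.0 + np.exp(-(W @ ctx)))   # (win,) sigmoid gates
        out[t] = np.tanh(gates @ window)           # gated mix of neighbors
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 4))        # toy sequence: 5 tokens, dim 4
W = rng.standard_normal((3, 12)) * 0.1
y = soft_local_reorder(x, W, tau=1)
print(y.shape)
```

Because the gates depend on the whole local window, the layer can learn to down-weight a token at its original position and up-weight it at a neighboring one, which is the soft-reordering behavior the abstract refers to.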