The translation sub-model is one of the most important components in statistical machine translation (SMT), but the conventional approach suffers from two major problems. First, the translation sub-model is not optimized with respect to any automatic evaluation metric of SMT (such as BLEU). Second, it over-fits the training data. This paper presents a new unified framework that adds a scalable translation sub-model to the conventional framework. The sub-model is optimized under the same criterion by which translation output is evaluated (BLEU), and is trained with the margin infused relaxed algorithm (MIRA) to mitigate over-fitting. Under our new framework, MIRA and minimum error rate training (MERT) are unified into an interactive training process. Our approach not only improves performance over a state-of-the-art baseline, but also generalizes well from in-domain training data to out-of-domain test data.
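To make the MIRA training mentioned above concrete, the following is a minimal sketch of a single-example MIRA update, not the paper's exact procedure. It assumes a linear model with feature vectors `phi_oracle` (for the BLEU-oracle translation) and `phi_pred` (for the current model's best translation), a `loss` such as the BLEU difference between them, and a clipping constant `C` that bounds each update and thereby limits over-fitting.

```python
def mira_update(w, phi_oracle, phi_pred, loss, C=1.0):
    """One passive-aggressive (MIRA-style) update: move w just enough
    that the oracle outscores the prediction by a margin of `loss`,
    with the step size clipped at C to keep updates conservative."""
    # Feature difference between the oracle and the model's prediction.
    delta = [a - b for a, b in zip(phi_oracle, phi_pred)]
    # Current margin of the oracle over the prediction.
    margin = sum(wi * di for wi, di in zip(w, delta))
    norm_sq = sum(d * d for d in delta)
    if norm_sq == 0.0:
        return list(w)  # identical features: nothing to update
    # Step size: smallest correction achieving the margin, clipped at C.
    tau = min(C, max(0.0, loss - margin) / norm_sq)
    return [wi + tau * di for wi, di in zip(w, delta)]

# Hypothetical two-feature example: after the update, the oracle's
# margin over the prediction meets the requested loss of 1.0.
w = mira_update([0.0, 0.0], [1.0, 0.0], [0.0, 1.0], loss=1.0)
```

In a full SMT tuning loop, this update would be interleaved with decoding (and, under the framework described here, with MERT) over many sentences; the clipping constant `C` is what keeps any single sentence from dominating the weights.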