{"id":183809,"date":"2005-11-18T00:00:00","date_gmt":"2009-10-31T13:03:31","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/multi-engine-machine-translation-guided-by-explicit-word-matching\/"},"modified":"2023-02-15T09:59:03","modified_gmt":"2023-02-15T17:59:03","slug":"multi-engine-machine-translation-guided-by-explicit-word-matching","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/multi-engine-machine-translation-guided-by-explicit-word-matching\/","title":{"rendered":"Multi-Engine Machine Translation Guided by Explicit Word Matching"},"content":{"rendered":"<div class=\"asset-content\">\n<p>In this talk, I will describe a new approach that we have been developing for synthetically combining the output of several different Machine Translation (MT) engines operating on the same input. The goal of this work is to produce a synthetic combination that surpasses all of the original systems in translation quality. Our approach uses the individual MT engines as &#8220;black boxes&#8221; and does not require any explicit cooperation from the original MT systems. An explicit word matcher is first used to identify and align the words that are common between the MT engine outputs. The matcher can match not only identical words, but also morphological variants and synonyms. A decoding algorithm then uses this information, in conjunction with confidence estimates for the various engines and a trigram language model, to score and rank a collection of sentence hypotheses that are synthetic combinations of words from the various original engines. 
The highest-scoring sentence hypothesis is selected as the final output of our system.<\/p>\n<p>Experiments conducted on combining several Arabic-to-English and several Chinese-to-English online translation systems demonstrate that our multi-engine combination system provides an improvement of about 6% over the best original system, and is about equal in translation quality to an &#8220;oracle&#8221; capable of selecting the best of the original systems on a sentence-by-sentence basis. I will describe the details of the approach, and several planned extensions for further improving its effectiveness.<\/p>\n<\/div>\n<p><!-- .asset-content --><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In this talk, I will describe a new approach that we have been developing for synthetically combining the output of several different Machine Translation (MT) engines operating on the same input. The goal of this work is to produce a synthetic combination that surpasses all of the original systems in translation quality. 
Our approach [&hellip;]<\/p>\n","protected":false},"featured_media":195204,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_hide_image_in_river":0,"footnotes":""},"research-area":[13545],"msr-video-type":[],"msr-locale":[268875],"msr-post-option":[],"msr-session-type":[],"msr-impact-theme":[],"msr-pillar":[],"msr-episode":[],"msr-research-theme":[],"class_list":["post-183809","msr-video","type-msr-video","status-publish","has-post-thumbnail","hentry","msr-research-area-human-language-technologies","msr-locale-en_us"],"msr_download_urls":"","msr_external_url":"https:\/\/youtu.be\/NV6VKOs8y4c","msr_secondary_video_url":"","msr_video_file":"","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/183809","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-video"}],"version-history":[{"count":2,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/183809\/revisions"}],"predecessor-version":[{"id":919704,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/183809\/revisions\/919704"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/195204"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=183809"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=183809"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=183809"},{"taxonomy":"msr-locale","embeddable":true,"href":"https
:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=183809"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=183809"},{"taxonomy":"msr-session-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-session-type?post=183809"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=183809"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=183809"},{"taxonomy":"msr-episode","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-episode?post=183809"},{"taxonomy":"msr-research-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-theme?post=183809"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}