{"id":658488,"date":"2020-05-12T17:58:21","date_gmt":"2020-05-13T00:58:21","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&#038;p=658488"},"modified":"2022-05-12T10:36:15","modified_gmt":"2022-05-12T17:36:15","slug":"project-flute","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/project-flute\/","title":{"rendered":"Project FLUTE"},"content":{"rendered":"<section class=\"mb-3 moray-highlight\">\n\t<div class=\"card-img-overlay mx-lg-0\">\n\t\t<div class=\"card-background  has-background-grey card-background--full-bleed\">\n\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"1920\" height=\"720\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720.png\" class=\"attachment-full size-full\" alt=\"FLUTE - colorful blue and green concentric circles\" style=\"\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720.png 1920w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720-300x113.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720-1024x384.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720-768x288.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720-1536x576.png 1536w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720-1600x600.png 1600w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_colorful-circles_header_1920x720-240x90.png 240w\" sizes=\"auto, (max-width: 1920px) 100vw, 1920px\" \/>\t\t<\/div>\n\t\t<!-- Foreground -->\n\t\t<div 
class=\"card-foreground d-flex mt-md-n5 my-lg-5 px-g px-lg-0\">\n\t\t\t<!-- Container -->\n\t\t\t<div class=\"container d-flex mt-md-n5 my-lg-5 align-self-center\">\n\t\t\t\t<!-- Card wrapper -->\n\t\t\t\t<div class=\"w-100 w-lg-col-5\">\n\t\t\t\t\t<!-- Card -->\n\t\t\t\t\t<div class=\"card material-md-card py-5 px-md-5\">\n\t\t\t\t\t\t<div class=\"card-body \">\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n<h1 id=\"project-flute\" class=\"h2\">Project FLUTE<\/h1>\n\n\n\n<p>Breaking barriers for Federated Learning research at scale<\/p>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n<p>FLUTE (Federated Learning Utilities for Testing and Experimentation) is a high-performance open-source platform for federated learning research and offline simulations at scale. The vision for FLUTE is to advance the state of the art in Federated Learning by providing task-agnostic support for a wide variety of scenarios and cutting-edge algorithms, with strong experimental results, in a user-friendly environment, while lowering the entry barriers to FL for data scientists and researchers. The key differentiator of FLUTE is the ease of implementing new scenarios for experimentation in core areas of active research, such as optimization, quantization, privacy, and scalability, within a robust simulator.<\/p>\n\n\n\n<div class=\"wp-block-media-text has-vertical-padding-none  is-stacked-on-mobile is-style-border is-style-offset-media--top\"><figure class=\"wp-block-media-text__media\"><img loading=\"lazy\" decoding=\"async\" width=\"922\" height=\"624\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_Azure-ML-Studio-graph.png\" alt=\"AML Native Integration graph\" class=\"wp-image-844258 size-full\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_Azure-ML-Studio-graph.png 922w, 
https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_Azure-ML-Studio-graph-300x203.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_Azure-ML-Studio-graph-768x520.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_Azure-ML-Studio-graph-240x162.png 240w\" sizes=\"auto, (max-width: 922px) 100vw, 922px\" \/><\/figure><div class=\"wp-block-media-text__content\">\n<h3 id=\"aml-native-integration\">AML Native Integration<\/h3>\n\n\n\n<p>FLUTE offers native AML integration for job submissions, providing a user-friendly interface to set up, track, and manage experiments. AML also lets users rerun or abort crashed jobs, analyze metrics, and download models and logs to a local machine. Besides AML, FLUTE runs seamlessly on stand-alone devices such as laptops and desktop machines, using local GPUs when available.<\/p>\n<\/div><\/div>\n\n\n\n<div style=\"padding-bottom:32px; padding-top:32px\" class=\"wp-block-msr-immersive-section alignfull row wp-block-msr-immersive-section\">\n\t\n\t<div class=\"container\">\n\t\t<div class=\"wp-block-msr-immersive-section__wrapper col-lg-11 col-xl-9 px-0 m-auto\">\n\t\t\t<h3 id=\"unlocking-fl-research-at-scale\">Unlocking FL Research at scale<\/h3>\n\n\n\n<p>Federated Learning technology, with FLUTE as part of it, has shown great promise in creating a new ecosystem of advanced non-centralized learning in the service of customer needs. Learning under privacy, heterogeneity, and security constraints is the path towards enterprise-ready ML, advancing our vision of a next-generation enterprise AI platform that supports constrained ML workloads.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"523\" 
src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_federated-learning-at-scale-infographic-1024x523.png\" alt=\"infographic table with federated learning details\" class=\"wp-image-844255\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_federated-learning-at-scale-infographic-1024x523.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_federated-learning-at-scale-infographic-300x153.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_federated-learning-at-scale-infographic-768x393.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_federated-learning-at-scale-infographic-240x123.png 240w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2020\/05\/FLUTE_federated-learning-at-scale-infographic.png 1158w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\t\t<\/div>\n\t<\/div>\n\n\t<\/div>\n\n\n\n<div style=\"padding-bottom:0; padding-top:0\" class=\"wp-block-msr-immersive-section alignfull row wp-block-msr-immersive-section\">\n\t\n\t<div class=\"container\">\n\t\t<div class=\"wp-block-msr-immersive-section__wrapper\">\n\t\t\t<div class=\"wp-block-media-text has-vertical-margin-small  has-vertical-padding-none  has-media-on-the-right is-stacked-on-mobile is-style-border\"><figure class=\"wp-block-media-text__media\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"535\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/05\/1200x627_Flute_Blog_still_with_logo-1024x535.png\" alt=\"This diagram shows a payload exchange between a server, inside Worker 0, and clients that live inside Workers 2 and 3. First, the server pushes the central ML model plus the clients\u2019 data to Workers 2 and 3. Then, each client trains the model with their local data. 
Finally, the clients send the pseudo-gradients of this new model back to the server for aggregation and the creation of a new global model.\" class=\"wp-image-842323 size-full\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/05\/1200x627_Flute_Blog_still_with_logo-1024x535.png 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/05\/1200x627_Flute_Blog_still_with_logo-300x157.png 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/05\/1200x627_Flute_Blog_still_with_logo-768x401.png 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/05\/1200x627_Flute_Blog_still_with_logo-1536x803.png 1536w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/05\/1200x627_Flute_Blog_still_with_logo-2048x1070.png 2048w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2022\/05\/1200x627_Flute_Blog_still_with_logo-240x125.png 240w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure><div class=\"wp-block-media-text__content\">\n<h3 id=\"simplifies-rapid-prototyping\">Simplifies rapid prototyping<\/h3>\n\n\n\n<p>FLUTE makes it easy to plug in novel algorithms and perform proof-of-concept experiments for any given model combination. Some of the features already included in FLUTE are:<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li>FedAvg<\/li><li>Dynamic Gradient Aggregation<\/li><li>Adaptive Optimization<\/li><li>Differential Privacy<\/li><li>Quantization<\/li><li>Dropout and Stale Clients<\/li><\/ul>\n<\/div><\/div>\t\t<\/div>\n\t<\/div>\n\n\t<\/div>\n\n\n","protected":false},"excerpt":{"rendered":"<p>A novel framework for training models in a Federated Learning fashion. One of the novelties of the project is the first attempt to introduce Federated Learning to Speech Recognition tasks.  
Besides the novelty of the task, the paper describes an easily generalizable FL platform and some of the design decisions used for this task. Among the novel algorithms introduced are a new hierarchical optimization scheme, a gradient selection algorithm, and self-supervised training algorithms.<\/p>\n","protected":false},"featured_media":844261,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","footnotes":""},"research-area":[13556,13558,13547],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-658488","msr-project","type-msr-project","status-publish","has-post-thumbnail","hentry","msr-research-area-artificial-intelligence","msr-research-area-security-privacy-cryptography","msr-research-area-systems-and-networking","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[658515,658521,658527,683817,767728,781984,786691,830107,837859,841663,846901],"related-downloads":[],"related-videos":[],"related-groups":[702211,756487,761911],"related-events":[],"related-opportunities":[],"related-posts":[842272],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Mirian Hipolito Garcia","user_id":40483,"people_section":"Section name 0","alias":"mirianh"},{"type":"user_nicename","display_name":"Daniel Eduardo Madrigal Diaz","user_id":40480,"people_section":"Section name 0","alias":"danielmad"},{"type":"user_nicename","display_name":"Robert Sim","user_id":36650,"people_section":"Section name 0","alias":"rsim"},{"type":"guest","display_name":"Kenichi Kumatani","user_id":607155,"people_section":"Section name 0","alias":""},{"type":"user_nicename","display_name":"Robert Gmyr","user_id":38487,"people_section":"Section name 
0","alias":"rogmyr"}],"msr_research_lab":[199565],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/658488","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":9,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/658488\/revisions"}],"predecessor-version":[{"id":857412,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/658488\/revisions\/857412"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/844261"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=658488"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=658488"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=658488"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=658488"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=658488"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}