{"id":188356,"date":"2012-07-19T00:00:00","date_gmt":"2012-08-27T13:31:55","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/extensions-of-bayesian-optimization-for-real-world-applications\/"},"modified":"2016-08-02T06:11:10","modified_gmt":"2016-08-02T13:11:10","slug":"extensions-of-bayesian-optimization-for-real-world-applications","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/extensions-of-bayesian-optimization-for-real-world-applications\/","title":{"rendered":"Extensions of Bayesian Optimization for Real-World Applications"},"content":{"rendered":"<div class=\"asset-content\">\n<p>Bayesian Optimization (BO) is a popular approach in statistics and machine learning for the global optimization of expensive black-box functions. It has strong theoretical foundations and also yields state-of-the-art empirical results for optimizing functions with a small number of continuous inputs. However, many black-box optimization problems in real-world applications do not fit into this scope. For example, the &#8220;algorithm configuration&#8221; problem of identifying the best instantiation of a parametric algorithm poses various challenges to BO, including: high dimensionality, mixed discrete\/continuous optimization, function evaluations of varying costs, partial function evaluations that only yield a bound on the true function value, and the need for computational efficiency across tens of thousands of function evaluations. In this talk, I discuss recent work at UBC that extends BO to handle these challenges. 
Empirical results demonstrate that the resulting methods achieve state-of-the-art performance for configuring algorithms that solve hard combinatorial problems and for configuring machine learning classifiers.<\/p>\n<p>Based on joint work with Holger Hoos, Kevin Leyton-Brown, and Nando de Freitas and his machine learning group.<\/p>\n<\/div>\n<p><!-- .asset-content --><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Bayesian Optimization (BO) is a popular approach in statistics and machine learning for the global optimization of expensive black-box functions. It has strong theoretical foundations and also yields state-of-the-art empirical results for optimizing functions with a small number of continuous inputs. However, many black-box optimization problems in real-world applications do not fit into this scope. For example, the [&hellip;]<\/p>\n","protected":false},"featured_media":197130,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_hide_image_in_river":0,"footnotes":""},"research-area":[],"msr-video-type":[206954],"msr-locale":[268875],"msr-post-option":[],"msr-session-type":[],"msr-impact-theme":[],"msr-pillar":[],"msr-episode":[],"msr-research-theme":[],"class_list":["post-188356","msr-video","type-msr-video","status-publish","has-post-thumbnail","hentry","msr-video-type-microsoft-research-talks","msr-locale-en_us"],"msr_download_urls":"","msr_external_url":"https:\/\/youtu.be\/24a8pBisH_g","msr_secondary_video_url":"","msr_video_file":"","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/188356","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-video"}],"version-history":[{"count":0,"href":"https:\/\/
www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/188356\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/197130"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=188356"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=188356"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=188356"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=188356"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=188356"},{"taxonomy":"msr-session-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-session-type?post=188356"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=188356"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=188356"},{"taxonomy":"msr-episode","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-episode?post=188356"},{"taxonomy":"msr-research-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-theme?post=188356"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}