{"id":190187,"date":"2013-12-06T00:00:00","date_gmt":"2013-12-10T10:24:10","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/msr-research-item\/scalable-learning-of-bayesian-network-classifiers\/"},"modified":"2016-08-02T06:11:47","modified_gmt":"2016-08-02T13:11:47","slug":"scalable-learning-of-bayesian-network-classifiers","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/scalable-learning-of-bayesian-network-classifiers\/","title":{"rendered":"Scalable learning of Bayesian network classifiers"},"content":{"rendered":"<div class=\"asset-content\">\n<p>I present our work on highly scalable out-of-core techniques for learning well-calibrated Bayesian network classifiers. Our techniques are based on a novel hybrid generative and discriminative learning paradigm. These algorithms:<\/p>\n<ul>\n<li>provide straightforward mechanisms for managing the bias-variance trade-off,<\/li>\n<li>have training time that is linear with respect to training set size,<\/li>\n<li>require as few as one and at most four passes through the training data,<\/li>\n<li>allow for incremental learning,<\/li>\n<li>are embarrassingly parallelisable,<\/li>\n<li>support anytime classification,<\/li>\n<li>provide direct, well-calibrated prediction of class probabilities,<\/li>\n<li>can learn using arbitrary loss functions,<\/li>\n<li>support direct handling of missing values, and<\/li>\n<li>exhibit robustness to noise in the training data.<\/li>\n<\/ul>\n<p>Despite their computational efficiency, the new algorithms deliver classification accuracy that is competitive with state-of-the-art discriminative learning techniques.<\/p>\n<\/div>\n<p><!-- .asset-content --><\/p>\n","protected":false},"excerpt":{"rendered":"<p>I present our work on highly scalable out-of-core techniques for learning well-calibrated Bayesian network classifiers. Our techniques are based on a novel hybrid generative and discriminative learning paradigm. 
These algorithms provide straightforward mechanisms for managing the bias-variance trade-off, have training time that is linear with respect to training set size, require as few [&hellip;]<\/p>\n","protected":false},"featured_media":198013,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_hide_image_in_river":0,"footnotes":""},"research-area":[],"msr-video-type":[206954],"msr-locale":[268875],"msr-post-option":[],"msr-session-type":[],"msr-impact-theme":[],"msr-pillar":[],"msr-episode":[],"msr-research-theme":[],"class_list":["post-190187","msr-video","type-msr-video","status-publish","has-post-thumbnail","hentry","msr-video-type-microsoft-research-talks","msr-locale-en_us"],"msr_download_urls":"","msr_external_url":"https:\/\/youtu.be\/HhysL6ex-7Y\/","msr_secondary_video_url":"","msr_video_file":"","_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/190187","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-video"}],"version-history":[{"count":0,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video\/190187\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/198013"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=190187"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=190187"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=190187"},{"taxonomy":"msr-l
ocale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=190187"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=190187"},{"taxonomy":"msr-session-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-session-type?post=190187"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=190187"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=190187"},{"taxonomy":"msr-episode","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-episode?post=190187"},{"taxonomy":"msr-research-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-research-theme?post=190187"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}