{"id":548040,"date":"2018-11-14T12:46:29","date_gmt":"2018-11-14T20:46:29","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-event&#038;p=548040"},"modified":"2025-08-06T11:56:43","modified_gmt":"2025-08-06T18:56:43","slug":"physics-ml-workshop","status":"publish","type":"msr-event","link":"https:\/\/www.microsoft.com\/en-us\/research\/event\/physics-ml-workshop\/","title":{"rendered":"Physics \u2229 ML"},"content":{"rendered":"\n\n<p><strong>Venue: <\/strong><br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.bing.com\/mapspreview?where1=14820%20NE%2036th%20Street%20Building%2099%20Redmond%20Washington%20USA%2098052\" target=\"_blank\" rel=\"noopener noreferrer\">Microsoft, Building 99,<br \/>\nRedmond, Washington, USA<\/a><\/p>\n<p><strong>Watch on demand:<\/strong><br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/resnet.microsoft.com\/video\/42679\" target=\"_blank\" rel=\"noopener noreferrer\">Day 1: 9:00 AM\u201310:30 AM<span class=\"sr-only\"> (opens in new tab)<\/span><\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/ml-applied-to-string-theory-and-ml-applied-to-condensed-matter\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 1: 11:00 AM\u201312:30 PM<\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/physics-%e2%88%a9-ml-workshop-day-1-short-talks\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 1: 2:00 PM\u20134:05 PM<\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/combinatorial-cosmology-plenary\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 2: 9:00 AM\u20139:45 AM<\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/physics-ml-short-talks-and-discussions\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 2: 10:15 AM\u201312:30 PM<\/a><span id=\"label-external-link\" 
class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<div style=\"background-color: #dddddd;color: black;padding: 25px 15px 5px 15px\">\n<blockquote><p><strong>Update May 2020:<\/strong> Following up on our workshop here, we have begun a virtual biweekly seminar series continuing our conversations on the interface between physics and ML. Please visit <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/physicsmeetsml.org\/\" target=\"_blank\" rel=\"noopener\">http:\/\/physicsmeetsml.org\/<span class=\"sr-only\"> (opens in new tab)<\/span><\/a> for more information.<\/p><\/blockquote>\n<\/div>\n<div style=\"height: 20px\"><\/div>\n<p>The goal of <em>Physics<\/em> \u2229 <em>ML<\/em> (read &#8216;Physics Meets ML&#8217;) is to bring together researchers from machine learning and physics to learn from each other and push research forward together. In this inaugural edition, we especially highlight recent progress in applying machine learning to string theory and in understanding deep learning from a physical perspective.\u00a0Nevertheless, we have invited participants with wide-ranging expertise in order to spark new ideas.\u00a0Plenary sessions from experts in each field and shorter specialized talks will introduce existing research.\u00a0We will hold moderated discussions and breakout groups in which participants can identify problems and hopefully begin new collaborations in both directions. 
For example, physical insights can motivate advanced algorithms in machine learning, and analysis of geometric and topological datasets with machine learning can yield critical new insights in fundamental physics.<\/p>\n<h3>Organizers<\/h3>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/gregyang\/\">Greg Yang<\/a>, Microsoft Research<br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.jhhalverson.com\/\" target=\"_blank\" rel=\"noopener\">Jim Halverson<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, Northeastern University<br \/>\nSven Krippendorf, LMU Munich<br \/>\nFabian Ruehle, CERN, Oxford University<br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.rakkyeongseong.com\" target=\"_blank\" rel=\"noopener\">Rak-Kyeong Seong<span class=\"sr-only\"> (opens in new tab)<\/span><\/a>, Samsung SDS<br \/>\nGary Shiu, University of Wisconsin<\/p>\n<h3>Microsoft Advisers<\/h3>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/cmbishop\/\">Chris Bishop<\/a>, Microsoft Research<br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/jchayes\/\">Jennifer Chayes<\/a>, Microsoft Research<br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/michaelf\/\">Michael Freedman<\/a>, Microsoft Research<br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/psmo\/\">Paul Smolensky<\/a>, Microsoft Research<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<h2>Day 1 | Thursday, April 25<\/h2>\n<table style=\"padding: 8px;width: 100%;text-align: left;border-bottom-color: #000000;border-bottom-width: 1px;border-bottom-style: solid;border-collapse: collapse;border-spacing: inherit\">\n<tbody>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><strong>Time 
(PDT)<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Speaker<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 1<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Plenary talks<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">8:00 AM\u20139:00 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breakfast<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:00 AM\u20139:45 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Gauge equivariant convolutional networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Taco Cohen<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:45 AM\u201310:30 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Understanding overparameterized neural networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Jascha Sohl-Dickstein<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">10:30 AM\u201311:00 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 
8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:00 AM\u201311:45 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Mathematical landscapes and string theory<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Mike Douglas<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:45 AM\u201312:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Holography, matter and deep learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Koji Hashimoto<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">12:30 PM\u20132:00 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Lunch<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:00 PM\u20134:05 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 2<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Applying physical insights to ML<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:00 PM\u20132:45 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Plenary: A picture of the energy landscape of deep neural networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pratik Chaudhari<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:45 
PM\u20134:05 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short talks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Neural tangent kernel and the dynamics of large neural nets<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Clement Hongler<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">On the global convergence of gradient descent for over-parameterized models using optimal transport<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">L\u00e9na\u00efc Chizat<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pathological spectrum of the Fisher information matrix in deep neural networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Ryo Karakida<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Fluctuation-dissipation relation for stochastic gradient descent<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\">Sho Yaida<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">From optimization algorithms to continuous dynamical systems and back<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rene Vidal<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The effect of network width on stochastic gradient descent and generalization<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Daniel Park<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short certificates for symmetric graph density inequalities<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rekha Thomas<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Geometric representation learning in hyperbolic space<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Maximilian Nickel<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td 
style=\"padding: 8px;border-bottom: 1px solid #000000\">The fundamental equations of MNIST<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Cedric Beny<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Quantum states and Lyapunov functions reshape universal grammar<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Paul Smolensky<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Multi-scale deep generative networks for Bayesian inverse problems<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pengchuan Zhang<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Variational quantum classifiers in the context of quantum machine learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Alex Bocharov<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td 
style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">4:05 PM\u20134:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">4:30 PM\u20135:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The intersect \u2229<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h2>Day 2 | Friday, April 26<\/h2>\n<table style=\"padding: 8px;width: 100%;text-align: left;border-bottom-color: #000000;border-bottom-width: 1px;border-bottom-style: solid;border-collapse: collapse;border-spacing: inherit\">\n<tbody>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><strong>Time (PDT)<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Speaker<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 3<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Applying ML to physics<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">8:00 AM\u20139:00 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breakfast<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 
8px;border-bottom: 1px solid #000000;width: 27%\">9:00 AM\u20139:45 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Plenary: Combinatorial Cosmology<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Liam McAllister<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:45 AM\u201310:15 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">10:15 AM\u201311:35 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short talks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Bypassing expensive steps in computational geometry<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yang-Hui He<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Learning string theory at Large N<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Cody Long<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Training machines to extrapolate reliably over astronomical scales<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Brent Nelson<\/td>\n<td 
style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breaking the tunnel vision with ML<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sergei Gukov<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Can machine learning give us new theoretical insights in physics and math?<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Washington Taylor<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Brief overview of machine learning holography<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yi-Zhuang You<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Applications of persistent homology to physics<\/td>\n<td style=\"padding: 8px;border-bottom: 
1px solid #000000\">Alex Cole<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Seeking a connection between the string landscape and particle physics<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Patrick Vaudrevange<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">PBs^-1 to science: novel approaches on real-time processing from LHCb at CERN<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Themis Bowcock<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">From non-parametric to parametric: manifold coordinates with physical meaning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Marina Meila<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Machine learning in quantum many-body physics: A blitz<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yichen Huang<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Knot Machine Learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Vishnu Jejjala<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:35 AM\u201312:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Panel discussion with panelists Michael Freedman, Clement Hongler, Gary Shiu, Paul Smolensky, Washington Taylor<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">12:30 PM\u20131:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Lunch<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 4<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Breakout groups<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">1:30 PM\u20133:00 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Physics breakout groups<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Symmetries and their realisations in string theory<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sergei 
Gukov, Yang-Hui He<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">String landscape<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Michael Douglas, Liam McAllister<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Connections of holography and ML<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Koji Hashimoto, Yi-Zhuang You<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">3:00 PM\u20134:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">ML breakout groups<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Geometric representations in deep learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Maximilian Nickel<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Understanding deep learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yasaman Bahri, Boris Hanin, Jaehoon Lee<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\">Physics and optimization<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rene Vidal<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<div style=\"height: 35px\"><\/div>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<p>\t<div data-wp-context='{\"items\":[]}' data-wp-interactive=\"msr\/accordion\">\n\t\t\t\t\t<div class=\"clearfix\">\n\t\t\t\t<div\n\t\t\t\t\tclass=\"btn-group align-items-center mb-g float-sm-right\"\n\t\t\t\t\tdata-bi-aN=\"accordion-collapse-controls\"\n\t\t\t\t>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Expand all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onExpandAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tExpand all\t\t\t\t\t<\/button>\n\t\t\t\t\t<span aria-hidden=\"true\"> | <\/span>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Collapse all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onCollapseAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tCollapse all\t\t\t\t\t<\/button>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t\t\t<ul class=\"msr-accordion\">\n\t\t\t\t\t\t\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-836\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-836\"\n\t\t\t\tclass=\"btn 
btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-835\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tGauge equivariant convolutional networks\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-835\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-836\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Taco Cohen<\/p>\n<p>The principle of equivariance to symmetry transformations enables a theoretically grounded approach to neural network architecture design. Equivariant networks have shown excellent performance and data efficiency on vision and medical imaging problems that exhibit symmetries. Here we show how this principle can be extended beyond global symmetries to local gauge transformations. This enables the development of a very general class of convolutional neural networks on manifolds that depend only on the intrinsic geometry. This class includes and generalizes existing methods from equivariant- and geometric deep learning, and thus unifies these areas in a common gauge-theoretic framework.<\/p>\n<p>We implement gauge equivariant CNNs for signals defined on the surface of the icosahedron, which provides a reasonable approximation of the sphere. By choosing to work with this very regular manifold, we are able to implement the gauge equivariant convolution using a single conv2d call, making it a highly scalable and practical alternative to Spherical CNNs. 
Using this method, we demonstrate substantial improvements over previous methods on the task of segmenting omnidirectional images and global climate patterns.<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-838\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-838\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-837\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tUnderstanding overparameterized neural networks\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-837\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-838\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<\/p>\n<p><strong>Speaker:<\/strong> Jascha Sohl-Dickstein<\/p>\n<p>As neural networks become highly overparameterized, their accuracy improves, and their behavior becomes easier to analyze theoretically. I will give an introduction to a rapidly growing body of work which examines the learning dynamics and prior over functions induced by infinitely wide, randomly initialized, neural networks. Core results that I will discuss include: that the distribution over functions computed by a wide neural network often corresponds to a Gaussian process with a particular compositional kernel, both before and after training; that the predictions of wide neural networks are linear in their parameters throughout training; and that this perspective enables analytic predictions for how trainability depends on hyperparameters and architecture. 
These results provide for surprising capabilities\u2014for instance, the evaluation of test set predictions which would come from an infinitely wide trained neural network without ever instantiating a neural network, or the rapid training of 10,000+ layer convolutional networks. I will argue that this growing understanding of neural networks in the limit of infinite width is foundational for future theoretical and practical understanding of deep learning.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-840\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-840\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-839\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tMathematical landscapes and string theory\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-839\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-840\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Mike Douglas<\/p>\n<p>A fundamental theory of physics must explain how to derive the laws of physics ab initio, from purely formal constructions. In string\/M theory there are general laws, such as general relativity and Yang-Mills theory, which are fixed by the theory. There are also specific details, such as the spectrum of elementary particles and the strengths of interactions between them. These are not determined uniquely, but are derived from the geometry of extra dimensions of space. 
This geometry must take one of a few special forms called special holonomy manifolds, for example a Calabi-Yau (CY) manifold, or a G2 manifold. These manifolds are of significant mathematical interest independent of their relevance to string theory, and mathematicians and physicists have been working together to classify them and work out the relations between them.<\/p>\n<p>Such data \u2014 a set of objects and relations between them, defined by simple axioms \u2014 can be called a mathematical landscape. There are many important landscapes besides special holonomy manifolds \u2014 finite groups, bundles on manifolds, other classes of manifolds, etc. Many landscapes, such as that of six-dimensional CY manifolds, turn out to be combinatorially large. It is a further challenge to extract simple and useful pictures from the vast wealth of their data. Machine learning will be an essential tool to meet this challenge.<\/p>\n<p>As an example, in the mid-1990s, the physicists Kreuzer and Skarke surveyed the 6-d toric hypersurface CYs to study mirror symmetry. The plot demonstrating this symmetry revealed a \u2018shield\u2019 pattern which nobody had anticipated or even thought to ask about, whose explanation was attempted in several later works, and which may be the key to understanding the finite number of these spaces. 
Future studies of mathematical landscapes will no doubt reveal new, unexpected patterns, especially if we have ML to help look for them.<\/p>\n<p>We will give a high-level survey of work in this direction, trying to minimize mathematical and physical technicalities, and to raise new questions.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-842\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-842\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-841\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tHolography, matter and deep learning\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-841\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-842\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Koji Hashimoto<\/p>\n<p>Revealing the quantum nature of matter is indispensable for finding new features of known exotic matter and even new materials, and drives progress in theoretical physics. One of the most important tasks is to explicitly construct ground-state wave functions for given Hamiltonians. On the other hand, the holographic principle, discovered in the context of string theory, claims an equivalence between quantum matter and classical higher-dimensional gravity. 
It has provided a completely new viewpoint on quantum matter, such as quarks in QCD and strongly correlated electrons.<\/p>\n<p>Both approaches require solving inverse problems, for which machine learning is intrinsically well suited, and they share a further important similarity: matter wave functions are obtained by tensor network optimization, and the holographic principle defines quantum gravity in which networks are regarded as discretized spacetime. Therefore, network optimization will play a central role in these sciences.<\/p>\n<p>In this talk, I give a brief review of several important topics related to these networks, and provide a concrete example of applying a deep neural network to the holographic principle. The higher-dimensional curved spacetime is discretized into a deep neural network, and the input data (quark correlators) are used to optimize the network. The neural network weights are regarded as the emergent spacetime metric, and other physical observables predicted from the trained network geometry compare well with supercomputer simulations of QCD.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-844\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-844\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-843\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tA picture of the energy landscape of deep neural 
networks\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-843\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-844\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Pratik Chaudhari<\/p>\n<p>Deep networks are mysterious. These over-parametrized machine learning models, trained with rudimentary optimization algorithms on non-convex landscapes in millions of dimensions, have defied attempts to put a sound theoretical footing beneath their impressive performance.<\/p>\n<p>This talk will shed light upon some of these mysteries. I will employ diverse ideas\u2014from thermodynamics and optimal transportation to partial differential equations, control theory and Bayesian inference\u2014and paint a picture of the training process of deep networks. Along the way, I will develop state-of-the-art algorithms for non-convex optimization.<\/p>\n<p>The goal of machine perception is not just to classify objects in images but to enable intelligent agents that can seamlessly interact with our physical world. 
I will conclude with a vision of how advances in machine learning and robotics may come together to help build such an Embodied Intelligence.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-846\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-846\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-845\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tCombinatorial Cosmology\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-845\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-846\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Liam McAllister<\/p>\n<p>A foundational problem in string theory is to derive statistical predictions for observable phenomena in cosmology and in particle physics. I will give an accessible overview of this subject, assuming no prior knowledge of string theory.<\/p>\n<p>I will explain that the set of possible natural laws is contained in the set of solutions of string theory that have no fields with zero mass and zero spin. Each solution is determined by a finite number of integers that specify the topology of the six-manifold on which the six extra dimensions of string theory are compactified. The total number of solutions is plausibly finite, albeit large. 
This set of solutions, the &#8216;landscape of string theory&#8217;, is the main object of study when relating string theory to observations.<\/p>\n<p>I will discuss how the work of deriving predictions from the string landscape can be formulated as a computational problem. The integers specifying the six-manifold topology are the fundamental parameters in nature, and the task is to find out which values they can take, and what observables result for each choice. I will illustrate this problem in the case of six-manifolds that are defined by triangulations of certain four-dimensional lattice polytopes. Such manifolds are finite in number, though the number may exceed 10^900, and the computational tasks are almost entirely combinatorial. In this realm, one can aim to use machine learning to find patterns, or to pick out solutions with desirable properties. I will suggest some problems of this sort.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t\t\t\t\t<\/ul>\n\t<\/div>\n\t<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The goal of Physics \u2229 ML is to bring together researchers from machine learning and physics to learn from each other and push research forward together. In this inaugural edition, we will especially highlight some amazing progress made in string theory with machine learning and in the understanding of deep learning from a physical angle. Nevertheless, we invite a cast with wide ranging expertise in order to spark new ideas. 
<\/p>\n","protected":false},"featured_media":551232,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_startdate":"2019-04-25","msr_enddate":"2019-04-26","msr_location":"Microsoft Research Redmond","msr_expirationdate":"","msr_event_recording_link":"","msr_event_link":"","msr_event_link_redirect":false,"msr_event_time":"","msr_hide_region":false,"msr_private_event":false,"msr_hide_image_in_river":0,"footnotes":""},"research-area":[13546],"msr-region":[197900],"msr-event-type":[197944,210063],"msr-video-type":[],"msr-locale":[268875],"msr-program-audience":[],"msr-post-option":[],"msr-impact-theme":[],"class_list":["post-548040","msr-event","type-msr-event","status-publish","has-post-thumbnail","hentry","msr-research-area-computational-sciences-mathematics","msr-region-north-america","msr-event-type-hosted-by-microsoft","msr-event-type-workshop","msr-locale-en_us"],"msr_about":"<!-- wp:msr\/event-details {\"title\":\"Physics \u2229 ML\",\"backgroundColor\":\"grey\",\"image\":{\"id\":551232,\"url\":\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/11\/1920x720-header-Physics-2018-2.jpg\",\"alt\":\"\"}} \/-->\n\n<!-- wp:msr\/content-tabs --><!-- wp:msr\/content-tab {\"title\":\"About\"} --><!-- wp:freeform --><p><strong>Venue: <\/strong><br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.bing.com\/mapspreview?where1=14820%20NE%2036th%20Street%20Building%2099%20Redmond%20Washington%20USA%2098052\" target=\"_blank\" rel=\"noopener noreferrer\">Microsoft, Building 99,<br \/>\nRedmond, Washington, USA<\/a><\/p>\n<p><strong>Watch on demand:<\/strong><br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"https:\/\/resnet.microsoft.com\/video\/42679\" target=\"_blank\" rel=\"noopener noreferrer\">Day 1: 
9:00 AM\u201310:30 AM<\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/ml-applied-to-string-theory-and-ml-applied-to-condensed-matter\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 1: 11:00 AM\u201312:30 PM<\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/physics-%e2%88%a9-ml-workshop-day-1-short-talks\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 1: 2:00 PM\u20134:05 PM<\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/combinatorial-cosmology-plenary\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 2: 9:00 AM\u20139:45 AM<\/a><br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/video\/physics-ml-short-talks-and-discussions\/\" target=\"_blank\" rel=\"noopener noreferrer\">Day 2: 10:15 AM\u201312:30 PM<\/a><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<div style=\"background-color: #dddddd;color: black;padding: 25px 15px 5px 15px\">\n<blockquote><p><strong>Update May 2020:<\/strong> Following up on our workshop here, we have begun a virtual biweekly seminar series continuing our conversations on the interface between physics and ML. Please visit <a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/physicsmeetsml.org\/\" target=\"_blank\" rel=\"noopener\">http:\/\/physicsmeetsml.org\/<\/a> for more information.<\/p><\/blockquote>\n<\/div>\n<div style=\"height: 20px\"><\/div>\n<p>The goal of <em>Physics<\/em> \u2229 <em>ML<\/em> (read &#8216;Physics Meets ML&#8217;) is to bring together researchers from machine learning and physics to learn from each other and push research forward together. 
In this inaugural edition, we will especially highlight some amazing progress made in string theory with machine learning and in the understanding of deep learning from a physical angle.\u00a0Nevertheless, we invite a cast with wide ranging expertise in order to spark new ideas.\u00a0Plenary sessions from experts in each field and shorter specialized talks will introduce existing research.\u00a0We will hold moderated discussions and breakout groups in which participants can identify problems and hopefully begin new collaborations in both directions. For example, physical insights can motivate advanced algorithms in machine learning, and analysis of geometric and topological datasets with machine learning can yield critical new insights in fundamental physics.<\/p>\n<h3>Organizers<\/h3>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/gregyang\/\">Greg Yang<\/a>, Microsoft Research<br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.jhhalverson.com\/\" target=\"_blank\" rel=\"noopener\">Jim Halverson<\/a>, Northeastern University<br \/>\nSven Krippendorf, LMU Munich<br \/>\nFabian Ruehle, CERN, Oxford University<br \/>\n<a class=\"msr-external-link glyph-append glyph-append-open-in-new-tab glyph-append-xsmall\" href=\"http:\/\/www.rakkyeongseong.com\" target=\"_blank\" rel=\"noopener\">Rak-Kyeong Seong<\/a>, Samsung SDS<br \/>\nGary Shiu, University of Wisconsin<\/p>\n<h3>Microsoft Advisers<\/h3>\n<p><a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/cmbishop\/\">Chris Bishop<\/a>, Microsoft Research<br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/jchayes\/\">Jennifer Chayes<\/a>, Microsoft Research<br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/michaelf\/\">Michael Freedman<\/a>, Microsoft Research<br \/>\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/psmo\/\">Paul Smolensky<\/a>, Microsoft 
Research<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<!-- \/wp:freeform --><!-- \/wp:msr\/content-tab --><!-- wp:msr\/content-tab {\"title\":\"Agenda\"} --><!-- wp:freeform --><h2>Day 1 | Thursday, April 25<\/h2>\n<table style=\"padding: 8px;width: 100%;text-align: left;border-bottom-color: #000000;border-bottom-width: 1px;border-bottom-style: solid;border-collapse: collapse;border-spacing: inherit\">\n<tbody>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><strong>Time (PDT)<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Speaker<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 1<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Plenary talks<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">8:00 AM\u20139:00 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breakfast<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:00 AM\u20139:45 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Gauge equivariant convolutional networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Taco Cohen<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:45 AM\u201310:30 
AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Understanding overparameterized neural networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Jascha Sohl-Dickstein<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">10:30 AM\u201311:00 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:00 AM\u201311:45 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Mathematical landscapes and string theory<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Mike Douglas<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:45 AM\u201312:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Holography, matter and deep learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Koji Hashimoto<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">12:30 PM\u20132:00 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Lunch<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:00 PM\u20134:05 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 2<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Applying physical insights to ML<\/strong><\/td>\n<td 
style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:00 PM\u20132:45 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Plenary: A picture of the energy landscape of deep neural networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pratik Chaudhari<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:45 PM\u20134:05 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short talks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Neural tangent kernel and the dynamics of large neural nets<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Clement Hongler<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">On the global convergence of gradient descent for over-parameterized models using optimal transport<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">L\u00e9na\u00efc Chizat<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pathological spectrum of the Fisher information matrix in deep neural networks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Ryo Karakida<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td 
style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Fluctuation-dissipation relation for stochastic gradient descent<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sho Yaida<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">From optimization algorithms to continuous dynamical systems and back<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rene Vidal<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The effect of network width on stochastic gradient descent and generalization<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Daniel Park<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short certificates for symmetric graph density inequalities<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\">Rekha Thomas<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Geometric representation learning in hyperbolic space<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Maximilian Nickel<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The fundamental equations of MNIST<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Cedric Beny<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Quantum states and Lyapunov functions reshape universal grammar<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Paul Smolensky<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Multi-scale deep generative networks for Bayesian inverse problems<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pengchuan Zhang<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px 
solid #000000\">Variational quantum classifiers in the context of quantum machine learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Alex Bocharov<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">4:05 PM\u20134:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">4:30 PM\u20135:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The intersect \u2229<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h2>Day 2 | Friday, April 26<\/h2>\n<table style=\"padding: 8px;width: 100%;text-align: left;border-bottom-color: #000000;border-bottom-width: 1px;border-bottom-style: solid;border-collapse: collapse;border-spacing: inherit\">\n<tbody>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><strong>Time (PDT)<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Speaker<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\"><strong>Session 3<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Applying ML to physics<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">8:00 AM\u20139:00 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breakfast<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:00 AM\u20139:45 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Plenary: Combinatorial Cosmology<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Liam McAllister<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:45 AM\u201310:15 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">10:15 AM\u201311:35 AM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short talks<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Bypassing expensive steps in computational geometry<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yang-Hui He<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Learning string theory at Large N<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Cody Long<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Training machines to extrapolate reliably over astronomical scales<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Brent Nelson<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breaking the tunnel vision with ML<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sergei Gukov<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Can machine learning give us new theoretical insights in physics and math?<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Washington Taylor<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Brief overview of machine learning holography<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yi-Zhuang You<\/td>\n<td style=\"padding: 
8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Applications of persistent homology to physics<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Alex Cole<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Seeking a connection between the string landscape and particle physics<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Patrick Vaudrevange<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">PBs^-1 to science: novel approaches on real-time processing from LHCb at CERN<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Themis Bowcock<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">From non-parametric to parametric: manifold coordinates with 
physical meaning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Marina Meila<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Machine learning in quantum many-body physics: A blitz<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yichen Huang<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Knot Machine Learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Vishnu Jejjala<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:35 AM\u201312:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Panel discussion with panelists Michael Freedman, Clement Hongler, Gary Shiu, Paul Smolensky, Washington Taylor<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">12:30 PM\u20131:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Lunch<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 4<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Breakout groups<\/strong><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td 
style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">1:30 PM\u20133:00 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Physics breakout groups<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Symmetries and their realisations in string theory<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sergei Gukov, Yang-Hui He<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">String landscape<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Michael Douglas, Liam McAllister<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Connections of holography and ML<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Koji Hashimoto, Yi-Zhuang You<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">3:00 PM\u20134:30 PM<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">ML breakout groups<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Geometric representations in deep learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\">Maximilian Nickel<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Understanding deep learning<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yasaman Bahri, Boris Hanin, Jaehoon Lee<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<tr>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Physics and optimization<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rene Vidal<\/td>\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<div style=\"height: 35px\"><\/div>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<!-- \/wp:freeform --><!-- \/wp:msr\/content-tab --><!-- wp:msr\/content-tab {\"title\":\"Abstracts\"} --><!-- wp:freeform --><p>\t<div data-wp-context='{\"items\":[]}' data-wp-interactive=\"msr\/accordion\">\n\t\t\t\t\t<div class=\"clearfix\">\n\t\t\t\t<div\n\t\t\t\t\tclass=\"btn-group align-items-center mb-g float-sm-right\"\n\t\t\t\t\tdata-bi-aN=\"accordion-collapse-controls\"\n\t\t\t\t>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Expand all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllExpanded\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onExpandAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tExpand all\t\t\t\t\t<\/button>\n\t\t\t\t\t<span aria-hidden=\"true\"> | <\/span>\n\t\t\t\t\t<button\n\t\t\t\t\t\tclass=\"btn btn-link m-0\"\n\t\t\t\t\t\tdata-bi-cN=\"Collapse 
all\"\n\t\t\t\t\t\tdata-wp-bind--aria-controls=\"state.ariaControls\"\n\t\t\t\t\t\tdata-wp-bind--aria-expanded=\"state.ariaExpanded\"\n\t\t\t\t\t\tdata-wp-bind--disabled=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-class--inactive=\"state.isAllCollapsed\"\n\t\t\t\t\t\tdata-wp-on--click=\"actions.onCollapseAll\"\n\t\t\t\t\t\ttype=\"button\"\n\t\t\t\t\t>\n\t\t\t\t\t\tCollapse all\t\t\t\t\t<\/button>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t\t\t<ul class=\"msr-accordion\">\n\t\t\t\t\t\t\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-836\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-836\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-835\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tGauge equivariant convolutional networks\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-835\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-836\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Taco Cohen<\/p>\n<p>The principle of equivariance to symmetry transformations enables a theoretically grounded approach to neural network architecture design. Equivariant networks have shown excellent performance and data efficiency on vision and medical imaging problems that exhibit symmetries. Here we show how this principle can be extended beyond global symmetries to local gauge transformations. This enables the development of a very general class of convolutional neural networks on manifolds that depend only on the intrinsic geometry. 
This class includes and generalizes existing methods from equivariant and geometric deep learning, and thus unifies these areas in a common gauge-theoretic framework.<\/p>\n<p>We implement gauge equivariant CNNs for signals defined on the surface of the icosahedron, which provides a reasonable approximation of the sphere. By choosing to work with this very regular manifold, we are able to implement the gauge equivariant convolution using a single conv2d call, making it a highly scalable and practical alternative to Spherical CNNs. Using this method, we demonstrate substantial improvements over previous methods on the task of segmenting omnidirectional images and global climate patterns.<\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-838\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-838\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-837\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tUnderstanding overparameterized neural networks\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-837\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-838\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Jascha Sohl-Dickstein<\/p>\n<p>As neural networks become highly overparameterized, their accuracy improves, and their behavior becomes easier to analyze theoretically. 
I will give an introduction to a rapidly growing body of work which examines the learning dynamics and prior over functions induced by infinitely wide, randomly initialized, neural networks. Core results that I will discuss include: that the distribution over functions computed by a wide neural network often corresponds to a Gaussian process with a particular compositional kernel, both before and after training; that the predictions of wide neural networks are linear in their parameters throughout training; and that this perspective enables analytic predictions for how trainability depends on hyperparameters and architecture. These results provide for surprising capabilities\u2014for instance, the evaluation of test set predictions which would come from an infinitely wide trained neural network without ever instantiating a neural network, or the rapid training of 10,000+ layer convolutional networks. I will argue that this growing understanding of neural networks in the limit of infinite width is foundational for future theoretical and practical understanding of deep learning.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-840\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-840\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-839\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tMathematical landscapes and string theory\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-839\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-840\"\n\t\t>\n\t\t\t<div 
class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Mike Douglas<\/p>\n<p>A fundamental theory of physics must explain how to derive the laws of physics ab initio, from purely formal constructions. In string\/M theory there are general laws, such as general relativity and Yang-Mills theory, which are fixed by the theory. There are also specific details, such as the spectrum of elementary particles and the strengths of interactions between them. These are not determined uniquely, but are derived from the geometry of extra dimensions of space. This geometry must take one of a few special forms called special holonomy manifolds, for example a Calabi-Yau (CY) manifold, or a G2 manifold. These manifolds are of significant mathematical interest independent of their relevance to string theory, and mathematicians and physicists have been working together to classify them and work out the relations between them.<\/p>\n<p>Such data \u2014 a set of objects and relations between them, defined by simple axioms \u2014 can be called a mathematical landscape. There are many important landscapes besides special holonomy manifolds \u2014 finite groups, bundles on manifolds, other classes of manifolds, etc. Many landscapes, such as that of six-dimensional CY manifolds, turn out to be combinatorially large. It is a further challenge to extract simple and useful pictures from the vast wealth of their data. Machine learning will be an essential tool to meet this challenge.<\/p>\n<p>As an example, in the mid-90\u2019s, the physicists Kreuzer and Skarke did a survey of the 6-d toric hypersurface CYs to study mirror symmetry. The plot that demonstrated this revealed a \u2018shield\u2019 pattern that nobody had anticipated or even thought to ask about, whose explanation was attempted in several later works, and which may be the key to understanding the finite number of these spaces. 
Future studies of mathematical landscapes will no doubt reveal new and unexpected patterns, especially if we have ML to help look for them.<\/p>\n<p>We will give a high-level survey of work in this direction, aiming to minimize mathematical and physical technicalities and to raise new questions.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-842\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-842\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-841\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tHolography, matter and deep learning\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-841\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-842\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Koji Hashimoto<\/p>\n<p>Revealing the quantum nature of matter is indispensable for finding new features of known exotic matter and even new materials, and it drives progress in theoretical physics. One of the most important tasks is to explicitly construct ground-state wave functions for given Hamiltonians. On the other hand, the holographic principle, discovered in the context of string theory, asserts an equivalence between quantum matter and classical higher-dimensional gravity. 
It has provided a completely new viewpoint on quantum matter, such as quarks in QCD and strongly correlated electrons.<\/p>\n<p>Both approaches require solving inverse problems, for which machine learning should be intrinsically effective. They also share a further important similarity: matter wave functions are obtained by tensor network optimization, and the holographic principle defines quantum gravity in which networks are regarded as discretized spacetime. Network optimization will therefore play a central role in both sciences.<\/p>\n<p>In this talk, I will give a brief review of several important topics related to these networks, and provide a concrete example of applying a deep neural network to the holographic principle. The higher-dimensional curved spacetime is discretized into a deep neural network, which is optimized against the input data (quark correlators). The neural network weights are regarded as the emergent spacetime metric, and other physical observables predicted from the trained network geometry compare well with supercomputer simulations of QCD.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-844\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-844\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-843\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tA picture of the energy landscape of deep neural 
networks\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-843\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-844\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Pratik Chaudhari<\/p>\n<p>Deep networks are mysterious. These over-parametrized machine learning models, trained with rudimentary optimization algorithms on non-convex landscapes in millions of dimensions, have defied attempts to put a sound theoretical footing beneath their impressive performance.<\/p>\n<p>This talk will shed light upon some of these mysteries. I will employ diverse ideas\u2014from thermodynamics and optimal transportation to partial differential equations, control theory and Bayesian inference\u2014and paint a picture of the training process of deep networks. Along the way, I will develop state-of-the-art algorithms for non-convex optimization.<\/p>\n<p>The goal of machine perception is not just to classify objects in images but to enable intelligent agents that can seamlessly interact with our physical world. 
I will conclude with a vision of how advances in machine learning and robotics may come together to help build such an Embodied Intelligence.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t<li class=\"m-0\" data-wp-context='{\"id\":\"accordion-content-846\"}' data-wp-init=\"callbacks.init\">\n\t\t<div class=\"accordion-header\">\n\t\t\t<button\n\t\t\t\taria-controls=\"accordion-content-846\"\n\t\t\t\tclass=\"btn btn-collapse\"\n\t\t\t\tdata-wp-bind--aria-expanded=\"state.isExpanded\"\n\t\t\t\tdata-wp-on--click=\"actions.onClick\"\n\t\t\t\tid=\"accordion-button-845\"\n\t\t\t\ttype=\"button\"\n\t\t\t>\n\t\t\t\tCombinatorial Cosmology\t\t\t<\/button>\n\t\t<\/div>\n\t\t<div\n\t\t\taria-labelledby=\"accordion-button-845\"\n\t\t\tclass=\"msr-accordion__content\"\n\t\t\tdata-wp-bind--inert=\"!state.isExpanded\"\n\t\t\tdata-wp-run=\"callbacks.run\"\n\t\t\tid=\"accordion-content-846\"\n\t\t>\n\t\t\t<div class=\"msr-accordion__body\">\n\t\t\t\t<p><strong>Speaker:<\/strong> Liam McAllister<\/p>\n<p>A foundational problem in string theory is to derive statistical predictions for observable phenomena in cosmology and in particle physics. I will give an accessible overview of this subject, assuming no prior knowledge of string theory.<\/p>\n<p>I will explain that the set of possible natural laws is contained in the set of solutions of string theory that have no fields with zero mass and zero spin. Each solution is determined by a finite number of integers that specify the topology of the six-manifold on which the six extra dimensions of string theory are compactified. The total number of solutions is plausibly finite, albeit large. 
This set of solutions, the &#8216;landscape of string theory&#8217;, is the main object of study when relating string theory to observations.<\/p>\n<p>I will discuss how the work of deriving predictions from the string landscape can be formulated as a computational problem. The integers specifying the six-manifold topology are the fundamental parameters in nature, and the task is to find out which values they can take, and what observables result for each choice. I will illustrate this problem in the case of six-manifolds that are defined by triangulations of certain four-dimensional lattice polytopes. Such manifolds are finite in number, though the number may exceed 10^900, and the computational tasks are almost entirely combinatorial. In this realm, one can aim to use machine learning to find patterns, or to pick out solutions with desirable properties. I will suggest some problems of this sort.<\/p>\n<p><span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/li>\n\t\t\t\t\t\t<\/ul>\n\t<\/div>\n\t<span id=\"label-external-link\" class=\"sr-only\" aria-hidden=\"true\">Opens in a new tab<\/span><\/p>\n<!-- \/wp:freeform --><!-- \/wp:msr\/content-tab --><!-- \/wp:msr\/content-tabs -->","tab-content":[{"id":0,"name":"About","content":"<div style=\"background-color: #dddddd;color: black;padding: 25px 15px 5px 15px\">\r\n<blockquote><strong>Update May 2020:<\/strong> Following up on our workshop here, we have begun a virtual biweekly seminar series continuing our conversations on the interface between physics and ML. 
Please visit <a href=\"http:\/\/physicsmeetsml.org\/\" target=\"_blank\" rel=\"noopener\">http:\/\/physicsmeetsml.org\/<\/a> for more information.<\/blockquote>\r\n<\/div>\r\n<div style=\"height: 20px\"><\/div>\r\nThe goal of <em>Physics<\/em> \u2229 <em>ML<\/em> (read 'Physics Meets ML') is to bring together researchers from machine learning and physics to learn from each other and push research forward together. In this inaugural edition, we will especially highlight some amazing progress made in string theory with machine learning and in the understanding of deep learning from a physical angle. At the same time, we invite a cast with wide-ranging expertise in order to spark new ideas. Plenary sessions from experts in each field and shorter specialized talks will introduce existing research. We will hold moderated discussions and breakout groups in which participants can identify problems and hopefully begin new collaborations in both directions. For example, physical insights can motivate advanced algorithms in machine learning, and analysis of geometric and topological datasets with machine learning can yield critical new insights in fundamental physics.\r\n<h3>Organizers<\/h3>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/gregyang\/\">Greg Yang<\/a>, Microsoft Research\r\n<a href=\"http:\/\/www.jhhalverson.com\/\" target=\"_blank\" rel=\"noopener\">Jim Halverson<\/a>, Northeastern University\r\nSven Krippendorf, LMU Munich\r\nFabian Ruehle, CERN, Oxford University\r\n<a href=\"http:\/\/www.rakkyeongseong.com\" target=\"_blank\" rel=\"noopener\">Rak-Kyeong Seong<\/a>, Samsung SDS\r\nGary Shiu, University of Wisconsin\r\n<h3>Microsoft Advisers<\/h3>\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/cmbishop\/\">Chris Bishop<\/a>, Microsoft Research\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/jchayes\/\">Jennifer Chayes<\/a>, Microsoft Research\r\n<a 
href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/michaelf\/\">Michael Freedman<\/a>, Microsoft Research\r\n<a href=\"https:\/\/www.microsoft.com\/en-us\/research\/people\/psmo\/\">Paul Smolensky<\/a>, Microsoft Research"},{"id":1,"name":"Agenda","content":"<h2>Day 1 | Thursday, April 25<\/h2>\r\n<table style=\"padding: 8px;width: 100%;text-align: left;border-bottom-color: #000000;border-bottom-width: 1px;border-bottom-style: solid;border-collapse: collapse;border-spacing: inherit\">\r\n<tbody>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><strong>Time (PDT)<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Speaker<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 1<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Plenary talks<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">8:00 AM\u20139:00 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breakfast<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:00 AM\u20139:45 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Gauge equivariant convolutional networks<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Taco Cohen<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td 
style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:45 AM\u201310:30 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Understanding overparameterized neural networks<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Jascha Sohl-Dickstein<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">10:30 AM\u201311:00 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:00 AM\u201311:45 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Mathematical landscapes and string theory<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Mike Douglas<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:45 AM\u201312:30 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Holography, matter and deep learning<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Koji Hashimoto<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">12:30 PM\u20132:00 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Lunch<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:00 PM\u20134:05 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\"><strong>Session 2<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Applying physical insights to ML<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:00 PM\u20132:45 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Plenary: A picture of the energy landscape of deep neural networks<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pratik Chaudhari<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">2:45 PM\u20134:05 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short talks<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Neural tangent kernel and the dynamics of large neural nets<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Clement Hongler<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">On the global convergence of gradient descent for over-parameterized models using optimal transport<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">L\u00e9na\u00efc Chizat<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pathological spectrum of the Fisher 
information matrix in deep neural networks<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Ryo Karakida<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Fluctuation-dissipation relation for stochastic gradient descent<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sho Yaida<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">From optimization algorithms to continuous dynamical systems and back<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rene Vidal<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The effect of network width on stochastic gradient descent and generalization<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Daniel Park<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short certificates for symmetric graph density inequalities<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rekha Thomas<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Geometric representation learning in hyperbolic space<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Maximilian Nickel<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The fundamental equations of MNIST<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Cedric Beny<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Quantum states and Lyapunov functions reshape universal grammar<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Paul Smolensky<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\">Multi-scale deep generative networks for Bayesian inverse problems<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Pengchuan Zhang<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Variational quantum classifiers in the context of quantum machine learning<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Alex Bocharov<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">4:05 PM\u20134:30 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">4:30 PM\u20135:30 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">The intersect \u2229<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n<h2>Day 2 | Friday, April 26<\/h2>\r\n<table style=\"padding: 8px;width: 100%;text-align: left;border-bottom-color: #000000;border-bottom-width: 1px;border-bottom-style: solid;border-collapse: collapse;border-spacing: inherit\">\r\n<tbody>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 
27%\"><strong>Time (PDT)<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Speaker<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 3<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Applying ML to physics<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">8:00 AM\u20139:00 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breakfast<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:00 AM\u20139:45 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Plenary: Combinatorial Cosmology<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Liam McAllister<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">9:45 AM\u201310:15 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Break<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">10:15 AM\u201311:35 AM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Short talks<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Bypassing expensive steps in computational geometry<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yang-Hui He<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Learning string theory at Large N<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Cody Long<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Training machines to extrapolate reliably over astronomical scales<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Brent Nelson<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Breaking the tunnel vision with ML<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sergei Gukov<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 
8px;border-bottom: 1px solid #000000\">Can machine learning give us new theoretical insights in physics and math?<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Washington Taylor<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Brief overview of machine learning holography<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yi-Zhuang You<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Applications of persistent homology to physics<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Alex Cole<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Seeking a connection between the string landscape and particle physics<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Patrick Vaudrevange<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">PBs^-1 to science: novel approaches on real-time processing from LHCb at CERN<\/td>\r\n<td style=\"padding: 
8px;border-bottom: 1px solid #000000\">Themis Bowcock<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Q&amp;A<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">From non-parametric to parametric: manifold coordinates with physical meaning<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Marina Meila<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Machine learning in quantum many-body physics: A blitz<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yichen Huang<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Knot Machine Learning<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Vishnu Jejjala<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">11:35 AM\u201312:30 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Panel discussion with panelists Michael Freedman, Clement Hongler, Gary Shiu, Paul Smolensky, Washington Taylor<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid 
#000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">12:30 PM\u20131:30 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Lunch<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Session 4<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><strong>Breakout groups<\/strong><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">1:30 PM\u20133:00 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Physics breakout groups<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Symmetries and their realisations in string theory<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Sergei Gukov, Yang-Hui He<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">String landscape<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Michael Douglas, Liam McAllister<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Connections of 
holography and ML<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Koji Hashimoto, Yi-Zhuang You<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\">3:00 PM\u20134:30 PM<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">ML breakout groups<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Geometric representations in deep learning<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Maximilian Nickel<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Understanding deep learning<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Yasaman Bahri, Boris Hanin, Jaehoon Lee<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<tr>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000;width: 27%\"><\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Physics and optimization<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\">Rene Vidal<\/td>\r\n<td style=\"padding: 8px;border-bottom: 1px solid #000000\"><\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n<div style=\"height: 35px\"><\/div>"},{"id":2,"name":"Abstracts","content":"[accordion]\r\n[panel header=\"Gauge equivariant convolutional networks\"]\r\n\r\n<strong>Speaker:<\/strong> Taco Cohen\r\n\r\nThe principle of equivariance to symmetry transformations enables a theoretically grounded approach to neural network architecture 
design. Equivariant networks have shown excellent performance and data efficiency on vision and medical imaging problems that exhibit symmetries. Here we show how this principle can be extended beyond global symmetries to local gauge transformations. This enables the development of a very general class of convolutional neural networks on manifolds that depend only on the intrinsic geometry. This class includes and generalizes existing methods from equivariant and geometric deep learning, and thus unifies these areas in a common gauge-theoretic framework.\r\n\r\nWe implement gauge equivariant CNNs for signals defined on the surface of the icosahedron, which provides a reasonable approximation of the sphere. By choosing to work with this very regular manifold, we are able to implement the gauge equivariant convolution using a single conv2d call, making it a highly scalable and practical alternative to Spherical CNNs. Using this method, we demonstrate substantial improvements over previous methods on the task of segmenting omnidirectional images and global climate patterns.\r\n[\/panel]\r\n[panel header=\"Understanding overparameterized neural networks\"]\r\n\r\n<strong>Speaker:<\/strong> Jascha Sohl-Dickstein\r\n\r\nAs neural networks become highly overparameterized, their accuracy improves, and their behavior becomes easier to analyze theoretically. I will give an introduction to a rapidly growing body of work which examines the learning dynamics and prior over functions induced by infinitely wide, randomly initialized neural networks. 
Core results that I will discuss include: that the distribution over functions computed by a wide neural network often corresponds to a Gaussian process with a particular compositional kernel, both before and after training; that the predictions of wide neural networks are linear in their parameters throughout training; and that this perspective enables analytic predictions for how trainability depends on hyperparameters and architecture. These results enable surprising capabilities\u2014for instance, evaluating the test set predictions of an infinitely wide trained neural network without ever instantiating a neural network, or rapidly training convolutional networks with 10,000+ layers. I will argue that this growing understanding of neural networks in the limit of infinite width is foundational for future theoretical and practical understanding of deep learning.\r\n\r\n[\/panel]\r\n[panel header=\"Mathematical landscapes and string theory\"]\r\n\r\n<strong>Speaker:<\/strong> Mike Douglas\r\n\r\nA fundamental theory of physics must explain how to derive the laws of physics ab initio, from purely formal constructions. In string\/M theory there are general laws, such as general relativity and Yang-Mills theory, which are fixed by the theory. There are also specific details, such as the spectrum of elementary particles and the strengths of interactions between them. These are not determined uniquely, but are derived from the geometry of extra dimensions of space. This geometry must take one of a few special forms called special holonomy manifolds, for example a Calabi-Yau (CY) manifold, or a G2 manifold. 
These manifolds are of significant mathematical interest independent of their relevance to string theory, and mathematicians and physicists have been working together to classify them and work out the relations between them.\r\n\r\nSuch data\u2014a set of objects and relations between them, defined by simple axioms\u2014can be called a mathematical landscape. There are many important landscapes besides special holonomy manifolds\u2014finite groups, bundles on manifolds, other classes of manifolds, etc. Many landscapes, such as that of six-dimensional CY manifolds, turn out to be combinatorially large. It is a further challenge to extract simple and useful pictures from the vast wealth of their data. Machine learning will be an essential tool to meet this challenge.\r\n\r\nAs an example, in the mid-1990s the physicists Kreuzer and Skarke surveyed the six-dimensional toric hypersurface CYs to study mirror symmetry. The plot demonstrating this revealed a \u2018shield\u2019 pattern that nobody had anticipated or even thought to ask about; its explanation was attempted in several later works, and it may be the key to understanding the finite number of these spaces. Future studies of mathematical landscapes will no doubt reveal new unexpected patterns, especially if we have ML to help look for them.\r\n\r\nWe will give a high-level survey of work in this direction, trying to minimize mathematical and physical technicalities, and to raise new questions.\r\n\r\n[\/panel]\r\n[panel header=\"Holography, matter and deep learning\"]\r\n\r\n<strong>Speaker:<\/strong> Koji Hashimoto\r\n\r\nRevealing the quantum nature of matter is indispensable for finding new features of known exotic matter and even new materials, and drives progress in theoretical physics. One of the most important tasks is to explicitly construct ground-state wave functions for given Hamiltonians. 
On the other hand, the holographic principle, discovered in the context of string theory, claims equivalence between quantum matter and classical higher-dimensional gravity. It provided a completely new viewpoint on the understanding of quantum matter, such as quarks in QCD and strongly correlated electrons.\r\n\r\nBoth tasks amount to solving inverse problems, for which machine learning is intrinsically well suited, and they share a further important similarity: matter wave functions are obtained by tensor network optimization, and the holographic principle defines a quantum gravity in which networks are regarded as discretized spacetime. Therefore, network optimization will play a central role in these sciences.\r\n\r\nIn this talk, I give a brief review of several important topics related to these networks, and provide a concrete example of applying a deep neural network to the holographic principle. The higher-dimensional curved spacetime is discretized into a deep neural network, and the input data (quark correlators) optimize the network. The neural network weights are regarded as the emergent spacetime metric, and other physical observables predicted from the trained network geometry compare well with supercomputer simulations of QCD.\r\n\r\n[\/panel]\r\n[panel header=\"A picture of the energy landscape of deep neural networks\"]\r\n\r\n<strong>Speaker:<\/strong> Pratik Chaudhari\r\n\r\nDeep networks are mysterious. These over-parametrized machine learning models, trained with rudimentary optimization algorithms on non-convex landscapes in millions of dimensions, have defied attempts to put a sound theoretical footing beneath their impressive performance.\r\n\r\nThis talk will shed light upon some of these mysteries. I will employ diverse ideas\u2014from thermodynamics and optimal transportation to partial differential equations, control theory and Bayesian inference\u2014and paint a picture of the training process of deep networks. 
Along the way, I will develop state-of-the-art algorithms for non-convex optimization.\r\n\r\nThe goal of machine perception is not just to classify objects in images but to enable intelligent agents that can seamlessly interact with our physical world. I will conclude with a vision of how advances in machine learning and robotics may come together to help build such an Embodied Intelligence.\r\n\r\n[\/panel]\r\n[panel header=\"Combinatorial Cosmology\"]\r\n\r\n<strong>Speaker:<\/strong> Liam McAllister\r\n\r\nA foundational problem in string theory is to derive statistical predictions for observable phenomena in cosmology and in particle physics. I will give an accessible overview of this subject, assuming no prior knowledge of string theory.\r\n\r\nI will explain that the set of possible natural laws is contained in the set of solutions of string theory that have no fields with zero mass and zero spin. Each solution is determined by a finite number of integers that specify the topology of the six-manifold on which the six extra dimensions of string theory are compactified. The total number of solutions is plausibly finite, albeit large. This set of solutions, the \u2018landscape of string theory\u2019, is the main object of study when relating string theory to observations.\r\n\r\nI will discuss how the work of deriving predictions from the string landscape can be formulated as a computational problem. The integers specifying the six-manifold topology are the fundamental parameters in nature, and the task is to find out which values they can take, and what observables result for each choice. I will illustrate this problem in the case of six-manifolds that are defined by triangulations of certain four-dimensional lattice polytopes. Such manifolds are finite in number, though the number may exceed 10^900, and the computational tasks are almost entirely combinatorial. 
In this realm, one can aim to use machine learning to find patterns, or to pick out solutions with desirable properties. I will suggest some problems of this sort.\r\n\r\n[\/panel]\r\n[\/accordion]"}],"msr_startdate":"2019-04-25","msr_enddate":"2019-04-26","msr_event_time":"","msr_location":"Microsoft Research Redmond","msr_event_link":"","msr_event_recording_link":"","msr_startdate_formatted":"April 25, 2019","msr_register_text":"Watch now","msr_cta_link":"","msr_cta_text":"","msr_cta_bi_name":"","featured_image_thumbnail":"<img width=\"960\" height=\"360\" src=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/11\/1920x720-header-Physics-2018-2.jpg\" class=\"img-object-cover\" alt=\"Bubble universes. Computer illustration of multiple bubble universes as predicted by the Eternal Inflation theory. The inflationary theory proposes that after the Big Bang, a condition known as a false vacuum created a repulsive force that caused an incredibly rapid expansion, much faster than the ordinary expansion observed today. Since this expansion is faster than the speed of light, areas of inflation would form bubbles that would be completely isolated from each other. 
This artwork could also represent the creation of separate parallel universes as fluctuations in a quantum foam.\" decoding=\"async\" loading=\"lazy\" srcset=\"https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/11\/1920x720-header-Physics-2018-2.jpg 1920w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/11\/1920x720-header-Physics-2018-2-300x113.jpg 300w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/11\/1920x720-header-Physics-2018-2-768x288.jpg 768w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/11\/1920x720-header-Physics-2018-2-1024x384.jpg 1024w, https:\/\/www.microsoft.com\/en-us\/research\/wp-content\/uploads\/2018\/11\/1920x720-header-Physics-2018-2-1600x600.jpg 1600w\" sizes=\"auto, (max-width: 960px) 100vw, 960px\" \/>","event_excerpt":"The goal of Physics \u2229 ML is to bring together researchers from machine learning and physics to learn from each other and push research forward together. In this inaugural edition, we will especially highlight some amazing progress made in string theory with machine learning and in the understanding of deep learning from a physical angle. 
Nevertheless, we invite a cast with wide ranging expertise in order to spark new ideas.","msr_research_lab":[199565],"related-researchers":[],"msr_impact_theme":[],"related-academic-programs":[],"related-groups":[],"related-projects":[],"related-opportunities":[],"related-publications":[],"related-videos":[589135,589753,589777,596401],"related-posts":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/548040","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-event"}],"version-history":[{"count":35,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/548040\/revisions"}],"predecessor-version":[{"id":1147065,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event\/548040\/revisions\/1147065"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media\/551232"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=548040"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=548040"},{"taxonomy":"msr-region","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-region?post=548040"},{"taxonomy":"msr-event-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-event-type?post=548040"},{"taxonomy":"msr-video-type","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-video-type?post=548040"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=548040"},{"taxonomy":"msr-program-audience","embeddable"
:true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-program-audience?post=548040"},{"taxonomy":"msr-post-option","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-post-option?post=548040"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=548040"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}