Transforming Machine Learning and Optimization Through Quantum Computing

In 1982, Richard Feynman first proposed using a “quantum computer” to simulate physical systems exponentially faster than conventional computers. Quantum algorithms can solve problems in number theory, chemistry, and materials science that would otherwise take longer than the lifetime of the universe on an exascale machine. Quantum computers also offer new methods for machine learning, including training Boltzmann machines and perceptron models. These methods have the potential to dramatically improve upon today’s machine learning algorithms, which run in almost every device, from cell phones to cars. But can quantum models make it possible to probe altogether different types of questions and solutions? If so, how can we take advantage of new representations in machine learning? And how will we handle large amounts of data and input/output on a quantum computer? This session will focus on both known improvements and open challenges in using quantum techniques for machine learning and optimization.

Date:
Speakers:
Krysta Svore; Helmut Katzgraber; Matthias Troyer; Nathan Wiebe
Affiliation:
Texas A&M; Microsoft