Guest Editors’ Introduction: Special Section on Learning Deep Architectures

  • Samy Bengio
  • Li Deng
  • Hugo Larochelle
  • Honglak Lee
  • Ruslan Salakhutdinov

IEEE Trans. Pattern Analysis and Machine Intelligence, Vol. 35

The past seven years have seen a resurgence of research in the design of deep architecture models and learning algorithms, i.e., methods that rely on the extraction of a multilayer representation of the data. Often referred to as deep learning, this topic of research has been building on and contributing to many different research areas, such as neural networks, graphical models, feature learning, unsupervised learning, optimization, pattern recognition, and signal processing. Deep learning is also motivated and inspired by neuroscience, and it has had a tremendous impact on applications such as computer vision, speech recognition, and natural language processing. The clearly multidisciplinary nature of deep learning led to a call for papers for a special issue dedicated to learning deep architectures, which would provide a forum for the latest advances on the subject. Associate Editor in Chief (AEIC) Max Welling took the initiative for the special issue in an earlier attempt to collaborate with the Editor in Chief of the IEEE Signal Processing Magazine (Li Deng) on a joint special issue. As five guest editors, we oversaw the task of selecting from among the submissions the eight papers included in this special section, assisted by a great team of dedicated reviewers. Former Editor in Chief (EIC) Ramin Zabih and current EIC David Forsyth greatly assisted in the process of realizing this special section. We thank them all for their crucial role in making this special section a success.