Deep Crossing on CNTK

Table of Contents

- Introduction to Deep Crossing
- Embedding: Simple Embedding for Dimensionality Reduction
- Stacking: Combining Inputs into a Single Vector
- Residual Units: Applying Deep Learning
- Deep Crossing in BrainScript
- Inputs
- Embedding Layer
- Stacking Layer
- ResNet Layer
- Finishing BrainScript

Introduction to Deep Crossing

Deep Crossing is a model introduced by Shan et al. [1] as a solution to the problem of….
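The excerpt above lists the stages of the model (embedding, stacking, residual units). As a rough illustration of the residual-unit idea only, here is a minimal NumPy sketch of one common formulation: two dense layers with ReLU and an identity shortcut. The dimensions and variable names are hypothetical and are not taken from the post or its BrainScript code.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_unit(x, W0, b0, W1, b1):
    """One residual unit: two dense layers plus a skip connection.

    The input vector is added back before the final ReLU, so the
    layers only need to learn a correction on top of the identity.
    """
    inner = relu(W0 @ x + b0)
    return relu(W1 @ inner + b1 + x)

# Hypothetical sizes, for illustration only; the stacked vector that
# Deep Crossing feeds into its residual layers would be much larger.
dim = 8
rng = np.random.default_rng(0)
x = rng.standard_normal(dim)
W0 = rng.standard_normal((dim, dim)) * 0.1
W1 = rng.standard_normal((dim, dim)) * 0.1
b0, b1 = np.zeros(dim), np.zeros(dim)
print(residual_unit(x, W0, b0, W1, b1))
```

Several such units can be stacked, each keeping the same input and output dimensionality so the skip connections line up.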

Recurrent Neural Networks with CNTK and applications to the world of ranking

Today, I want to talk about three things that we’ve been working on recently: (1) recurrent neural networks (RNNs); (2) implementing common RNN paradigms in CNTK; and (3) one slightly detailed example of how we are using these networks to help the ranking team and other teams throughout Bing. I’ll start with a brief overview of RNNs and the specific….

GRUs on CNTK with BrainScript

Almost all of the models that we’re currently working on involve Recurrent Neural Networks (RNNs) and therefore Long Short-Term Memory (LSTM) cells. The BrainScript core library that comes with CNTK has great support for working with LSTM-based networks thanks to Frank Seide’s implementations. These allow us to quickly set up a powerful RNN model such as the query classification network….
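As a rough illustration of what a GRU cell (the subject of the post's title) computes per time step, here is a minimal NumPy sketch of the standard GRU update from Cho et al.; the weight names and sizes are hypothetical, and this is not the BrainScript implementation the post describes.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    """One step of a standard GRU (Cho et al., 2014).

    x : input at the current time step
    h : hidden state from the previous step
    """
    z = sigmoid(Wz @ x + Uz @ h + bz)             # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)             # reset gate
    h_cand = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate state
    return (1.0 - z) * h + z * h_cand             # blend old and new state

# Hypothetical sizes, for illustration only.
in_dim, hid_dim = 4, 3
rng = np.random.default_rng(1)
p = lambda *shape: rng.standard_normal(shape) * 0.1
h = np.zeros(hid_dim)
for x in rng.standard_normal((5, in_dim)):        # a toy 5-step sequence
    h = gru_cell(x, h,
                 p(hid_dim, in_dim), p(hid_dim, hid_dim), np.zeros(hid_dim),
                 p(hid_dim, in_dim), p(hid_dim, hid_dim), np.zeros(hid_dim),
                 p(hid_dim, in_dim), p(hid_dim, hid_dim), np.zeros(hid_dim))
print(h)
```

The GRU folds the LSTM's input and forget gates into a single update gate and drops the separate cell state, which is why it is often used as a lighter-weight alternative in recurrent models like the ones discussed here.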