Microsoft at ICLR 2020
Microsoft is a Silver sponsor of the Eighth International Conference on Learning Representations (ICLR) this year. ICLR is the premier gathering of professionals dedicated to the advancement of the branch of artificial intelligence called representation…
VL-BERT
VL-BERT is a simple yet powerful pre-trainable generic representation for visual-linguistic tasks. It is pre-trained on a massive-scale caption dataset and a text-only corpus, and can be fine-tuned for various downstream visual-linguistic tasks, such as Visual…
BERT-nmt
BERT-fused NMT is a new algorithm that first uses BERT to extract representations for an input sequence, and then fuses those representations with each layer of the encoder and decoder of the…
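The fusion idea can be sketched as follows, with each encoder layer attending both to its own hidden states and to a frozen set of BERT representations and then averaging the two (a minimal illustration with made-up dimensions, not the paper's exact architecture):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attend(q, kv):
    """Scaled dot-product attention of queries q over key/value matrix kv."""
    scores = q @ kv.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ kv

def bert_fused_encoder_layer(h, h_bert):
    """One BERT-fused encoder layer (sketch): average self-attention over the
    layer's own states h with attention over fixed BERT representations h_bert."""
    return 0.5 * (attend(h, h) + attend(h, h_bert))

# Toy shapes: 4 target tokens, 5 BERT tokens, hidden size 8.
h = np.random.randn(4, 8)
h_bert = np.random.randn(5, 8)
out = bert_fused_encoder_layer(h, h_bert)
```

The output keeps the shape of `h`, so fused layers can be stacked just like ordinary encoder layers.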
KG-A2C
KG-A2C is a reinforcement learning agent that builds a dynamic knowledge graph while exploring and generates natural language using a template-based action space – outperforming all current agents on a wide set of text-based games.
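The two ingredients above can be illustrated with a minimal sketch: a knowledge graph stored as (subject, relation, object) triples, updated from observations, and a template-based action space filled in with entities the graph currently knows about (helper names and the `OBJ` slot convention are hypothetical, not the paper's exact interface):

```python
def update_graph(graph, triples):
    """Add newly observed (subject, relation, object) triples to the graph."""
    graph.update(triples)
    return graph

def candidate_actions(graph, templates):
    """Fill each template's OBJ slot with every object known to the graph."""
    objects = {obj for (_, _, obj) in graph}
    return [t.replace("OBJ", obj) for t in templates for obj in objects]

# Toy episode: the agent observes a kitchen containing a lamp.
graph = set()
update_graph(graph, {("you", "in", "kitchen"), ("kitchen", "has", "lamp")})
actions = candidate_actions(graph, ["take OBJ", "examine OBJ"])
```

Restricting action generation to graph entities is what keeps the template action space tractable as the game world grows.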
FreeLB
FreeLB is an adversarial training approach for improving transformer-based language models on Natural Language Understanding tasks. It accumulates parameter gradients during the adversarial ascent steps and updates the parameters with the accumulated gradients, which is approximately…
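The accumulate-then-update loop can be sketched on a toy squared-error model, with a perturbation `delta` standing in for noise on the input embeddings (a minimal illustration of the gradient-accumulation idea, not the paper's transformer setup; hyperparameter values are arbitrary):

```python
import numpy as np

def freelb_step(w, x, y, k=3, adv_lr=0.1, eps=0.5, lr=0.01):
    """One FreeLB-style update on the toy loss L = ((x + delta) . w - y)^2."""
    delta = np.zeros_like(x)     # adversarial perturbation on the "embedding"
    grad_acc = np.zeros_like(w)  # parameter gradient accumulated across ascent steps
    for _ in range(k):
        err = (x + delta) @ w - y
        grad_acc += 2 * err * (x + delta)  # accumulate dL/dw at the current delta
        g_delta = 2 * err * w              # ascend on delta to maximize the loss
        delta += adv_lr * g_delta / (np.linalg.norm(g_delta) + 1e-12)
        delta = np.clip(delta, -eps, eps)  # project delta back into the eps-ball
    # Descend on the gradient averaged over all k ascent steps.
    return w - lr * grad_acc / k

w0 = np.array([1.0, -1.0])
x = np.array([0.5, 0.5])
y = 1.0
w1 = freelb_step(w0, x, y)
```

Because the parameter gradient is gathered at several points along the adversarial trajectory, one update "sees" multiple perturbed inputs at little extra cost compared with plain adversarial training.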