XtremeDistil
XtremeDistil is a framework for distilling/compressing massive multilingual neural network models into tiny, efficient models for AI at scale.
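The entry above names knowledge distillation only at a high level. As a minimal sketch of the idea (not XtremeDistil's actual code; all names here are illustrative), the core soft-label objective matches a student's temperature-softened predictions to a teacher's:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the teacher's softened distribution and the
    # student's, scaled by T^2 as is conventional in distillation.
    p = softmax(teacher_logits, T)  # soft targets from the large teacher
    q = softmax(student_logits, T)  # predictions from the small student
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)
```

The loss is zero when the student reproduces the teacher's logits exactly and grows as the two distributions diverge; real distillation frameworks typically mix this term with a standard cross-entropy loss on the ground-truth labels.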
An index of datasets, SDKs, APIs, and other open source code created by Microsoft researchers and shared with the broader academic community.
The GitHub repository includes instructions to build and deploy a video analytics service tailored for an operator's infrastructure, using Kubernetes in the Azure public MEC and Azure IoT Edge, and video ML containers like NVIDIA Triton…
Here, we provide a plug-in-and-play implementation of Admin, which stabilizes previously diverged Transformer training and achieves better performance without introducing additional hyper-parameters. The design of Admin is half-precision friendly and can be reparameterized into the original…
Implementation of MoLeR: a generative model of molecular graphs which supports scaffold-constrained generation. This open-source code accompanies our paper "Learning to Extend Molecular Scaffolds with Structural Motifs", which has been accepted at ICLR 2022…
A deep learning approach towards the large-scale prediction and analysis of bird acoustics from 100 different bird species.
A Python package for generating concise, high-quality summaries of a probability distribution. In distribution compression, one aims to accurately summarize a probability distribution P using a small number of representative points. Near-optimal thinning procedures achieve…
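To make the thinning interface concrete, here is a naive baseline sketch (not the package's actual API; the function name is hypothetical): given n sample points, keep m of them at evenly spaced indices. Near-optimal thinning procedures choose the retained subset far more carefully, but the input/output contract is the same.

```python
import numpy as np

def naive_thinning(points, m):
    # Baseline distribution compression: reduce n sample points to m
    # representative points by keeping evenly spaced indices.
    # Sophisticated procedures (e.g. kernel thinning) instead select the
    # subset to minimize a discrepancy to the original distribution.
    points = np.asarray(points)
    idx = np.linspace(0, len(points) - 1, m).astype(int)
    return points[idx]
```

For i.i.d. samples this baseline only matches the accuracy of an m-point sample, whereas the near-optimal procedures referenced above summarize the full n-point sample with far smaller error for the same m.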