Knowledge Distillation as Semiparametric Inference [Talk]
More accurate machine learning models often demand more computation and memory at test time, making them difficult to deploy on CPU- or memory-constrained devices. Knowledge distillation alleviates this burden by training a less expensive student…
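As a quick orientation (a generic sketch of standard distillation, not the paper's semiparametric estimator), the student is typically trained against a convex combination of the hard-label loss and a temperature-softened teacher distribution; all names and hyperparameters below are illustrative:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine cross-entropy on true labels with KL divergence
    between temperature-softened teacher and student outputs."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # T**2 rescales the soft-target gradients to match the hard-label term
    kd = F.kl_div(log_student, soft_targets,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```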
Consistent k-Median: Simpler, Better and Robust
Fairness-related harms in AI systems: Examples, assessment, and mitigation
AI has transformed modern life through previously unthinkable feats, from machines that can master the ancient board game Go to self-driving cars, as well as developments we experience more routinely, such as virtual agents and personalized product…
Factorized Neural Layers
This repo contains code to reproduce experiments in the paper “Initialization and Regularization of Factorized Neural Layers”. It is split into codebases for different models and settings we evaluate; please see the corresponding directories for…
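For context (a minimal sketch of the general idea, not this repo's code), a factorized layer replaces a dense weight matrix W with a low-rank product U V, cutting parameters and compute when the rank is small; initializing the factors from the truncated SVD of a standard dense initialization is one common choice:

```python
import torch
import torch.nn as nn

class FactorizedLinear(nn.Module):
    """Linear layer with weight W ~= U @ V of rank r (illustrative sketch)."""
    def __init__(self, in_features, out_features, rank):
        super().__init__()
        # Initialize U, V from the rank-r SVD of a standard dense init,
        # so the product starts close to an unfactorized layer.
        w = torch.empty(out_features, in_features)
        nn.init.kaiming_uniform_(w, a=5 ** 0.5)
        U, S, Vh = torch.linalg.svd(w, full_matrices=False)
        s = S[:rank].sqrt()
        self.U = nn.Parameter(U[:, :rank] * s)             # (out, r)
        self.V = nn.Parameter(Vh[:rank, :] * s[:, None])   # (r, in)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # Two skinny matmuls instead of one dense one.
        return (x @ self.V.T) @ self.U.T + self.bias
```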