
Motivated by our recent work, we are releasing a repository for training transformer models with differential privacy. Our repository is based on integrating the Opacus library with the Hugging Face platform. We aim to serve the privacy-preserving ML community in…
GitHub: dp-transformers — Training transformer models with differential privacy. Transformer models have recently taken the field of Natural Language Processing (NLP) by storm, as large language models based on the transformer architecture have shown impressive performance across…
Are you interested in radically improving the security of Microsoft’s products? Do you want to work on cutting-edge malware analysis systems? Are you committed to helping Microsoft customers keep their computers secure and to combating evolving…
In this talk, we discuss the general supersingular isogeny problem, the foundational hardness assumption underpinning isogeny-based cryptography. We implement and optimize the best attack against this problem – the Delfs-Galbraith algorithm – to explicitly determine…
For many organizations, trusting their data to the cloud requires having a complete understanding of and control over the environment in which that data resides and how it’s being processed. Microsoft understands this, and we…