Tool
dp-transformers repository
Motivated by our recent work, we are releasing a repository for training transformer models with differential privacy. Our repository is based on integrating the Opacus library with the Hugging Face transformers library.
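The snippet below is a minimal sketch of this general pattern, using plain Opacus and Hugging Face transformers directly rather than the dp-transformers API itself; the model choice, toy data, and DP-SGD hyperparameters are illustrative assumptions, not values from the repository.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Illustrative model choice; any supported Hugging Face classifier works similarly.
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# Toy data, only to keep the sketch self-contained.
texts = ["a positive example", "a negative example"] * 16
enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
labels = torch.tensor([1, 0] * 16)
loader = DataLoader(
    TensorDataset(enc["input_ids"], enc["attention_mask"], labels), batch_size=8
)

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()

# Opacus wraps the model, optimizer, and data loader so that each step
# clips per-sample gradients and adds calibrated Gaussian noise (DP-SGD).
privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.0,  # illustrative noise scale
    max_grad_norm=1.0,     # per-sample gradient clipping bound
)

for input_ids, attention_mask, batch_labels in loader:
    optimizer.zero_grad()
    out = model(input_ids=input_ids, attention_mask=attention_mask, labels=batch_labels)
    out.loss.backward()  # Opacus computes per-sample grads, clips, and noises them
    optimizer.step()

# Report the privacy budget spent so far at a chosen delta.
print(f"epsilon spent: {privacy_engine.get_epsilon(delta=1e-5):.2f}")
```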
Project
dp-transformers
Training transformer models with differential privacy
Transformer models have recently taken the field of Natural Language Processing (NLP) by storm, as large language models based on the transformer architecture have shown impressive performance across a wide range of tasks.
Publication