[REFAI Seminar 06/08/21] Transformer efficiency: From model compression to training acceleration
- Yu Cheng, Principal Researcher, Microsoft Research
More Info about REFAI Seminar: https://sites.google.com/site/boyuaneecs/efficient-ai-seminar-talk?authuser=0