How people use Copilot for Health
- Beatriz Costa-Gomes,
- Pavel Tolmachev,
- Eloise Taysom,
- Viknesh Sounderajah,
- Hannah Richardson (née Murfet),
- Philipp Schoenegger,
- Xiaoxuan Liu,
- Matthew M Nour,
- Seth Spielman,
- Samuel F. Way,
- Yash Shah,
- Michael Bhaskar,
- Harsha Nori,
- Christopher Kelly,
- Peter Hames,
- Bay Gross,
- Mustafa Suleyman,
- Dominic King
We analyze over 500,000 de-identified health-related conversations with Microsoft Copilot from January 2026 to characterize what people ask conversational AI about health. We develop a hierarchical intent taxonomy of 12 primary categories using privacy-preserving LLM-based classification validated against expert human annotation, and apply LLM-driven topic clustering to identify prevalent themes within each intent. Using this taxonomy, we characterize the intents and topics behind health queries, identify who these queries are about, and analyze how usage varies by device and time of day. Five findings stand out. First, nearly one in five conversations involves personal symptom assessment or condition discussion, and even the dominant general-information category (40%) is concentrated on specific treatments and conditions, suggesting that this figure is a lower bound on personal health intent. Second, one in seven of these personal health queries concerns someone other than the user, such as a child, parent, or partner, suggesting that conversational AI can serve as a caregiving tool, not just a personal one. Third, personal queries about symptoms and emotional health increase markedly in the evening and nighttime hours, when traditional healthcare is least available. Fourth, usage diverges sharply by device: mobile concentrates on personal health concerns, while desktop is dominated by professional and academic work. Fifth, a substantial share of queries focuses on navigating healthcare systems, such as finding providers and understanding insurance, highlighting friction in the delivery of existing healthcare. These patterns have direct implications for platform-specific design, safety considerations, and the responsible development of health AI.