We are a group of researchers at Microsoft Research, Northwestern University, University of Minnesota, Dartmouth College, Mental Health America and LivingWorks looking to improve AI chatbots (e.g., ChatGPT, Copilot, Claude) for users experiencing suicidal distress.
The goal of this study is to understand how best to design AI experiences to support users who interact with AI in moments of acute crisis that may involve thoughts of suicide or self-harm.
We are conducting a survey study and recruiting those with lived experience of suicide or self-harm to capture their perspectives on different ways that AI chatbots can be supportive.
To be eligible for the study, you must
- Be 18 years or older, AND
- Have a home address in the United States, AND
- Have lived experience with suicide or self-harm (lived experience can be defined as personally experiencing, or knowing or having known someone close who experienced, suicidal ideation and/or crisis, or self-harm with or without intent to kill oneself), AND
- Have experience using AI chatbots (e.g., ChatGPT) for general mental health support, OR know someone who uses AI chatbots for general mental health support well enough to represent the perspective of someone with first-hand experience
If you are interested in participating, please complete the survey at the link below.
Your responses are important to us. We will review all survey responses to confirm their validity and quality before issuing compensation. Participants who complete the survey and meet our validity and quality expectations will receive a $50 Tango digital gift card.
More information about the study
Research Team
- Microsoft Research: Jina Suh, Denae Ford Robinson, Ann Paradiso
- Microsoft: Keertana Namuduri, Eugenia Kim, Ebele Okoli, Teresa Rexin
- Mental Health America: Theresa Nguyen
- University of Minnesota: Leah Ajmani
- Northwestern University: Arka Ghosh, Jessica Schleider
- LivingWorks: Pete Gutierrez, Jetta Hanson
- Dartmouth College: Benjamin Kaveladze
What is the goal of this research?
The goal of this study is to understand how best to design AI experiences to support users who interact with AI in moments of acute crisis that may involve thoughts of or preparation for suicide or self-harm, or that may increase the risk of suicide or self-harm.
Who is eligible to participate in this research?
We are looking for those with lived experience with suicide or self-harm.
Lived experience can be defined as personally experiencing, or knowing or having known someone who experienced, suicidal ideation and/or crisis, or self-harm with or without intent to kill oneself.
When you visit the survey link, you will be asked a series of questions to verify your eligibility to participate.
To participate, you must
- Be 18 years or older, AND
- Have a home address in the United States, AND
- Have lived experience with suicide or self-harm, AND
- Have experience using AI chatbots (e.g., ChatGPT) for mental health support, OR know someone who uses AI chatbots for mental health support well enough to represent the perspective of someone with first-hand experience
Can I share this study information with others?
If you know someone whose experience might be relevant and who might be willing to participate, please feel free to forward the study information. We kindly ask that you not post the study information widely on public online forums such as social media or websites.
What does participation entail?
Participation involves completing a one-hour online survey.
Will I be compensated for participating in this study?
Participants who complete the survey and meet our validity and quality expectations will receive a $50 Tango digital gift card. We will review all survey responses to confirm their validity and quality before issuing compensation. A valid response is one in which a clear attempt was made to answer the majority of questions to the best of your ability.
How many people will participate?
We plan to recruit about 400 participants. The study will be closed once we reach the required number of participants.
How will the study benefit me?
There will be no direct benefit to you as a result of participating in this study.
We hope that findings from this research will help AI developers and designers to build AI chatbots that are more supportive of those experiencing acute crises, including moments of suicidal or self-harm thoughts.
What are the risks of participating in this study?
The risks of participating may include emotional discomfort, such as embarrassment, distress, or anxiety, when discussing past personal experiences and thoughts related to moments of crisis, suicide, or self-harm, and the use of AI for social and emotional support. To minimize these risks, we have implemented several safeguards:
- This survey has been vetted and approved by our clinical psychologist collaborators
- You can take breaks between portions of the survey
- You can skip any optional questions you do not wish to answer. We only require answers to questions that are essential for our analysis or study logistics (such as participant compensation).
- At any time during the survey, you can discard your responses and exit the survey. Unfortunately, we can only provide compensation for completed surveys with genuine responses. Discarded responses will be deleted immediately and will not be part of any analysis.
- The footer of every survey page includes a link to essential mental health resources that provide a broad range of services. Please use these resources at any time during the survey.
What will happen to my data?
Researchers will keep your participation and the information you share as confidential as possible. The information you share will be labeled in our records with a code instead of your name or other direct identifier. The key to this code will be stored separately and destroyed within 6 months after study completion. Your contact information will only be used for compensation purposes and will be discarded within 6 months of study completion.
De-identified survey responses will be shared with our research collaborators at Northwestern University, Dartmouth College, and LivingWorks. They will retain access to this data for 2 years.
Researchers may share the results of this study publicly, such as in journal articles or conference presentations, but your name will not be included. Information and data collected from you during this study may be used for Microsoft’s future research studies or to improve our products or services. If that happens, researchers will remove any direct identifiers, like your name or email address, before sharing. We may also share de-identified data with future collaborators.
What happens if I no longer wish to be a part of this study?
Whether or not you participate is entirely up to you. You can decide to participate now and stop participating later. Your decision of whether or not to participate will have no impact on any other services or agreements you have with Microsoft outside of this research.
If you decide to withdraw from the study and want researchers to remove your study information, you can contact the study team at aicrisisresearch@service.microsoft.com. However, once we remove the link between your data and your identifiers, it will no longer be possible to delete your data; any published results of the research will not identify you individually.
Who can I talk to if I have questions about the study?
For questions about how Microsoft manages your privacy, please see the Microsoft Privacy Statement (http://go.microsoft.com/fwlink/?LinkId=521839).
If you have any questions, please reach out to the research team at aicrisisresearch@service.microsoft.com.
If you have any questions about your rights as a research participant, please contact the Microsoft Research Ethics Review Program at MSRStudyfeedback@microsoft.com.
This study was approved by the Microsoft Research Ethics Review Board [Study #9383]. Thank you for helping with our research!