Digital Safety Content Report
The Digital Safety Content Report (DSCR) covers actions that Microsoft has taken in relation to child sexual exploitation and abuse imagery (CSEAI), grooming of children for sexual purposes, terrorist and violent extremist content (TVEC), and non-consensual intimate imagery (NCII).
Digital technologies allow people across the globe to share information, news, and opinions that, together, span a broad range of human expression. Unfortunately, some people use online platforms and services to exploit the darkest sides of humanity, which diminishes both safety and the free exchange of ideas.
At Microsoft, we believe digital safety is a shared responsibility requiring a whole-of-society approach. This means that the private sector, academic researchers, civil society, and governmental and intergovernmental actors all work together to address challenges that are too complex – and too important – for any one group to tackle alone.
For our part, we prohibit certain content and conduct on our services, and we enforce rules that we’ve set to help keep our customers safe. We use a combination of automated detection and human content moderation to remove violating content and suspend accounts. Additional information is available on Microsoft’s Digital Safety site.
The Microsoft Services Agreement includes a Code of Conduct, which outlines what’s allowed and what’s prohibited when using a Microsoft account. Some services offer additional guidance, such as the Community Standards for Xbox, to show how the Code of Conduct applies on their services. Reporting violations of the Code of Conduct is critical to helping keep our online communities safe for everyone. More information on how to report violating content and conduct is included below.
Microsoft has a long-standing commitment to online child safety. We develop tools and engage with a variety of stakeholders to help address this issue. As specified in our Code of Conduct and on the content and conduct policies page on Microsoft’s Digital Safety site, we prohibit any child sexual exploitation or abuse: content or activity that harms or threatens to harm a child through exploitation, trafficking, extortion, or endangerment. This includes the sharing of visual media that contains sexual content involving or sexualizing a child, as well as the grooming of children for sexual purposes.
Microsoft is a member of the WePROTECT Global Alliance, the multistakeholder organization fighting child sexual exploitation and abuse online. Microsoft also supports the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse and works closely with WePROTECT to promote them.
Microsoft is a founding member of the Technology Coalition, the tech industry’s non-profit association to combat online child sexual exploitation and abuse.
We also support and/or hold leadership and advisory roles with numerous other child safety organizations, including the Family Online Safety Institute, INHOPE, the Internet Watch Foundation, and the National Center for Missing and Exploited Children (NCMEC).
Processes and systems
Child Exploitation Prevention and Detection
Detection and removal of child sexual exploitation and abuse imagery (CSEAI)
We deploy tools to detect child sexual exploitation and abuse imagery (CSEAI), including hash-matching technology (e.g., PhotoDNA) and other forms of proactive detection. In-product reporting is also available for services such as OneDrive, Skype, Xbox, and Bing, whereby users can report suspected child exploitation or other violating content. Microsoft developed PhotoDNA, a robust hash-matching technology that helps find duplicates of known child sexual exploitation and abuse imagery. We continue to make PhotoDNA freely available to qualified organizations, and we leverage PhotoDNA across Microsoft’s consumer services.
As a U.S.-based company, Microsoft reports all apparent CSEAI or grooming of children for sexual purposes to NCMEC via the CyberTipline, as required by U.S. law. We take action on the account(s) associated with the content we have reported to NCMEC. Users have the opportunity to appeal these account actions by visiting this webpage and using this web form.
Outcomes – January through June 2023
During the period of January-June 2023, Microsoft submitted 79,971 reports to NCMEC.
For our hosted consumer services – such as OneDrive, Outlook, Skype, and Xbox – Microsoft actioned 46,856 pieces of content and 7,456 consumer accounts associated with CSEAI or grooming of children for sexual purposes during this period. Of the content actioned, 99.2 percent was detected through automated technologies; the remainder was reported to Microsoft by users or third parties. Of the accounts actioned for CSEAI or grooming of children for sexual purposes, 1.4 percent were reinstated upon appeal.
For Bing, Microsoft works to prevent CSEAI from entering the Bing search index by leveraging block lists of sites containing CSEAI identified by credible agencies, and through PhotoDNA scanning of the index and of visual search reference images that users upload to Bing-hosted features such as visual search. During this reporting period, Microsoft actioned 227,823 pieces of content that were confirmed as apparent CSEAI through content moderation processes and reported to NCMEC; 94.7 percent were detected through PhotoDNA scanning and other proactive measures.
Note - Data in this report represents the period January-June 2023 and covers Microsoft consumer services including (but not limited to) OneDrive, Outlook, Skype, Xbox, and Bing. Xbox also publishes its own transparency report, outlining our approach to safety in gaming. This report does not include data for LinkedIn or GitHub, which issue their own transparency reports.
When we refer to “hosted consumer services,” we are talking about Microsoft services where Microsoft hosts content generated or uploaded by credentialed users (i.e., those logged into a Microsoft account). Examples of these services include OneDrive, Skype, Outlook and Xbox.
For this report, “content actioned” refers to when we remove a piece of user-generated content, such as an image or video, from our services and/or block user access to a piece of user-generated content.
For purposes of Bing, “content actioned” may also mean filtering or de-listing a URL from the search engine index.
For this report, “account actioned” refers to when we suspend or block access to an account, or restrict access to content within the account.
“Proactive detection” refers to Microsoft-initiated flagging of content on our services, whether through automated or manual review.
Microsoft uses hash-based scanning technologies (e.g., PhotoDNA or MD5) as well as AI-based technologies, such as text-based classifiers, image classifiers, and grooming-detection techniques.
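To illustrate the hash-based scanning described above, the sketch below shows a generic exact-match check using MD5, one of the technologies named in this report. The hash set, function name, and sample input are hypothetical placeholders for illustration only; they are not drawn from any real blocklist or Microsoft system.

```python
import hashlib

# Hypothetical set of digests of previously identified content.
# (The value below is the well-known MD5 test vector for the sample
# sentence used further down; it stands in for a real database entry.)
KNOWN_HASHES = {
    "9e107d9d372bb6826bd81d3542a419d6",
}

def matches_known_content(data: bytes) -> bool:
    """Return True if the data's MD5 digest appears in the known-hash set."""
    digest = hashlib.md5(data).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_content(b"The quick brown fox jumps over the lazy dog"))  # True
print(matches_known_content(b"an unrelated file"))                            # False
```

Exact hashing like MD5 only matches byte-identical copies; detecting re-encoded or resized duplicates is what perceptual hashing technologies such as PhotoDNA are designed for.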
“Accounts reinstated” refers to actioned accounts that were fully restored, including content and account access, upon appeal.
In 2009, Microsoft partnered with Dartmouth College to develop PhotoDNA, a technology that aids in finding and removing known images of child sexual exploitation and abuse.
PhotoDNA creates a unique digital signature (known as a “hash”) of an image, which is then compared against the signatures (hashes) of other photos to find copies of the same image. When matched against a database containing hashes of previously identified illegal child sexual abuse images, PhotoDNA helps detect, disrupt, and report the distribution of child sexual exploitation material. PhotoDNA is not facial recognition software and cannot be used to identify a person or an object in an image. A PhotoDNA hash is not reversible, meaning it cannot be used to recreate an image.
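The general idea of matching on signatures rather than raw images can be sketched with a simple “average hash,” a basic perceptual hash. This is a simplified stand-in, not PhotoDNA itself (whose algorithm is proprietary): it derives a compact bit string from image content and compares signatures by Hamming distance, so near-duplicates match even when the bytes differ. All pixel data below is made up for illustration.

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the image's mean,
    # so the signature survives small re-encoding changes.
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests a near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [200, 10]]
reencoded = [[12, 198], [201, 9]]   # same picture, slightly altered values
h1, h2 = average_hash(original), average_hash(reencoded)
print(hamming_distance(h1, h2))     # 0 -> signatures still match
```

Note that, as with PhotoDNA, the signature here cannot be reversed to recreate the image: the hash records only coarse brightness patterns, not pixel values.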
Microsoft has made PhotoDNA freely available to qualified organizations, including technology companies, law enforcement agencies, developers, and non-profit organizations.
More information can be found on the PhotoDNA site.
As explained by the National Center for Missing & Exploited Children (NCMEC), the CyberTipline “is the nation’s centralized reporting system” through which “the public and electronic service providers can make reports of suspected online enticement of children for sexual acts, extra-familial child sexual molestation, child pornography, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet.”
As a U.S.-based company, Microsoft reports all apparent CSEAI to NCMEC, as required by U.S. law. According to NCMEC, staff review each tip and work to identify a potential location for the reported incident so that the report may be made available to the appropriate law enforcement agency anywhere in the world. A CyberTip report to NCMEC can include one or multiple items.
Microsoft complies with global regulations to take action against child sexual exploitation and abuse content it discovers on its services. For example, pursuant to 18 USC 2258A, we report apparent child sexual exploitation content to the National Center for Missing and Exploited Children, which serves as a clearinghouse to notify law enforcement globally of suspected illegal child sexual exploitation content. Microsoft also leverages the derogation permitted by European Union Regulation (EU) 2021/1232 as required for its use of PhotoDNA and other detection technologies in services governed by EU Directive 2002/58/EC.