Digital technologies allow people across the globe to share information, news, and opinions that, together, span a broad range of human expression. Unfortunately, some people use online platforms and services to exploit the darkest sides of humanity, which diminishes both safety and the free exchange of ideas.

At Microsoft, we believe digital safety is a shared responsibility requiring a whole-of-society approach. This means that the private sector, academic researchers, civil society, and governmental and intergovernmental actors all work together to address challenges that are too complex – and too important – for any one group to tackle alone. 

For our part, we prohibit certain content and conduct on our services, and we enforce rules that we’ve set to help keep our customers safe. We use a combination of automated detection and human content moderation to remove violating content and suspend accounts. Additional information is available on Microsoft’s Digital Safety site.

The Microsoft Services Agreement includes a Code of Conduct, which outlines what’s allowed and what’s prohibited when using a Microsoft account. Some services offer additional guidance, such as the Community Standards for Xbox, to show how the Code of Conduct applies on their services. Reporting violations of the Code of Conduct is critical to helping keep our online communities safe for everyone. More information on how to report violating content and conduct is included below.

Non-consensual intimate imagery

Practices

Microsoft takes seriously the harms caused by the sharing of non-consensual sexually intimate imagery. In many circumstances, sharing sexually intimate images of another person without that person’s consent violates their personal privacy and dignity. Microsoft prohibits the distribution of non-consensual intimate imagery (NCII). Microsoft also prohibits content soliciting NCII or advocating for the production or redistribution of intimate imagery without the subject’s consent.

Processes and systems

NCII Prevention and Detection

Any member of the public can request the removal of a nude or sexually explicit image or video of themselves that has been shared without their consent through this web form. Once violating content is reviewed and confirmed, Microsoft removes reported links to photos and videos from search results in Bing globally and/or removes access to the content itself when shared on Microsoft hosted consumer services. This applies to both real and synthetic (“deepfake”) imagery.

Non-consensual intimate imagery removal requests, January-June 2023

            Requests reported    Requests actioned    Percentage of requests actioned
TOTAL       1,145                482                  42%

Note: Numbers are aggregated across Bing and Microsoft hosted consumer services for which a content removal request was received during this reporting period.
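
For reference, “Percentage of requests actioned” is calculated as requests actioned divided by requests reported; for the total above, 482 ÷ 1,145 ≈ 42.1%, shown as 42%.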

 

FAQ

General questions about this report

This report addresses Microsoft consumer services, including (but not limited to) OneDrive, Outlook, Skype, Xbox and Bing. Xbox also publishes its own transparency report, outlining our approach to safety in gaming. This report does not include data for LinkedIn or GitHub, which issue their own transparency reports.

When we refer to “hosted consumer services,” we are talking about Microsoft services where Microsoft hosts content generated or uploaded by credentialed users (i.e., those logged into a Microsoft account). Examples of these services include OneDrive, Skype, Outlook and Xbox.

For this report, “content actioned” refers to when we remove a piece of user-generated content from our services and/or block user access to a piece of user-generated content. For purposes of Bing, “content actioned” may also mean filtering or de-listing a URL from the search engine index.

For this report, “account actioned” refers to when we suspend or block access to an account, or restrict access to content within the account.

“Proactive detection” refers to Microsoft-initiated flagging of content on our services, whether through automated or manual review.

Microsoft uses scanning technologies (e.g., PhotoDNA or MD5 hash matching) as well as AI-based technologies, such as text-based classifiers, image classifiers, and grooming detection techniques.
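
To illustrate how hash-based scanning works in general terms, the minimal sketch below checks a file’s MD5 digest against a hypothetical list of hashes of previously identified violating images. The names used here (KNOWN_HASHES, upload.jpg) are illustrative assumptions, not Microsoft systems, and perceptual matching such as PhotoDNA is proprietary and far more robust to resizing and re-encoding than an exact MD5 match.

```python
# Illustrative sketch only: exact-match hashing (MD5) against a hypothetical
# list of hashes of known violating images. Real perceptual hashing (e.g.,
# PhotoDNA) tolerates re-encoding and small edits; an exact digest does not.
import hashlib
from pathlib import Path

# Hypothetical hash list; in practice this would come from a vetted database.
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",
}


def md5_of_file(path: Path) -> str:
    """Return the hex MD5 digest of a file, read in chunks to bound memory."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_content(path: Path) -> bool:
    """True if the file's digest matches a previously identified item."""
    return md5_of_file(path) in KNOWN_HASHES


if __name__ == "__main__":
    sample = Path("upload.jpg")  # hypothetical uploaded file
    if sample.exists():
        print(f"{sample}: match={is_known_content(sample)}")
```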

“Accounts reinstated” refers to actioned accounts that were fully restored upon appeal, including content and account access.

In July 2015, Microsoft announced its approach to non-consensual intimate imagery, also referred to as “revenge porn”: the sharing of nude or sexually explicit photos or videos online without consent. At that time, we said we would report the number of takedown requests in our transparency reports. A removal request is a request from an individual to have NCII removed from Microsoft services.

In previous years, we reported this as “non-consensual pornography.” However, we have updated the term to “non-consensual intimate imagery” to ensure that the language we use to refer to this type of violation is respectful to victims and reflects its intrusive and damaging nature.

Microsoft has a dedicated web form for reporting NCII, which gives guidance on what steps can be taken.