Microsoft’s Xbox Publishes First Transparency Report on Content Moderation

  • Company took action against more than 4.3 million accounts
  • Increased its proactive moderation ninefold from a year earlier

A Microsoft Xbox One video game controller. Photographer: Michael Ciaglo/Bloomberg

Microsoft Corp.’s Xbox released its first transparency report on Monday, detailing how the gaming giant moderates content for its 3 billion global players.

Xbox took action against more than 4.3 million inauthentic accounts between January and June, according to the report, after increasing its proactive moderation ninefold compared with the same period a year earlier. These inauthentic accounts are typically automated or bot accounts that can be used to trick or harass players with spam, facilitate cheating, inflate friend or follower counts, or launch distributed denial-of-service (DDoS) attacks.