“With this inaugural Xbox Transparency Report, it is our goal to share with you more about the wide range of actions that the Xbox team takes to moderate content on our platform and create safer experiences,” the company said. Of the 4.78 million proactive enforcements, 4.33 million were aimed at detecting accounts that had been tampered with or were being used in inauthentic ways, which the report says represents 57% of all enforcements in the reporting period.
Other proactive enforcements taken by Xbox include 199,000 for adult sexual content, 87,000 for fraud and 54,000 for harassment or bullying.
According to the report, these inauthentic accounts (typically automated or bot-created) create an uneven playing field for genuine players. They affect players in multiple ways, including sending unsolicited messages (spam), facilitating cheating that disrupts play and artificially inflating friend or follower counts, among others.
Actions taken by moderation agents
When players report an account that violates the company’s policies, content moderation agents or automated systems take action, the report says. Punishments include a temporary suspension of three, seven or 14 days, or a permanent suspension.
“The length of suspension is primarily based on the offending content with repeated violations of the policies resulting in lengthier suspensions, an account being permanently banned from the service, or a potential device ban,” Microsoft said in the report.
Content moderators received over 33 million player reports during the first half of 2022, as per the report. Of those, 46% related to communications, 43% to conduct (such as cheating or unsporting behaviour) and 11% to user-generated content.