Call of Duty’s AI-powered anti-toxicity voice recognition has already detected 2 million accounts

Key Takeaways:

– Activision implemented a real-time voice moderation tool in Call of Duty games to combat toxic speech.
– The AI-powered moderation system was first introduced in Modern Warfare 2 and Warzone in August 2023.
– The system has detected over two million accounts engaging in toxic behavior.
– Only one in five players reports toxic behavior, highlighting the importance of active reporting.
– The moderation system takes action against players even if their behavior goes unreported.
– Call of Duty has seen an 8% reduction in repeat offenders and a 50% reduction in players exposed to severe instances of disruptive voice chat since the moderation system was implemented.
– Violators can face consequences such as being muted or having social features restricted.
– Activision plans to continue evolving and expanding the moderation system, including adding new languages.
– Call of Duty is committed to combating toxicity and creating a fair and fun gaming experience for all players.

TechRadar:

Call of Duty’s anti-toxicity voice chat moderation system has detected more than two million accounts that are being investigated.

Last year, Activision announced that it would be adding a new real-time voice moderation tool to its more recent Call of Duty games that would “enforce against toxic speech” by detecting things like “hate speech, discriminatory language, harassment, and more” from players.

AI Eclipse TLDR:

Activision has announced that the anti-toxicity voice chat moderation system in its Call of Duty games has detected more than two million accounts, which are currently under investigation. The system was implemented last year to enforce the Call of Duty Code of Conduct against toxic speech by detecting hate speech, discriminatory language, harassment, and more. The AI-powered moderation tool was initially beta-tested in Modern Warfare 2 and Warzone in August 2023, in English only. It was later expanded to Call of Duty: Modern Warfare 3 and rolled out globally (except in Asia) with added support for Spanish and Portuguese.

According to Activision, the moderation system has taken action against accounts that engaged in disruptive voice chat in violation of the Code of Conduct. However, the company noted that only one in five players reports toxic behavior and speech, highlighting the need for active reporting. Activision has rolled out messages thanking players for reporting and plans to provide additional feedback when it takes action on reports. The company reported an 8% reduction in repeat offenders and a 50% reduction in players exposed to severe instances of disruptive voice chat since the voice moderation system was implemented.

Violators of the Code of Conduct can face consequences such as being globally muted from voice and text chat or having other social features restricted. Activision aims to continue evolving and expanding its moderation technology, including adding new languages to the voice moderation system in future updates. The company is committed to combating toxicity and ensuring a fair and fun gaming experience for all players.
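To make the enforcement flow described above more concrete, here is a minimal, purely illustrative Python sketch of how a voice moderation pipeline might escalate from detection to penalties. Activision has not published its implementation, so every name, category, and threshold below (Account, PENALTY_LADDER, handle_detection, and so on) is a hypothetical assumption, not Call of Duty’s actual system.

```python
# Hypothetical sketch of an escalating voice-chat moderation flow.
# Nothing here reflects Activision's actual implementation; all names,
# categories, and thresholds are illustrative assumptions.

from dataclasses import dataclass, field
from enum import Enum


class Category(Enum):
    # Detection categories named in the article.
    HATE_SPEECH = "hate speech"
    DISCRIMINATORY_LANGUAGE = "discriminatory language"
    HARASSMENT = "harassment"


# Assumed penalty ladder: repeat offenses bring harsher consequences,
# mirroring the mutes and social-feature restrictions the article mentions.
PENALTY_LADDER = ["warning", "voice_and_text_mute", "social_features_restricted"]


@dataclass
class Account:
    account_id: str
    offense_count: int = 0
    penalties: list[str] = field(default_factory=list)


def handle_detection(account: Account, category: Category, reported: bool) -> str:
    """Apply the next penalty on the ladder for a detected offense."""
    account.offense_count += 1
    # Act on detections even without a player report, since (per the
    # article) only about one in five players files one.
    step = min(account.offense_count, len(PENALTY_LADDER)) - 1
    penalty = PENALTY_LADDER[step]
    account.penalties.append(penalty)
    source = "player report" if reported else "automated detection"
    return f"{account.account_id}: {penalty} for {category.value} ({source})"


if __name__ == "__main__":
    acct = Account("player#1234")
    print(handle_detection(acct, Category.HARASSMENT, reported=False))
    print(handle_detection(acct, Category.HATE_SPEECH, reported=True))
```

The escalation here is a simple design choice for the sketch; the real system reportedly reviews flagged accounts before enforcement rather than acting on raw detections alone.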