Call of Duty’s AI-powered anti-toxicity voice recognition has already detected 2 million accounts

Call of Duty’s anti-toxicity voice chat moderation system has flagged more than two million accounts, which are now under investigation.

Last year, Activision announced that it would implement a new real-time voice moderation tool in its more recent Call of Duty games to ‘enforce against toxic speech’ by detecting things like ‘hate speech, discriminatory language, harassment, and more’ from players.

A beta version of the AI-powered moderation system was added to Modern Warfare 2 and Warzone in August 2023 in North