Activision has announced new measures to combat toxicity within Call of Duty, confirming it’ll be introducing what it calls “global real-time voice chat moderation” alongside the launch of Modern Warfare 3 on 10th November, with a US beta trial of the feature starting today.

Call of Duty’s new voice chat moderation system will employ AI-powered technology from Modulate to identify and enforce against toxic speech in real-time, with flagged language including hate speech, discrimination, and harassment.

An initial beta for Call of Duty’s new voice moderation system is being rolled out across Warzone and Modern Warfare 2 starting today, 30th August, in North America, and a global release will coincide with Modern Warfare 3’s arrival in November. Activision notes the tools will only support English at first, with additional languages coming “at a later date”.

In a Q&A accompanying today’s announcement, Activision explains the AI-powered system will only be responsible for identifying and reporting perceived offences for further review – attaching a behaviour category and rated level of severity to each submission – and that the publisher itself will determine how each violation is enforced based on its Security and Enforcement Policy.