Summary
With Call of Duty: Modern Warfare 3 drawing closer to release, it is clear that the title is aiming to shake up much of the formula established by its recent predecessors. Beyond a slew of changes to past game mechanics, the title is even aiming to alter how player interaction is handled in its online modes.
Specifically, Call of Duty: Modern Warfare 3 recently made waves with news that it will employ an ambitious AI-led moderation tool for multiplayer voice chat. This is largely a response to Call of Duty's longstanding association with online trash talk, and it arrives alongside industry-wide measures to tackle toxic encounters across competitive online shooters. The AI is likely to become a key feature of CoD multiplayer, permanently changing the way online communication takes place in-game.

Call of Duty: Modern Warfare 3’s Revolutionary AI Moderation Tool
Activision recently confirmed that a new AI moderation technology called "ToxMod" will be rolled out for the multiplayer of Call of Duty: Modern Warfare 3. Using state-of-the-art machine learning techniques, the software is trained to detect a range of harmful speech within the game's voice chat, including racism, ableism, sexism, and more. The software has already been trialed in North America in other Activision titles like MW2, and the company is clearly confident that it can now be rolled out globally in time for MW3's release in November.
As it stands, ToxMod will flag any language it deems inappropriate, creating a report that must then be manually reviewed by an employee. This allows the software to continue learning without erroneously banning anyone while the technology is still in its relative infancy. It is unknown what sanctions will be placed on players found to break Activision's hate speech guidelines, but they will likely be severe, taking the form of suspensions and bans.
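In broad strokes, this flag-then-review flow works like an escalation queue: the model surfaces suspect clips, and a human makes the final call. The sketch below is purely illustrative, since ToxMod's actual implementation has not been made public; every name, category, and threshold in it is an assumption for demonstration only.

```python
# Illustrative sketch only: ToxMod's real pipeline is not public, and every
# name, category, and threshold here is an assumption for demonstration.
from dataclasses import dataclass
from typing import List

FLAG_CATEGORIES = {"racism", "ableism", "sexism", "harassment"}
CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff before a clip is escalated


@dataclass
class VoiceClipReport:
    player_id: str
    transcript: str
    category: str
    confidence: float
    reviewed: bool = False
    action: str = "none"


def flag_clip(player_id: str, transcript: str, category: str,
              confidence: float, queue: List[VoiceClipReport]) -> None:
    """Escalate a suspect clip to the human review queue; never auto-ban."""
    if category in FLAG_CATEGORIES and confidence >= CONFIDENCE_THRESHOLD:
        queue.append(VoiceClipReport(player_id, transcript, category, confidence))


def human_review(report: VoiceClipReport, violates_policy: bool) -> None:
    """A moderator makes the final decision on sanctions."""
    report.reviewed = True
    report.action = "suspension" if violates_policy else "none"


# Example usage
review_queue: List[VoiceClipReport] = []
flag_clip("player_123", "<offensive transcript>", "harassment", 0.92, review_queue)
for report in review_queue:
    human_review(report, violates_policy=True)
    print(report.player_id, report.action)
```

The key design point the article describes is that the model only flags and files reports; punitive decisions stay with a human reviewer while the system is still learning.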
This concentrated effort on Activision's part to tackle hate speech reflects the negative opinions that surround Call of Duty's online experience. Ever since voice chat in games became commonplace, CoD has been synonymous with trash talking across the gaming industry, often involving exactly the kind of speech that Activision is now targeting with ToxMod. Reporting this behavior has rarely led to bans for offending players, yet the AI software in MW3 could be revolutionary for finally holding people accountable.
News of ToxMod's introduction to Call of Duty came shortly after Microsoft announced plans to add a voice chat reporting system to Xbox. With that feature allowing players to capture in-game voice chat as evidence for reports, the wider video game industry appears to be cracking down on unfiltered multiplayer voice interaction alongside CoD. ToxMod may currently seem like a drastic change to how CoD has historically played, but the software is likely to become a staple of the franchise's online safety for the foreseeable future.
With AI as a whole making such leaps in recent years, it was only a matter of time before it made its mark on the gaming space. Given the longstanding toxicity associated with Call of Duty's multiplayer, it is no wonder that Activision has decided to use AI to make identifying and punishing online offenders a streamlined and consistent process.
Call of Duty: Modern Warfare 3 launches on November 10 for PC, PS4, PS5, Xbox One, and Xbox Series X.