The battle against misbehavior in gaming continues, and one area more and more companies have been trying to address lately is voice chat. Riot Games, for instance, has been recording voice chat with the intention of reviewing it when players are reported, and Xbox recently announced that players will be able to capture voice clips of unpleasant interactions themselves and submit them as reports.
Now Activision wants to join in as well. In a post on its website, the company announced that with the release of Call of Duty: Modern Warfare III on November 10, it will begin a collaboration with Modulate to moderate voice chat. Behind the initiative is Modulate's solution ToxMod, an AI-driven technology designed to identify inappropriate content in voice chat in real time so that appropriate action can be taken.
A beta version of the technology has reportedly been in use in North America since August 30, initially in Modern Warfare II and Warzone.