Game developers are looking for ways to clamp down on abusive language online, and the massively popular shooter franchise Call of Duty is notorious among gamers for toxic players, as a 2022 study reinforced. Now publisher Activision is trying to do more about it, with help from AI.
Call of Duty is turning to artificial intelligence to moderate voice chats in its online player community with the launch of a new ToxMod feature, Activision said Wednesday. The feature, developed in collaboration with Boston-based AI startup Modulate, is now live in North America in the games Call of Duty: Modern Warfare II and Call of Duty: Warzone.
ToxMod uses artificial intelligence to identify and act on toxic speech, including hate speech, discriminatory language, and harassment, in real time. It will expand globally (excluding Asia) on November 10 with the launch of the latest Call of Duty title, Modern Warfare III.
“This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system,” Activision said.
Author: Jason Nelson