Activision plans to introduce a new system in its upcoming 2023 shooter, Call of Duty: Modern Warfare 3, designed to deal with toxic players. Powered by artificial intelligence, it will moderate voice chats in real time.
To that end, the publisher has partnered with the team behind ToxMod, an AI tool that will handle the filtering. The system will detect abusive speech, block offending users, and send them dedicated notifications. Moderation will support 14 languages, and the AI has already gone live in the previous installment of the series (though only on North American servers).
As a reminder, Call of Duty: Modern Warfare 3 is set to release on November 10. The shooter has also received a content update.