Activision plans to introduce a new system in its upcoming shooter, Call of Duty: Modern Warfare III (2023), designed to deal with toxic players. It will use artificial intelligence to moderate voice chats in real time.
To that end, the publisher has brought in the team behind ToxMod, an AI moderation tool, which will power the filter. The system will detect toxic speech, block offending users, and send them special notifications. Moderation will support 14 languages. The AI has already been rolled out in the previous entry in the series, though only on North American servers.
As a reminder, Call of Duty: Modern Warfare III releases on November 10.
Tarsier Studios has unveiled a teaser for its new game, featuring a talking pig warning of danger. The full announcement will take place at Gamescom 2024 on August 20. A Little Nightmares threequel from Supermassive Games is also in development.