'Call Of Duty' Video Game Using AI To Monitor What Players Say, Censor 'Toxic Speech'

  • Source: ZeroHedge
  • 09/08/2023
Call of Duty, a shooter video game published by Activision, has started using artificial intelligence to monitor what players say during online matches in order to flag and crack down on "toxic speech" more effectively. Online gaming looks poised to become the newest frontier of censorship.

Activision said recently on its blog that Call of Duty is doubling down on its fight against "hate speech" and other types of "toxic and disruptive behavior" among players in online chatrooms by enlisting the help of AI to identify and police player conduct.

"Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-Powered voice chat moderation technology from Modulate, to identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more," the company said in the post.

The speech-policing algorithms, which players have no ability to turn off, will monitor and record what they say in order to identify speech the company deems unfit for its virtual game spaces.
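In broad strokes, a system like the one described transcribes voice chat and checks each line against a moderation policy in real time. The following is a minimal, purely illustrative sketch of that idea; the blocklist contents, function names, and simple word-matching logic are all assumptions for demonstration, not Modulate's actual ToxMod technology, which reportedly uses more sophisticated AI classification.

```python
# Hypothetical sketch of a real-time chat moderation check.
# Not Activision's or Modulate's implementation; terms and names are placeholders.

BLOCKLIST = {"slur1", "slur2"}  # stand-in for whatever terms a policy flags


def flag_message(transcript: str) -> bool:
    """Return True if any flagged term appears in a transcribed chat line."""
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    return not BLOCKLIST.isdisjoint(words)


def moderate(transcripts):
    """Yield (line, flagged) pairs for a stream of transcribed chat lines."""
    for line in transcripts:
        yield line, flag_message(line)
```

A production system would replace the keyword check with a trained classifier scoring context and intent, but the pipeline shape, transcribe, score, enforce, is the same.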
Photo: "Gamers playing on laptops" by Fredrick Tendong, via Unsplash (unsplash.com).


© 2024 louder.news