Nearly every online game struggles with toxic and abusive players. The problem goes hand in hand with the genre, though developers have recently started taking it more seriously. Blizzard is no stranger to the issue and has been working to curb toxicity in Overwatch.

For the most part, however, enforcement relies on players reporting one another, which is not the most efficient method, so Blizzard may be looking to automate the process. In an interview with Kotaku, Overwatch director Jeff Kaplan revealed that the company has been experimenting with algorithms and machine learning to help spot toxic behavior in players.

According to Kaplan, “We’ve been experimenting with machine learning. We’ve been trying to teach our games what toxic language is, which is kinda fun. The thinking there is you don’t have to wait for a report to determine that something’s toxic. Our goal is to get it so you don’t have to wait for a report to happen.”
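Blizzard has not published any details of its model, so as a purely illustrative sketch, "teaching a game what toxic language is" can be as simple in miniature as learning word probabilities from labeled chat lines. The toy Naive Bayes classifier below (all sample phrases and labels are invented for illustration) flags a new line without waiting for a player report:

```python
# Toy sketch only: not Blizzard's system. A tiny Naive Bayes text classifier
# that learns per-word probabilities from labeled chat lines, then labels
# new lines as "toxic" or "ok" without waiting for a player report.
import math
from collections import Counter

def train(samples):
    """samples: list of (text, label) pairs; returns per-label word counts."""
    counts = {"toxic": Counter(), "ok": Counter()}
    totals = Counter()
    for text, label in samples:
        for word in text.lower().split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Return the label with the higher log-likelihood, with add-one smoothing."""
    vocab = set(counts["toxic"]) | set(counts["ok"])
    best, best_score = None, float("-inf")
    for label in counts:
        score = 0.0
        for word in text.lower().split():
            score += math.log((counts[label][word] + 1) / (totals[label] + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

# Invented training data standing in for human-labeled chat reports.
training = [
    ("you are trash uninstall now", "toxic"),
    ("trash team report them all", "toxic"),
    ("nice shot well played", "ok"),
    ("good game everyone", "ok"),
]
counts, totals = train(training)
print(classify("uninstall you trash", counts, totals))  # → toxic
```

A production system would of course need far more data, context awareness, and human review, but the core idea is the same: the model, not a report queue, makes the first pass.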

Kaplan also hopes the system could eventually recognize toxic behavior without the need for any verbal or written cues. He has previously said that the time Blizzard spends addressing toxic behavior in Overwatch is slowing down updates to the game, so machine learning could let toxicity be addressed automatically and more efficiently.
