It’s just a game

Toxicity is something every multiplayer gamer has to deal with. Whether you play League of Legends, Minecraft Bedwars, or Roblox, no game is safe from it. What are game developers doing to combat toxicity? And should we just accept it as part of the experience, or are we better than that?

Toxicity

So what exactly is toxicity? Toxicity is the general term for any offensive, mean, or purposefully counter-productive behaviour and language used by players. It can range from someone throwing slurs in the text chat to a player deliberately making sure a match is lost. Toxicity is part of the darker side of gaming and esports. While trash talk and the occasional offensive joke are often seen as part of the deal, there is a limit that has to be set. Being called slurs or having your games ruined just because some players think that's okay should not be accepted as simply part of online communities and esports. Everyone should be able to play games without being told they don't belong every other match.

What are the devs doing?

Every multiplayer game out there has its own system or method for dealing with the toxic part of its player base. Some games lean on a reward system that favours players who don't display toxic behaviour. Many include an endorsement system in which players can endorse each other's good behaviour and are rewarded for being seen as positive members of the player base. For example, the 6v6 shooter Overwatch rewards players with weekly loot boxes, the number of which depends on their endorsement level. Other games feed the endorsement system into their matchmaking, preferentially matching highly endorsed players with one another so that the positive members of the community mostly play with others who behave just as well.
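To make the matchmaking idea concrete, here is a minimal sketch of what endorsement-aware matchmaking could look like. It does not reflect any specific game's actual implementation; the Player fields, the endorsement scale, and the bucketing threshold are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    skill_rating: int       # hypothetical skill metric used for normal matchmaking
    endorsement_level: int  # e.g. 0 (unendorsed) to 5 (highly endorsed)

def bucket_by_endorsement(queue, threshold=3):
    """Split the matchmaking queue so that highly endorsed players
    are preferentially matched with one another."""
    positive = [p for p in queue if p.endorsement_level >= threshold]
    rest = [p for p in queue if p.endorsement_level < threshold]
    return positive, rest

# Matchmaking would then pair players within each bucket as usual,
# e.g. by skill rating.
positive, rest = bucket_by_endorsement([
    Player("Ana", 2400, 4),
    Player("Rein", 2350, 1),
    Player("Mercy", 2500, 5),
])
print([p.name for p in positive])  # ['Ana', 'Mercy']
```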

Another popular method is automatic chat detection that flags unsavoury and offensive language. These days it is rare for a multiplayer game not to have such a system active. When a player is caught regularly using toxic or offensive language, they can face punishments ranging from a chat ban to being banned from playing entirely. The downside of this approach is that players keep finding creative ways to slip past the filters, and with many multiplayer games now free to play, banned players can often simply create a new account.
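As a rough illustration, here is a sketch of chat detection at its most naive: keyword matching with escalating punishments. Real systems are far more sophisticated; the word list, thresholds, and penalty tiers here are all invented for the example.

```python
import re
from collections import defaultdict

# Placeholder word list; a real filter would be far larger,
# multilingual, and constantly updated.
BLOCKED_WORDS = {"slurone", "slurtwo"}

offence_counts = defaultdict(int)  # running tally per account

def check_message(player_id, message):
    """Return a penalty for this message, or None if it is clean."""
    words = set(re.findall(r"[a-z]+", message.lower()))
    if not words & BLOCKED_WORDS:
        return None
    offence_counts[player_id] += 1
    count = offence_counts[player_id]
    if count >= 10:
        return "account ban"  # repeat offenders escalate to a full ban
    if count >= 3:
        return "chat ban"
    return "warning"
```

The sketch also exposes exactly the two weaknesses described above: a creative spelling like "sl_rone" never matches, and because the tally is keyed to the account, a fresh free-to-play account starts with a clean slate.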

The biggest challenge is finding a system or method that is most effective at weeding out toxic players. While a reporting system might work as a baseline, it can also be abused: players who did nothing wrong can be mass-reported and end up banned because of it. In the end, it seems human intervention is needed, both to review reports of toxic behaviour and to actually change that behaviour in the long run.
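One mitigation for mass-report abuse, sketched below with invented names and thresholds, is to weight each report by the reporter's track record and to flag players for human review rather than banning them automatically.

```python
def report_weight(upheld, total):
    """Weight a reporter by how often their past reports were upheld.
    New reporters start at a neutral 0.5 (an arbitrary prior)."""
    return 0.5 if total == 0 else upheld / total

def should_flag_for_review(reports, threshold=5.0):
    """Queue a player for human moderation once the *weighted* report
    score passes a threshold, instead of acting on raw report counts.
    `reports` is a list of (upheld, total) histories, one per reporter."""
    score = sum(report_weight(u, t) for u, t in reports)
    return score >= threshold
```

Under this scheme a handful of reports from reporters with good track records outweighs a flood of reports from accounts whose reports are rarely upheld, and the final call still rests with a human moderator.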

We can do better

I think we can all do better when it comes to the toxic behaviour we encounter, or perhaps even exhibit, as players. When we discuss how to deal with toxicity, the default answer should not be telling people to mute their teammates and give up integral parts of competitive gaming like voice chat.