Even as moderation tools grow more effective, the scope of player content and communication continues to expand, and online discourse has grown more heated than ever. At a recent GamesBeat Next panel, industry experts discussed best practices for moderating game communities of all sizes and demographics. The conversation is timely as privacy, safety, and trust regulations gain prominence in the gaming industry, and as AI takes on a larger role in detecting and addressing harmful or toxic content, reshaping content moderation strategies in games.
Proactivity in Content Moderation
“While child safety and user safety have been incredibly important in gaming, even before social media, we are lagging behind in our approach to content moderation,” says Tomer Poran, VP of solution strategy at ActiveFence. He emphasizes that gaming is now shifting toward proactivity, mirroring the shift social media made a few years ago. Gaming companies are investing more in sophisticated, proactive content moderation tools, and advances in technology have made voice moderation both possible and affordable, enabling developers to address harmful speech in games.
“Every single company we’ve ever rolled out with, what they say is, I knew it was bad, but I didn’t think it was this bad,” adds Hank Howie, game industry evangelist at Modulate. “The stuff that gets said online, the harm that can get done. And now the technology is there. We can stop it.”
As the technology grows more sophisticated, developers can fine-tune their moderation strategies around the unique characteristics of their audience. Different games serve very different demographics, which calls for tailored flagging systems. Safety by design becomes crucial: deciding which product features are needed and establishing guidelines for monitoring and enforcement from the start.
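To make the idea of audience-tailored flagging concrete, here is a minimal, hypothetical sketch of what demographic-tuned moderation thresholds could look like. None of this reflects any panelist's actual implementation; the category names, threshold values, and the stubbed classifier are illustrative assumptions only.

```python
# Hypothetical sketch of a demographic-tuned flagging system. Categories,
# thresholds, and the classifier stub are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class AudiencePolicy:
    """Per-game moderation policy: stricter thresholds for younger audiences."""
    profanity_threshold: float   # flag when the classifier score exceeds this
    harassment_threshold: float
    escalate_to_human: float     # scores above this go to a human moderator


# Example policies for two very different audiences (values are placeholders).
POLICIES = {
    "kids_title": AudiencePolicy(profanity_threshold=0.2,
                                 harassment_threshold=0.2,
                                 escalate_to_human=0.5),
    "mature_shooter": AudiencePolicy(profanity_threshold=0.8,
                                     harassment_threshold=0.5,
                                     escalate_to_human=0.9),
}


def classify(message: str) -> dict[str, float]:
    """Stand-in for a real toxicity classifier (text or transcribed voice)."""
    return {"profanity": 0.0, "harassment": 0.0}


def moderate(game: str, message: str) -> str:
    """Return an action for a player message under the game's audience policy."""
    policy = POLICIES[game]
    scores = classify(message)
    if max(scores.values()) >= policy.escalate_to_human:
        return "escalate"          # route to a human moderator
    if (scores["profanity"] >= policy.profanity_threshold
            or scores["harassment"] >= policy.harassment_threshold):
        return "flag"              # automated action, e.g. mute or warning
    return "allow"


print(moderate("kids_title", "gg everyone"))  # -> "allow" with the stub classifier
```

The design choice the sketch illustrates is simply that the same classifier output can drive very different actions depending on who the game is built for, which is what "tailored flagging" means in practice.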
Engaging the Community in Moderation
Actively involving the game community is important in tackling trolls and toxic behavior. By making it clear that certain behaviors are not tolerated, communities can be shaped in a positive direction. “We’ve seen stats on our side where if you can cut out the toxicity, we’ve been able to reduce churn 15-20 percent with new players and with returning players,” Howie explains. “Once they see they can get into a game and not get yelled at, not get made to feel badly, they stick around.”
“Once your community understands that certain behaviors will no longer be tolerated, certain things can no longer be said, you’d be surprised at how quickly they snap into line,” Howie adds. “They play more. I would hazard a guess they spend more, too.”
Maintaining an effective code of conduct is crucial, but it also needs to evolve and adapt to new situations over time. Schell Games, which focuses on younger audiences, faced unexpected challenges with toxic behavior and cheating: its beta community turned out to be kinder than its live game community, prompting policy revisions. Continual feedback and ongoing engagement with moderators are vital for a code of conduct to remain effective.
“Things like radicalization, recruitment, and then there’s always racism and misogyny — it’s all coming together at the same time,” Howie acknowledges. “The technology is there to meet it. The time is excellent to really take a look at this area and say, we just need to clean it up and make it better for our players.”