How Rec Room Successfully Reduced Toxicity in Player Voice Chat by 70%

Presented by Modulate

The trust and safety team at Rec Room, a popular social gaming platform, has achieved remarkable success in reducing toxicity over the past 18 months. In this VB Spotlight, we explore the metrics, tools, and strategies that have enhanced player happiness, increased engagement, and transformed the gaming experience.

Watch free on demand now!

Enhancing player experience and safety should be a priority for game developers. In a recent VB Spotlight, Mark Frumkin, director of account management at Modulate, and Yasmin Hussain, head of trust and safety at Rec Room, discussed effective strategies to protect players from toxicity. They shared insights on Rec Room’s collaboration with ToxMod, a machine learning-driven voice chat moderation solution.

Launched in 2016, Rec Room boasts over 100 million lifetime users. Players engage in real-time interactions through text and voice chat across various platforms, including PC, mobile, VR headsets, and consoles, all while using customizable avatars.

“Rec Room was designed to create a space filled with countless worlds and rooms created not only by us but also by our players,” Hussain noted. “Trust and safety are essential to that vision.”

However, real-time voice interactions inevitably attract some players who behave inappropriately. How can developers change the behavior of those not adhering to community standards?

Over the past year, Rec Room has successfully reduced instances of toxic voice chat by approximately 70%, according to Hussain, although this progress did not occur overnight.

Tackling Toxicity Step by Step

The initial step involved implementing continuous voice moderation across all public rooms, setting clear expectations for player behavior. Next, the team focused on determining the most effective responses for misbehavior. They conducted various tests, experimenting with different lengths for mutes and bans, as well as two types of warnings: a strict warning and one that offered positive reinforcement.

Their findings revealed that instant detection paired with a one-hour mute significantly curtailed bad behavior. This immediate feedback served as a strong reminder that toxicity would not be tolerated, reducing violations in the moment while keeping players engaged.

Although this approach didn't eliminate toxicity entirely, it marked notable progress. On further investigation, the team discovered that a small percentage of players were responsible for the majority of violations. How could they effectively address this group?

“There was a clear link between a few players and a large number of violations, which prompted us to design further experiments,” Hussain explained. “By adjusting our interventions—like issuing an initial mute or warning, followed by subsequent mutes—we aim to create a cumulative effect that encourages learning. We are seeing promising results from this strategy.”
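The article doesn't spell out Rec Room's exact rules, but the escalating-intervention idea can be pictured as a simple policy ladder: a first violation earns a warning, and repeat violations earn progressively longer mutes or bans. The sketch below is illustrative only; the thresholds, durations, and names are assumptions, not Rec Room's or ToxMod's actual configuration.

```python
from dataclasses import dataclass

# Illustrative escalation ladder -- steps and durations are assumptions,
# not Rec Room's actual moderation policy.
ESCALATION_STEPS = [
    ("warning", 0),        # first violation: warning only
    ("mute", 60),          # second violation: one-hour mute (minutes)
    ("mute", 24 * 60),     # third violation: 24-hour mute
    ("ban", 7 * 24 * 60),  # further violations: temporary ban
]


@dataclass
class Intervention:
    action: str
    duration_minutes: int


def choose_intervention(prior_violations: int) -> Intervention:
    """Pick an intervention based on how many prior violations a player has."""
    step = min(prior_violations, len(ESCALATION_STEPS) - 1)
    action, duration = ESCALATION_STEPS[step]
    return Intervention(action, duration)


if __name__ == "__main__":
    for count in range(5):
        print(count, choose_intervention(count))
```

The point of such a ladder is the cumulative effect Hussain describes: each repeated violation triggers a firmer, longer-lasting consequence, giving players repeated chances to adjust before harsher penalties apply.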

Implementing Experimentation in Trust and Safety

Frumkin emphasized the importance of tracking specific metrics to refine moderation strategies. Key data points include what players are saying, how often violations occur, and the profiles of repeat offenders.
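In practice, those metrics come from aggregating moderation events. As a rough, hedged sketch, assuming a log of violation records with hypothetical field names (not ToxMod's actual data model), the frequency counts and repeat-offender list Frumkin mentions could be derived like this:

```python
from collections import Counter

# Hypothetical moderation log entries; field names and values are
# illustrative only, not ToxMod's or Rec Room's actual schema.
violations = [
    {"player_id": "p1", "category": "harassment"},
    {"player_id": "p2", "category": "hate_speech"},
    {"player_id": "p1", "category": "harassment"},
    {"player_id": "p1", "category": "spam"},
]

# Violation frequency per player -- the aggregate that surfaces the small
# group of players responsible for most violations.
per_player = Counter(v["player_id"] for v in violations)
repeat_offenders = {pid for pid, count in per_player.items() if count >= 3}

# Breakdown by violation category, e.g. to see which policies are hit most.
per_category = Counter(v["category"] for v in violations)

print("violations per player:", dict(per_player))
print("repeat offenders:", repeat_offenders)
print("violations by category:", dict(per_category))
```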

Establishing a clear hypothesis upfront is crucial. “The hypothesis is key,” Hussain stated. “When we tested different interventions to reduce violations, it was distinct from our efforts to change specific player behaviors.”

Iteration is vital for learning and refining strategies, but experiments must also run long enough to collect meaningful data and influence player conduct.

“We want players to adhere to community standards and become positive contributors. That often requires unlearning behaviors developed over time,” Hussain added. “Three to six weeks is typically needed for players to adapt to this new normal.”

Nevertheless, challenges persist. Progress in one area can lead to the emergence of new issues, necessitating ongoing adjustments to moderation techniques. While real-time speech moderation is complex, the Rec Room team feels confident in their interventions’ accuracy and their players' increasing sense of safety.

“We’ve made significant strides in reducing violations, with approximately 90% of our players reporting feeling safe and welcome in Rec Room,” Hussain noted. “It’s vital not only that justice is done, but that players see it being done, reinforcing that our community standards are upheld.”

The Future of AI-Powered Voice Moderation

To make Rec Room a safer and more enjoyable environment, ToxMod continuously analyzes data related to policy violations and player interactions. It’s essential that moderation evolves, not only discouraging toxic behavior but also promoting actions that enhance the player experience.

“We’re developing the ability to identify pro-social behaviors,” Frumkin mentioned. “Recognizing players who are supportive or good at de-escalating tense situations allows us to highlight role models within the community. Amplifying positive influences can significantly enhance the environment.”

Voice moderation, particularly in real-time audio, poses considerable challenges. However, AI-powered tools are revolutionizing moderation strategies and expanding the capabilities of development teams.

“This advancement allows us to raise our ambitions. What seemed impossible yesterday is now achievable,” Hussain remarked. “We're witnessing considerable improvements in the efficiency and effectiveness of machine learning technologies, presenting new opportunities to prioritize community safety.”

To learn more about tackling toxicity in gaming, strategies for changing player behavior, and the transformative impact of machine learning, don’t miss this informative VB Spotlight, available for free on demand.

Watch free now!

Agenda

- Voice moderation tactics to detect hate and harassment

- Rec Room’s success and insights in developing a voice moderation strategy

- Key takeaways from voice moderation data every developer should track

- The link between reducing toxicity and increasing player retention and engagement

Presenters

Yasmin Hussain, Head of Trust & Safety, Rec Room

Mark Frumkin, Director of Account Management, Modulate

Rachel Kaser, Technology Writer, Moderator
