Understanding the Challenge
Rec Room, a popular social gaming platform, has made significant strides in reducing toxicity among its users. Over the past 18 months, its trust and safety team, working with Modulate, has implemented a range of strategies and tools to improve player experience and safety. The focus has been on using ToxMod, a machine-learning-based voice chat moderation system, to create a more welcoming environment for its 100 million users. The aim is to change player behavior and foster a positive community.
Key Strategies and Results
- Continuous voice moderation was introduced across all public rooms to set clear behavior expectations.
- A series of tests was conducted to determine effective responses to violations, including varying mute durations and warning types.
- The introduction of a one-hour mute for violations proved effective in reducing toxic behavior.
- It was discovered that a small group of players was responsible for a significant number of violations, prompting targeted interventions.
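The escalating approach described above can be sketched as a simple policy function. This is a minimal illustration only; the thresholds, action names, and review flag are assumptions for the sketch, not Rec Room's or ToxMod's production logic:

```python
from collections import Counter

ONE_HOUR = 3600  # mute duration in seconds; the one-hour mute proved effective

def respond_to_violation(violation_count: int) -> dict:
    """Map a player's running violation count to a moderation action.

    Hypothetical escalation: warn first, then apply a one-hour mute,
    then flag habitual offenders for targeted intervention.
    """
    if violation_count <= 1:
        return {"action": "warn", "mute_seconds": 0}
    if violation_count <= 3:
        return {"action": "mute", "mute_seconds": ONE_HOUR}
    return {"action": "flag_for_review", "mute_seconds": ONE_HOUR}

def repeat_offenders(violations: list[str], threshold: int = 3) -> set[str]:
    """Identify the small group of players behind many violations.

    `violations` is a list of player IDs, one entry per recorded violation.
    """
    counts = Counter(violations)
    return {player for player, n in counts.items() if n >= threshold}
```

A policy like this makes the escalation ladder explicit and auditable, and `repeat_offenders` captures the insight that targeted interventions can focus on the few players generating most violations.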
The Bigger Picture
Creating a safer gaming environment is essential for player retention and engagement. The success of Rec Room in reducing toxicity not only improves the player experience but also encourages positive interactions among users. As the platform evolves, ongoing adjustments to moderation strategies will be necessary to address emerging challenges. The use of AI-powered tools like ToxMod is key to this evolution, enabling the identification of both negative and positive behaviors within the community. Ultimately, these efforts contribute to a thriving gaming space where players feel secure and valued.