Episode Description: In this episode of Player: Engage, Greg interviews Sharon Fisher, the Head of Trust and Safety at Keywords Studios. Sharon shares her extensive experience in building and managing trust and safety teams, the challenges of moderating online communities, and the role of technology and AI in preventing harmful behavior. The conversation also covers strategies for creating safer gaming environments and the importance of well-being for trust and safety moderators.
Timestamps & Key Takeaways:
- 02:12.22 - 04:11.27: Moderation and Prevention: Sharon discusses the importance of preventive moderation and the tools needed to support moderators in identifying and addressing real-time threats.
- 07:02.04 - 09:54.65: Collaboration and Community Safety: Emphasizes the need for developers to collaborate with community teams early in the game development process to anticipate and mitigate potential risks, and highlights the importance of preventive measures that create a safer gaming environment from the start.
- 11:33.78 - 14:37.91: AI and Human Interaction in Trust and Safety: Sharon talks about the integration of AI in trust and safety, stressing the necessity of human oversight to ensure accuracy and context in moderation, and discusses the challenges of training AI models and the importance of balancing AI and human intervention.
- 18:37.93 - 20:33.75: Incentivizing Positive Engagement: Sharon proposes focusing on rewarding positive behaviors in communities rather than just punishing negative actions, and suggests creating systems that encourage constructive interactions and reduce the appeal of trolling.
Key Concepts:
- Preventive Moderation: Sharon emphasizes the importance of having preventive measures in place to quickly identify and address harmful behavior in real time. This involves using advanced tools and technologies to support moderators in their roles (a toy pre-publication filter is sketched after this list).
- Collaboration Between Developers and Community Teams: Developers need to work closely with community teams from the early stages of game development. This collaboration helps identify potential risks and implement safety features proactively.
- Balancing AI and Human Oversight: AI is becoming integral to trust and safety work, but it must be balanced with human oversight. Sharon discusses the challenges of training AI models and the importance of human moderators in ensuring accurate, contextually appropriate decisions (see the human-in-the-loop routing sketch after this list).
- Incentivizing Positive Behavior: Instead of focusing solely on punitive measures, Sharon advocates for systems that reward positive community engagement. This approach helps shape a healthier community culture and reduces the impact of negative behavior (see the reward-tier sketch after this list).
- Moderator Well-being: The well-being of trust and safety moderators is essential. Sharon shares how Keywords Studios has implemented programs to support the mental and emotional health of their moderators, leading to improved retention and overall job satisfaction.
- Brand Protection and Community Safety: Sharon discusses how proactive trust and safety measures can protect a brand’s reputation and create a safer environment for all players. She emphasizes that companies should invest in these measures not only for legal and ethical reasons but also for long-term community and business benefits.
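
To make the preventive-moderation idea concrete, here is a minimal sketch of a pre-publication filter: messages are screened before they become visible rather than cleaned up after harm is done. The terms, masking behavior, and `pre_moderate` helper are invented for illustration; production systems combine ML classifiers, curated and localized term lists, and moderator review queues.

```python
import re

# Toy pre-publication filter: screen messages *before* they appear,
# rather than removing them after the fact. The terms are placeholders.
BLOCKLIST = re.compile(r"\b(badword1|badword2)\b", re.IGNORECASE)

def pre_moderate(text: str) -> tuple[bool, str]:
    """Return (allowed, text_to_publish); offending terms are masked."""
    if BLOCKLIST.search(text):
        # Hold the original for a human moderator; never publish it raw.
        return False, BLOCKLIST.sub("***", text)
    return True, text

allowed, shown = pre_moderate("gg, badword1 was close!")
print(allowed, shown)  # False gg, *** was close!
```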
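The AI-plus-human balance Sharon describes is commonly implemented as confidence-threshold routing: the model acts alone only on high-confidence cases and defers ambiguous ones to human moderators. This sketch assumes a hypothetical toxicity score in [0, 1] from any classifier; the thresholds and `Message` type are illustrative, not something prescribed in the episode.

```python
from dataclasses import dataclass

@dataclass
class Message:
    author: str
    text: str
    toxicity: float  # hypothetical classifier score in [0.0, 1.0]

# Illustrative thresholds; real deployments tune these per community.
AUTO_ACTION = 0.95    # high confidence: AI may act on its own
HUMAN_REVIEW = 0.60   # medium confidence: needs human context

def route(msg: Message) -> str:
    """Send a message down one of three paths: auto-action, review, allow."""
    if msg.toxicity >= AUTO_ACTION:
        return "auto_remove"   # unambiguous violations handled automatically
    if msg.toxicity >= HUMAN_REVIEW:
        return "human_review"  # ambiguous: e.g. in-game banter vs. a threat
    return "allow"

print(route(Message("p1", "gg well played", 0.02)))          # allow
print(route(Message("p2", "you're dead next round", 0.71)))  # human_review
```

Keeping the auto-action threshold high is what preserves the balance Sharon stresses: the AI saves moderators from the obvious cases, while anything context-dependent still reaches a human.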
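Finally, the "reward the positive" idea can be sketched as a reputation ledger where constructive actions unlock visible perks. All action names, point values, and tiers below are made up for illustration; the point is that constructive behavior pays off visibly instead of moderation being purely punitive.

```python
from collections import defaultdict

# Invented action names and point values; negative behavior still costs,
# but positive contributions are the main driver of standing.
ACTION_POINTS = {
    "helped_new_player": 10,
    "answered_question": 5,
    "accurate_report": 3,
    "moderator_sanction": -15,
}

reputation: defaultdict[str, int] = defaultdict(int)

def record(player: str, action: str) -> None:
    reputation[player] += ACTION_POINTS.get(action, 0)

def perks(player: str) -> list[str]:
    """Cosmetic/visibility perks unlocked at positive reputation tiers."""
    tiers = [(25, "community_badge"), (50, "chat_flair"), (100, "mentor_role")]
    return [perk for threshold, perk in tiers if reputation[player] >= threshold]

for action in ["helped_new_player", "answered_question",
               "helped_new_player", "helped_new_player"]:
    record("ana", action)
print(reputation["ana"], perks("ana"))  # 35 ['community_badge']
```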