
Game Developers Employing AI to Combat Online Harassment in Voice Chat

Unity Technologies has unveiled a new tool that uses artificial intelligence (AI) to tackle toxicity in online games. The tool, called Safe Voice and currently in closed beta, lets game developers quickly identify and review toxic interactions. Hi-Rez’s Rogue Company has already used the tool during early testing.

Safe Voice analyzes signals such as tone, loudness, intonation, emotion, pitch, and context to detect toxic behavior. Once a player raises a concern, the tool monitors the situation and generates a report for human moderators to assess. Those reports feed an overview dashboard that lets moderators evaluate individual incidents and spot trends over time, helping them put effective moderation strategies in place. Unity has also said that Safe Voice is only the first of several upcoming solutions for combating toxicity.
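To make the idea concrete, here is a minimal, purely illustrative sketch of how per-clip signals like the ones listed above might be combined into a single score and a moderator-facing report. The feature names, weights, and threshold are assumptions for illustration only and do not reflect Unity's actual Safe Voice model or API.

```python
from dataclasses import dataclass

# Hypothetical feature weights -- illustrative only, not Unity's actual model.
WEIGHTS = {
    "tone": 0.25,
    "loudness": 0.15,
    "intonation": 0.10,
    "emotion": 0.25,
    "pitch": 0.05,
    "context": 0.20,
}

@dataclass
class VoiceClipFeatures:
    """Per-clip signals, each normalized to the 0..1 range."""
    tone: float
    loudness: float
    intonation: float
    emotion: float
    pitch: float
    context: float

def toxicity_score(features: VoiceClipFeatures) -> float:
    """Combine the per-signal scores into a single 0..1 toxicity estimate."""
    return sum(WEIGHTS[name] * getattr(features, name) for name in WEIGHTS)

def build_report(player_id: str, clips: list[VoiceClipFeatures],
                 threshold: float = 0.7) -> dict:
    """Summarize flagged clips for a human moderator to review."""
    scores = [toxicity_score(c) for c in clips]
    flagged = [s for s in scores if s >= threshold]
    return {
        "player": player_id,
        "clips_reviewed": len(clips),
        "clips_flagged": len(flagged),
        "peak_score": max(scores, default=0.0),
        "needs_human_review": bool(flagged),
    }

if __name__ == "__main__":
    sample = [VoiceClipFeatures(0.9, 0.8, 0.6, 0.85, 0.4, 0.7),
              VoiceClipFeatures(0.2, 0.3, 0.1, 0.15, 0.2, 0.1)]
    print(build_report("player-123", sample))
```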

Mark Whitten, Unity’s president of Create Solutions, emphasized the impact toxicity can have on a player’s experience, stating, “It’s one of the number one reasons that people leave a game and stop playing because there’s some sort of bad situation around toxicity and other elements of abuse.” Recognizing the significance of this issue, Safe Voice aims to contribute to a safer gaming environment.

In February, Hi-Rez Studios announced its partnership with Unity, introducing a new voice chat recording system and beginning testing of the tool. Rogue Company’s lead producer praised Safe Voice, noting its effectiveness in identifying and mitigating problems before they escalate.

During the testing phase, Unity focused on flagging issues accurately and reducing how much work falls to human moderators, and the tool succeeded on both counts. Instead of inundating developers with tens of thousands of reports, Safe Voice prioritizes and narrows the queue to the cases most likely to be problematic. While automation has become a charged topic in the tech industry, Whitten emphasizes that Safe Voice is meant to supplement human moderation teams and lighten their workload.
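As a rough sketch of what that triage might look like, the snippet below ranks incoming reports by an automated severity score and surfaces only a short review queue instead of the raw report firehose. The report fields, the repeat-offender boost, and the queue size are hypothetical and are not part of any published Safe Voice interface.

```python
from typing import NamedTuple

class Report(NamedTuple):
    report_id: str
    severity: float       # 0..1 score from the automated analysis
    repeat_offender: bool

def prioritize(reports: list[Report], queue_size: int = 50) -> list[Report]:
    """Return a small, ranked review queue for human moderators."""
    # Boost repeat offenders so persistent problems surface first.
    def priority(r: Report) -> float:
        return r.severity + (0.2 if r.repeat_offender else 0.0)
    return sorted(reports, key=priority, reverse=True)[:queue_size]

if __name__ == "__main__":
    incoming = [Report(f"r{i}", severity=i / 100, repeat_offender=(i % 7 == 0))
                for i in range(100)]
    for r in prioritize(incoming, queue_size=5):
        print(r.report_id, round(r.severity, 2), r.repeat_offender)
```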

Whitten remarks, “I think this is an efficiency gain and not a replacement scenario…any day that I could replace a human who has to deal with looking at inappropriate things, I would happily do it.” Acknowledging how draining it is to review inappropriate content, Whitten argues that screening out such behavior up front lets moderators focus on taking appropriate action while protecting their well-being.

Once the screening process provides data to the moderators, it is the responsibility of individual game studios to define their policies and ensure consistent disciplinary measures. To respect user privacy, voice recording requires separate consent from players, independent of other online agreements.

On data storage, Whitten says Unity anonymizes the data in its databases and connects it to player identities only within the game, for moderation purposes. Recorded data is subsequently deleted from the services to protect user privacy.
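One common way to achieve that kind of separation is pseudonymization, sketched below under assumed details: the stored record carries only a keyed hash of the player ID, and the key that maps it back to a real account stays with the game studio. This is an illustrative pattern, not a description of Unity's actual data pipeline.

```python
import hashlib
import hmac

def pseudonymize(player_id: str, studio_key: bytes) -> str:
    """Derive a stable pseudonym; without studio_key it cannot be tied back to the player."""
    return hmac.new(studio_key, player_id.encode(), hashlib.sha256).hexdigest()

def store_incident(player_id: str, score: float, studio_key: bytes) -> dict:
    """What the shared database keeps: a pseudonym and a score, not the raw identity or audio."""
    return {"subject": pseudonymize(player_id, studio_key), "toxicity_score": score}

if __name__ == "__main__":
    key = b"example-studio-secret"  # hypothetical key, held only by the game studio
    record = store_incident("player-123", 0.82, key)
    print(record)
    # The studio can re-derive the same pseudonym to link the record back to the
    # account for moderation, then delete the underlying recording.
    assert record["subject"] == pseudonymize("player-123", key)
```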

Game publishers have been actively exploring various avenues to combat online toxicity. Ubisoft, for instance, provided a platform for players to report incidents directly to local law enforcement. Additionally, Ubisoft teamed up with Riot Games last year to collaborate on an AI anti-toxicity project. In a recent development, Microsoft announced that Xbox Live users would be able to share voice clips with moderators in an effort to address toxicity.

In conclusion, Unity Technologies’ introduction of the Safe Voice tool represents a significant step forward in tackling toxicity in online gaming. By leveraging AI technology, this tool empowers developers to swiftly identify and address toxic behavior, creating a safer and more enjoyable gaming environment for all players.
