Chapter 5
Gaming needs to play it safe to secure its future
Gaming is the fastest-growing element of the entertainment industry. And while console and PC gaming have their hardcore, predominantly mature fans, most of this growth is thanks to Generation Z and the continued global appeal of casual smartphone gaming. That means organizations should focus on keeping elements of CX such as simple troubleshooting, onboarding and in-app or in-game purchases as frictionless as possible. Many of the consumers who now count as gamers may well enjoy playing games, but they would never consciously identify themselves as gamers.
And because they don’t fit the traditional personas, organizations need to view these players as being at the very start of the gaming learning curve: players who will need a high-touch, low-friction approach to customer care.
But as well as simplicity, gamers need safety. Organizations will need to redouble their efforts when it comes to policing the in-game environment. This is especially important as gaming advances towards immersive virtual worlds that offer a preview of the shape of the third iteration of the web.
A persistent challenge within the gaming industry that existed long before the metaverse became a buzzword is the need to proactively protect players. Cyberbullying and offensive comments and content are a serious, growing problem.
And in-game chat offers more than an opportunity to offend anonymously: it also exposes online gaming portals and platforms to fraud and social engineering.
The industry must therefore address this issue by applying a balanced mix of human and technological content moderation and fraud detection. This approach will protect brand image, reduce customer churn and lay the foundation for features and capabilities that support the next digital evolution of gaming.
And it’s not just gaming — trust, safety and active content moderation services are required across the entire entertainment industry. For instance, website comment sections and social media communities are growing in importance for publishers and legacy media as a means of increasing reader engagement and the amount of time spent within their owned ecosystems and portals. Unfortunately, these practices can also open publications to greater risk of legal action. The debate is still ongoing as to whether social networks are publishers or merely curation platforms that are therefore exempt from responsibility for the content shared and opinions expressed across their networks. However, by definition, news organizations are publishers, and as such they need to take proactive steps to moderate discussion.
Takeaways
Optimize your existing CX so it serves a wider range of potential customers.
Create safe, inclusive gaming experiences through trust, safety and moderation.
Keeping players safe now will keep them as customers as you develop new gaming experiences and environments.
Spotlight
Picking the right moderation service
The need for effective and proactive content moderation exists across the entirety of the media and entertainment industry. However, within gaming, the problem of inappropriate content, hate speech and even attempts at fraud is more acute and more likely to generate negative headlines within the news media.
But as the debate regarding what should and shouldn’t be allowed on platforms intensifies, so does the conversation around the health and wellbeing of the moderators charged with vetting user-generated content.
It’s for this reason that media, entertainment and gaming organizations need a solution that not only blends the latest technology with human decision making but also provides the support necessary so that moderators’ health and wellbeing are protected and always front of mind.
Correctly applied, AI is effective at flagging content that is non-compliant, infringes copyright or uses unacceptable language. However, AI alone cannot effectively police online gaming platforms, in-game chat or newspaper user communities and forums. Human understanding, particularly as it pertains to linguistic and cultural nuance as well as brand personality, is essential in delivering a solution that benefits both organizations and customers.
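As an illustration of how that blend might work in practice, the following is a minimal sketch of a human-in-the-loop triage step, written in Python. The thresholds, names and routing logic here are illustrative assumptions, not a reference implementation; a production system would tune them per policy area, language and content type.

```python
from dataclasses import dataclass

# Hypothetical thresholds on a classifier's estimated probability
# that a piece of content violates policy.
AUTO_REMOVE = 0.95  # high confidence: act automatically
AUTO_ALLOW = 0.10   # low confidence: publish without review


@dataclass
class Decision:
    content_id: str
    action: str  # "remove", "allow" or "human_review"
    score: float
    reason: str


def triage(content_id: str, score: float) -> Decision:
    """Route content on a model score; ambiguous cases go to a
    trained human moderator instead of being auto-actioned."""
    if score >= AUTO_REMOVE:
        return Decision(content_id, "remove", score, "clear policy violation")
    if score <= AUTO_ALLOW:
        return Decision(content_id, "allow", score, "no violation signal")
    # Sarcasm, reclaimed terms, cultural context and brand-voice
    # judgments land in this band, where humans decide.
    return Decision(content_id, "human_review", score,
                    "ambiguous; needs human judgment")


print(triage("comment-881", 0.42))
```

The key design choice is the middle band: rather than forcing the model to make a call on ambiguous content, the system routes it to trained moderators, which is where linguistic and cultural nuance matters most.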
To be considered effective, any solution aimed at moderating content or uploads should be able to do the following (see the sketch after this list for how such outcomes might be recorded):
- Review and disposition inappropriate, non-compliant and harmful user-generated content uploads, comments, submissions and engagement
- Review and moderate advertisements to ensure ad rules and regulations are adhered to, such as disclosures, infringements, deceptive content and industry-specific regulatory content
- Authenticate user accounts based on an individual organization’s standards and policies
- Quickly detect fraud and mitigate the risk posed by unauthorized users and activities and by the promotion of counterfeit goods and services
- Identify and remove content in violation of regulations governing copyright and intellectual property.
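To make those review outcomes concrete, here is a minimal sketch, again in Python, of how a moderation case and its disposition might be recorded. The disposition taxonomy and field names are illustrative assumptions rather than a reference schema; real platforms define their own categories of violations and actions mapped to their standards and policies.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class Disposition(Enum):
    APPROVE = "approve"            # compliant content stays up
    REMOVE = "remove"              # harmful or non-compliant UGC
    AD_REJECT = "ad_reject"        # ad breaches disclosure or industry rules
    ACCOUNT_HOLD = "account_hold"  # failed authentication or fraud signals
    IP_TAKEDOWN = "ip_takedown"    # copyright or IP violation


@dataclass
class ModerationCase:
    content_id: str
    disposition: Disposition
    reviewed_by: str  # "model" for automated calls, else a moderator ID
    notes: str = ""
    decided_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


# Example: a human moderator acts on a listing promoting counterfeit goods.
case = ModerationCase("listing-123", Disposition.ACCOUNT_HOLD, "mod-42",
                      "promotion of counterfeit goods")
print(case)
```

Recording every decision this way, whether made by a model or a person, also gives organizations the audit trail they need to track performance and defend decisions if challenged.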
However, this approach will fail to deliver unless it also has built-in safeguards to protect and support moderators. These protections should include:
- Using clearly defined profiles at the hiring stage to ensure moderators have the requisite resiliency for the role
- A proven approach to training that encompasses mental health, wellbeing and coping skills as well as tool usage and legal requirements
- Limited and monitored shifts with shift duration tied to the types of moderation being undertaken
- A clearly defined support network for regular check-ins and ad-hoc support
- Proof that the organization is tracking health and wellbeing alongside other more traditional performance metrics and making any necessary adjustments in real time.