Real-time moderation solution Bodyguard.ai signs new partnership with Team BDS to tackle online hate in esports

  • 72% of people who play games online experience harassment.
  • 4 in every 10 messages targeting players after a defeat are toxic.
  • 92% think solutions should be implemented to reduce toxic behaviour in multiplayer games.
  • This first-of-its-kind partnership will help advance the fight against online hate in esports.

Real-time gaming moderation solution provider Bodyguard.ai today announces a ground-breaking partnership with Swiss esports organisation Team BDS that aims to lay down a major marker in the battle against online hate in esports.

Team BDS is a formidable player in the esports arena. As one of the fastest-growing teams in the world, in 2022 alone it has won the last Rainbow Six Major in Sweden and the Rocket League Championship Series, and reached the final of the League of Legends European Masters.

Online toxicity continues to be a major issue within the gaming community, which seems particularly prone to the internet’s culture of conflict. From insults to threats, the vast majority of gamers are subject to hateful content when playing online. Analysis from Unity Games shows that 72% of gamers experience harassment while playing online, and 92% think solutions should be implemented to reduce toxic behaviour in multiplayer games. One in four women say that toxicity has made them no longer want to play online multiplayer games, according to Women in Games.

Globally, the esports audience will reach 532 million by the end of 2022 and is expected to grow to 640.8 million by 2025, with sponsorships accounting for almost 60% of its revenue. Overall, the live-streaming audience in video gaming reached almost 810 million in 2021 and is expected to reach 1.41 billion by 2025. This huge growth comes with the potential to expose even more players and fans to toxicity online.

Through the partnership with Bodyguard.ai, Team BDS is taking a strong stance against all forms of toxicity by becoming the very first esports team to invest in an automated moderation solution for the wellbeing of its players, fans and partners.

Arnaud Chemin, Head of Gaming at Bodyguard.ai, comments: “The gaming sector, and in particular the esports market, is booming. We know that millions of people across the world enjoy competing and watching their favourite players and teams. We are thrilled to announce our partnership with Team BDS, one of the fastest-growing esports teams in the world, to help protect its athletes and fans from online hate. We are delighted to be able to work with such a formidable gaming enterprise to raise awareness of the need to address toxicity in the gaming arena and to challenge the industry to make it a better, safer, freer environment for all.”

Bodyguard.ai has been supporting Team BDS since April 2022, connecting to its social accounts including Twitch, Instagram and Twitter. The full-scale partnership, however, will give the team access to the full suite of Bodyguard.ai gaming services.

Bodyguard.ai’s automated moderation service can moderate millions of comments in seconds across Twitch, YouTube, chat rooms, in-game chat, forums, fan communities and other social platforms, with a 90% success rate.

Users are provided with an easy-to-use single solution capable of providing a detailed analysis of any gaming community, spanning all player behaviour and interactions. Not only does this allow organisations to reward positive behaviour through comprehensive insights and the ability to filter on specific categories, but it also allows action to be taken against those sharing toxic content.
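The workflow described above (classify each comment, filter out toxic ones, and surface category-level insights) can be illustrated with a minimal, purely hypothetical Python sketch. This is not Bodyguard.ai’s technology or API; the keyword rules below stand in for real contextual analysis, and every name is invented for illustration only.

# Hypothetical sketch of an automated comment-moderation pipeline.
# NOT Bodyguard.ai's actual system; toy keyword rules stand in for
# a real contextual classification model.
from collections import Counter
from dataclasses import dataclass

# Invented example categories and trigger words.
TOXIC_KEYWORDS = {
    "insult": {"trash", "loser", "noob"},
    "threat": {"hurt you", "find you"},
}

@dataclass
class Verdict:
    comment: str
    category: str   # "insult", "threat", or "clean"
    remove: bool    # whether the comment should be hidden

def classify(comment: str) -> Verdict:
    """Assign a comment to a toxicity category using toy keyword rules."""
    lowered = comment.lower()
    for category, keywords in TOXIC_KEYWORDS.items():
        if any(word in lowered for word in keywords):
            return Verdict(comment, category, remove=True)
    return Verdict(comment, "clean", remove=False)

def moderate(comments: list[str]) -> tuple[list[str], Counter]:
    """Filter a batch of comments and count verdicts per category."""
    kept, stats = [], Counter()
    for comment in comments:
        verdict = classify(comment)
        stats[verdict.category] += 1
        if not verdict.remove:
            kept.append(comment)
    return kept, stats

if __name__ == "__main__":
    chat = ["gg, well played!", "you are trash, uninstall", "nice clutch round"]
    kept, stats = moderate(chat)
    print(kept)   # ['gg, well played!', 'nice clutch round']
    print(stats)  # Counter({'clean': 2, 'insult': 1})

In a production system the keyword lookup would be replaced by a contextual model, and the per-category counts are what would feed the kind of community insights and positive-behaviour reporting described above.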

At a time when toxicity in the gaming community is rife, automated solutions can have a massive impact. If left unchecked, problems can soon snowball, causing both targeted and non-targeted players to leave to avoid toxic behaviour. Not only can this leave organisations and communities with dwindling numbers, it can also harm reputations and damage brand image.

Chemin adds: “It is clear that gamers are subject to significant levels of online abuse, and that must change. Behind every toxic comment there is a real person who may be significantly affected by the online hate and abuse they are subjected to. Interactions are an essential part of the player experience, and gaming organisations need to protect their communities from disruptive behaviour and harassment. With an automated moderation solution, conversations will stay healthy and players will remain happy and protected, making the sport better for everyone.”

 
