FIFA will use the upcoming World Cup to deploy AI designed to shield footballers and their followers from abusive messages directed at them on social media.
Euro 2020’s legacy was tarnished by the abusive messages that players received on social media, particularly those aimed at the England players who missed penalties in the final against Italy. In fact, more than half of the players involved in the Euro 2020 and Afcon 2022 finals received some form of discriminatory abuse.
At the upcoming World Cup, therefore, FIFA will deploy Threat Matrix – a proactive monitoring and analysis service that recognises keywords (as well as images and emojis), flags abusive social media content, and compiles evidence to take to relevant authorities. This approach is expected to improve player protection in both the short and the long term.
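FIFA has not published how Threat Matrix works internally, but the basic idea of keyword- and emoji-based flagging can be illustrated with a minimal sketch. The blocklists, message text and print-out below are purely hypothetical placeholders, not FIFA's system or its data.

```python
import re

# Placeholder blocklists for illustration only; a real service would rely on
# curated, multilingual term lists plus image and emoji classifiers.
BLOCKED_TERMS = {"abusive_term_1", "abusive_term_2"}
BLOCKED_EMOJIS = {"\U0001F92C"}  # example: the "face with symbols on mouth" emoji

def flag_message(text: str) -> bool:
    """Return True if the message contains a blocked term or emoji."""
    words = set(re.findall(r"\w+", text.lower()))
    return bool(words & BLOCKED_TERMS) or any(e in text for e in BLOCKED_EMOJIS)

messages = [
    "Great game last night!",
    "You are an abusive_term_1 \U0001F92C",
]
for m in messages:
    # Flagged posts would be hidden from the player's feed and retained as
    # evidence for reporting, mirroring the two roles described above.
    print(flag_message(m), m)
```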
FIFA President Gianni Infantino said: “Our duty is to protect football, and that starts with the players who bring so much joy and happiness to all of us by their exploits on the field of play. Unfortunately, there is a trend developing where a percentage of posts on social media channels directed towards players, coaches, match officials and the teams themselves is not acceptable, and this form of discrimination – like any form of discrimination – has no place in football.”