The International Olympic Committee (IOC) estimates that the 2024 Summer Olympics will generate more than half a billion social media posts, and that does not even include the comments. Assuming the average post is 10 words, that’s around 6,400 times the length of the King James Bible. If you spent one second on each post, it would take around 16 years to read them all.
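To see where those figures come from, here is a quick back-of-envelope calculation. The post count, average post length, and Bible word count (roughly 783,000 words) are working assumptions for illustration, not official IOC figures.

```python
# Rough scale check for the numbers above (all inputs are assumptions).
posts = 500_000_000          # "more than half a billion" posts
words_per_post = 10          # assumed average post length
kjv_words = 783_000          # approximate word count of the King James Bible

total_words = posts * words_per_post
print(f"Total words:        {total_words:,}")                     # 5,000,000,000
print(f"King James Bibles:  {total_words / kjv_words:,.0f}")      # about 6,400
print(f"Years at 1 s/post:  {posts / (60 * 60 * 24 * 365):.1f}")  # about 15.9
```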
Many of those posts will include people’s names, and upwards of 15,000 athletes and 2,000 officials will be subject to a mountain of attention over the coming weeks.
Cheering and expressions of national pride will come side by side with hate, coordinated abuse, harassment, and even threats of violence.
This poses a serious risk to Olympians’ mental health, not to mention their performance at the Games. But the IOC is exploring a new solution. The next time you post about the Olympics in the coming weeks, an AI-powered system will review your words to keep athletes safe from cyberbullying and abuse.
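The IOC has not published the inner workings of its system, but abuse detection of this kind typically relies on a machine-learning text classifier that scores each post for toxicity. The sketch below is purely illustrative: it uses the open-source Detoxify model rather than anything the IOC has confirmed it uses, and the flagging threshold is an arbitrary choice.

```python
# Illustrative sketch only: a pretrained toxicity classifier scoring posts.
# Detoxify (https://github.com/unitaryai/detoxify) is an open-source model,
# not the IOC's system; the 0.8 threshold is an arbitrary assumption.
from detoxify import Detoxify

model = Detoxify("original")  # pretrained English toxicity model

posts = [
    "What a race, congratulations!",
    "You embarrassed the whole country, never show your face again.",
]

for post in posts:
    scores = model.predict(post)   # e.g. {'toxicity': 0.96, 'insult': ..., ...}
    flagged = scores["toxicity"] > 0.8
    print("FLAG" if flagged else "ok  ", f"{scores['toxicity']:.2f}", post)
```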
Online abuse has become a growing issue within elite sports, with many high-profile athletes, particularly women, calling for more to be done to protect them. American tennis player Sloane Stephens, for example, revealed she received more than 2,000 abusive messages after one match. England footballer Jude Bellingham has also spoken out over the racist abuse that he and other players receive on a regular basis. The English Football Association recently announced it is funding a police unit to prosecute people who abuse players.