The Molly Rose Foundation found that just two sites, Pinterest and TikTok, accounted for more than 95% of the over 12 million moderation decisions on suicide and self-harm content made by six of the largest platforms.
The other four platforms covered by the study were Facebook, Instagram, Snapchat, and X, formerly known as Twitter.
According to the foundation, most platforms’ responses to this kind of content were “inconsistent, uneven, and unfit for purpose”.
It also found that X was responsible for just 1% of the suicide and self-harm content detected on the main sites examined, while Meta’s Instagram and Facebook accounted for 1% apiece.
The foundation now warns that the Online Safety Act falls short of addressing what it sees as glaring systemic flaws in social media companies’ content moderation strategies.
Ian Russell, the charity’s chairman, has urged the government to back a new online safety bill that would allow for even tougher regulation.