Tuesday marks the launch of Instagram’s new “teen accounts” in the US, UK, Canada, and Australia.
For all under-18s, the accounts will switch on a number of privacy settings by default, such as hiding their content from non-followers and requiring them to explicitly approve each new follower.
However, minors aged 13 to 15 can only loosen these settings by adding a parent or guardian to their account.
Social media companies around the world are under pressure to make their platforms safer, amid concerns that not enough is being done to protect children from harmful content.
The NSPCC, a children’s charity in the United Kingdom, called Instagram’s announcement a “step in the right direction.”
However, the charity also cautioned that account settings can “highlight the necessity for parents and kids to protect themselves.”
Rani Govender, the NSPCC’s online child safety policy manager, said these efforts must be backed by proactive measures that stop harmful content and sexual abuse from spreading on Instagram in the first place.
Meta characterizes the changes as a “new experience for teens, guided by parents,” saying they will “better support parents and give them peace of mind that their teens are safe with the right protections in place.”
Ian Russell, whose 14-year-old daughter Molly viewed content about self-harm and suicide on Instagram before taking her own life, told reporters about her experience.