Teen accounts

Instagram’s new feature, Teen Accounts, seeks to give younger users more protection and more control over their account settings. The initiative, an important step toward making the platform safer, aims to mitigate serious risks associated with adolescents’ use of the platform. However, independent research commissioned by campaigners suggests that even with these new accounts, young users will continue to face serious risks.
The introduction of Teen Accounts allows existing users to transition to this enhanced setting, while new users will automatically be assigned a Teen Account upon registration. Meta, Instagram’s parent company, states that these accounts “provide built-in protections for teens limiting who’s contacting them, the content they can see, and the time spent on our apps.”
Against this backdrop of assurances, American researcher Becca Spinks has discovered some troubling evidence. Her research uncovered an Instagram group of 65,000 members that encouraged young users to harm themselves. Spinks expressed her shock at the discovery, stating, “I was absolutely floored to see 65,000 members of a community.” She further described the graphic nature of some content in the group, noting, “It was so graphic, there were people in there taking polls on where they should cut next.”
Concerns about the effectiveness of Teen Accounts are compounded by Instagram’s algorithms, which still promote “sexualised imagery, harmful beauty ideals and other negative stereotypes.” Campaigners argue that these problems undermine the very safety protections the new accounts are claimed to provide.
Baroness Beeban Kidron, a leading campaigner for children’s safety online, said that Instagram’s move fell short. “This is no place for a teen,” she said. She explained that the platform fails to properly verify users’ ages and frequently pushes adult content to minors. Kidron also criticized Instagram for exposing young people to commercial pressures, emphasized the need for greater awareness of the risks involved, and called the overall environment “deeply sexualized.”
The UK government is taking steps to enforce stricter regulation through the Online Safety Act, which requires platforms to demonstrate within three months that they have systems in place to protect children. Even X, the platform formerly known as Twitter, has made an unequivocal commitment to comply, pledging to implement the Act’s requirements in the UK. “We have clear rules in place to protect the safety of the service and the people using it,” X remarked.
On Tuesday, Meta announced that it was extending these requirements: the enhanced protections will now come into effect automatically for teens in the UK. What’s more, users under 16 will need parental permission to switch their accounts back to public mode.