The UK government is set to implement new regulations under the Online Safety Act, which aims to strengthen protections for children online. The act requires more robust age verification: businesses must confirm that users are over 18 before granting access to adult content. The rules take effect on July 25, 2025, and apply to social media platforms, search engines, and gaming apps.
The Online Safety Act holds companies accountable for swiftly removing harmful content from their platforms, including material that promotes suicide, self-harm, or eating disorders, as well as pornography. Each organization must designate a named individual accountable for children's safety and conduct annual assessments of how risks to children on its platforms are being mitigated and managed.
According to Ofcom research, children aged eight to 17 spend an average of two to five hours online each day. The Children’s Commissioner found it extremely concerning that half of the 13-year-olds surveyed had been exposed to “hardcore, misogynistic” pornographic content, and that such material was prominently displayed across social media. Taken together, these findings underscore the urgency of the new rules designed to protect young users.
Firms will need to implement more than 40 enforceable measures to keep operating in the UK. One major change is adjusting the algorithms that determine which content appears in children’s feeds, with the aim of filtering out abusive content more precisely and fostering a safer online space for young users.
The consequences for non-compliance are severe. Ofcom’s new enforcement powers allow for penalties of up to £18 million or 10% of a company’s global revenue, whichever is greater. Senior executives could also face jail time for failures in their duty of care to keep users safe online.
Ian Russell, chairman of the Molly Rose Foundation, said he feared the new regulations would not be effective, telling the hearing he was “dismayed by the lack of ambition” in the codes meant to protect children online. The foundation, which works on suicide prevention for young people, was established in memory of his daughter, Molly Russell, who ended her life at just 14 after being targeted by harmful content on Instagram and Pinterest in 2017.
The Duke and Duchess of Sussex have also spoken out on the importance of protecting children online, saying that “not enough is being done” to address the harms caused by social media companies. Their comments reflect a growing demand from parents, educators, public health advocates, and others for stronger protections against dangerous digital content.
Critics, however, warn that the age verification methods could prove invasive or ineffective. Silkie Carlo, director of Big Brother Watch, raised major concerns about data breaches and privacy violations, and pointed to the risks of misinformation, digital inequality, and censorship that such verification processes could create.
As the deadline for these regulations approaches, businesses are under pressure to achieve compliance and to safeguard their younger audiences. The new rules represent a significant step in combatting harmful online content and in protecting children from dangerous material.