Online safety laws unsatisfactory, minister says
UK Technology Secretary Peter Kyle has admitted that the country’s online safety laws are “very uneven” and “unsatisfactory.” His comments came after calls from campaigners, including Ian Russell, father of Molly Russell, to strengthen regulations. Molly Russell took her own life at the age of 14 after viewing harmful content online, and Russell’s calls for action have put pressure on the government to address gaps in the existing Online Safety Act.
In a letter to Prime Minister Sir Keir Starmer, Ian Russell argued that the Online Safety Act, passed in 2023, requires immediate fixes if it is to protect people effectively from harmful content. In particular, he called for a “duty of care” to be imposed on tech companies, ensuring they are held accountable for the content shared on their platforms.
Peter Kyle, speaking to the BBC, expressed his frustration with the current state of the law, noting that the Conservative government had originally intended to compel social media companies to remove certain “legal-but-harmful” content, such as posts related to eating disorders. That provision was dropped after a backlash from Conservative lawmakers who argued the legislation risked stifling free speech, among them the party’s current leader, Kemi Badenoch, who declared in 2022 that the bill was “in no fit state to become law.”
The proposal to regulate “legal-but-harmful” content was ultimately abandoned for adult social media users, with the revised law focusing instead on giving users more control to filter out unwanted content. The legislation still requires tech companies to protect children from harmful material, however, even where that material is not illegal.
While acknowledging the unevenness of the legislation, Kyle did not commit to any immediate changes, though he expressed openness to making improvements. He emphasized that the Online Safety Act provided some “very good powers” and said he would use them assertively to tackle emerging safety concerns. In the coming months, ministers are expected to gain additional authority to ensure that online platforms provide age-appropriate content, with “very strident” sanctions for tech companies that fail to comply, according to Kyle.
A Whitehall source later clarified that the government had no plans to repeal the Online Safety Act but would instead focus on working within its existing framework. The government is, however, open to further legislation if needed, with officials aiming to be “agile and quick” in addressing rapidly evolving trends in online safety.
Ian Russell’s letter also referenced the growing influence of tech industry leaders such as Mark Zuckerberg, CEO of Meta, and Elon Musk, owner of X (formerly Twitter). Russell criticized Zuckerberg’s shift towards a more “laissez-faire, anything-goes model,” which he said risked bringing back the kind of harmful content that led to Molly’s death. These changes in the tech industry, he argued, made it all the more urgent for the government to act.
Earlier this week, Zuckerberg announced that Meta would be phasing out its fact-checking programs, opting instead for a system in which users can add “community notes” to posts they deem misleading. The shift marks a significant departure from Meta’s previous approach, under which third-party fact-checkers reviewed potentially false or harmful content on Facebook and Instagram. Zuckerberg defended the change, arguing that content moderation had become “too politically biased” and asserting that Meta needed to return to its roots of promoting free expression. The new system, which currently applies only in the US, aims to reduce the number of “innocent” posts being removed, though Zuckerberg acknowledged that it would also mean “less bad stuff” being caught by moderators.
Meta responded to criticism from Ian Russell by stating that its policies on content related to suicide, self-harm, and eating disorders would remain unchanged. The company assured that it would continue using automated systems to detect and remove high-risk content.
Peter Kyle addressed the issue by emphasizing that Meta’s changes apply only in the US and do not affect the UK. He made clear that the law in the UK remains unchanged: companies operating in the country must abide by local laws, which mandate the removal of illegal content. Duties under the Online Safety Act, due to come into force later this year, will require social media companies to remove illegal content such as child sexual abuse material, material inciting violence, and posts that facilitate suicide. Tech companies will also need to take steps to protect children from harmful content, including pornography, self-harm content, bullying, and dangerous stunts, and platforms will be expected to adopt age-assurance technologies to prevent children from accessing inappropriate material.
The law also tackles disinformation, requiring companies to take action against illegal, state-sponsored content. If their services are likely to be used by children, companies must also implement measures to protect users from misinformation.
While the UK government’s approach to online safety has evolved, the current legal framework is still being scrutinized for its effectiveness. With evolving challenges in the tech landscape, policymakers face the delicate task of ensuring online safety without overreaching into areas that could limit free speech or hinder technological progress.