The controversial UK legislation, which introduces a new set of content moderation rules for online platforms and designates Ofcom as the primary internet regulator, has been approved by parliament. This paves the way for Royal Assent, and the Online Safety Bill is expected to become law in the coming days.
During the final stages of the bill’s discussion in the House of Lords, Lord Parkinson of Whitley Bay reiterated the government’s aim to make the UK the safest place to be online, particularly for children. Following affirmative votes on some last-minute amendments, attention now turns to Ofcom, which says it is ready to begin implementing the legislation promptly.
The new law grants Ofcom the authority to impose fines of up to 10% of annual turnover or up to £18 million, whichever is higher, for violations of the regulatory framework.
The Online Safety Bill, previously known as the Online Harms Bill, has been in development for several years as policymakers in the UK grappled with various online safety concerns. Originally, it aimed to address illegal content such as terrorism and child sexual abuse material (CSAM), while also targeting a wide range of potentially harmful online activities, including violent content, incitement to violence, suicide encouragement, disinformation, cyberbullying, and children accessing adult material. The bill was officially published in May 2021.
Over time, the bill’s scope expanded to encompass additional responsibilities and requirements addressing a variety of safety concerns, such as trolling, fraudulent advertisements, deepfake pornography, and animal cruelty. Turnover in the governing Conservative party meant a succession of senior ministers oversaw the legislation, including Oliver Dowden and Nadine Dorries, the latter of whom pushed for criminal liability for tech CEOs to take effect swiftly.
The current Secretary of State in charge of the bill is Michelle Donelan, who scaled back its scope at the end of last year, removing provisions targeting "legal but harmful" content in response to concerns about potential impacts on free speech. Even so, civil rights and free speech advocates remain apprehensive.
Another major point of contention revolves around the potential impact on web security and privacy. The bill grants sweeping powers to Ofcom to require platforms to scan message content for illegal material, raising concerns from end-to-end encrypted services that have threatened to leave the UK unless the bill is amended to protect strong encryption. While the government has attempted to address this issue with a carefully worded ministerial statement, privacy and security experts remain vigilant.
There is also concern that the bill may usher in widespread age verification across the UK internet, as web services attempt to limit their liability by confirming users’ ages before granting access to content that could be inappropriate for minors. Jimmy Wales, the founder of Wikipedia, has criticized the bill as an instrument of state censorship and pledged not to age-gate or selectively censor articles under any circumstances.
Balancing the demands of child safety advocates, who want a fully safeguarded internet, against the concerns of digital rights, civil liberties, and human rights groups about potential infringements on democratic freedoms will now fall to Ofcom.
In a brief statement, the UK’s new web content regulator welcomed the bill’s passage through parliament and expressed readiness to implement the new rules. Dame Melanie Dawes, Ofcom’s CEO, stated that Ofcom will soon consult on the first set of standards for tech firms to meet in addressing illegal online harms, including child sexual exploitation, fraud, and terrorism.
Beyond specific concerns, there is a general worry about the extensive regulatory burden that the legislation will impose on the UK’s digital economy. It applies not only to major social media platforms but also to numerous smaller and less well-resourced online services that must comply or face significant penalties.