Online Safety Bill set to become law

As part of its manifesto commitments, the government promised to introduce a bill that would strengthen online safety in the UK, particularly for minors. That commitment has now been met with the Online Safety Bill, which has passed through all its parliamentary stages and is awaiting Royal Assent.

The bill provides for a new regulatory framework with the general purpose of regulating internet services and making their use safer for individuals in the UK.

The bill places new regulatory duties on social media companies to hold them responsible for the content that they host. This landmark legislation is designed to mitigate the harms associated with online interactions, including cyberbullying and the dissemination of illegal content.

What action does the bill require social media companies to take?

Under the bill, social media platforms will be expected to:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

What penalties will the bill enforce?

The bill provides a framework whereby companies can be fined significant amounts if they do not act swiftly to prevent and remove illegal content, and to protect minors from seeing material that is harmful to them.

Ofcom is to be made the regulator for online safety and will have the power to fine companies up to £18 million, or 10% of their global annual revenue, whichever is higher. This represents a significant change and echoes the accountability principle set out in current UK data protection legislation.

When the bill receives Royal Assent, Ofcom intends to consult on a set of standards for social media companies to meet in tackling online harms, including child sexual exploitation, fraud, and terrorism.

Further, the bill provides for the directors of these companies to be held directly liable, and potentially to face prison time, for these offences.

The bill also introduces provisions that will make it easier to convict individuals who share intimate images without consent, and creates new offences of cyber-flashing and sharing non-consensual deepfake pornography.

What are the difficulties with the law?

The bill has been, and remains, particularly controversial. At just over 300 pages it is very comprehensive, although critics have called it long and convoluted.

It is not yet clear how it will work alongside human rights protections such as freedom of speech, or how it will coexist with UK GDPR legislation protecting privacy and personal data. The messaging service WhatsApp, for example, has already threatened to refuse to comply with powers in the bill that could require it to examine the contents of encrypted messages for illegal content.

Further, Wikipedia, the online encyclopaedia, has said that it will not comply with the requirement to conduct age checks on users, as doing so would violate its commitment to collecting minimal data about readers and contributors.

It is not yet clear how this will play out in practice, but the bill seems likely to face legal scrutiny in the courts once it comes into force, and it will certainly have a large impact on how social media companies operate in the UK.

About this article

Disclaimer
This information is for guidance purposes only and should not be regarded as a substitute for taking legal advice. Please refer to the full General Notices on our website.

Lucy Densham Brown

Solicitor

+44 118 960 4655
