Online Safety Bill set to become law

As part of its manifesto commitments, the government promised to introduce a bill that would strengthen online safety in the UK, particularly for minors. That commitment has now been met with the Online Safety Bill, which has passed through all of its parliamentary stages and is awaiting Royal Assent.

The bill provides for a new regulatory framework which has the general purpose of making the use of internet services regulated and safer for individuals in the UK.

The bill places new regulatory duties on social media companies to hold them responsible for the content that they host. This landmark legislation is designed to mitigate the harms associated with online interactions, including cyberbullying and the dissemination of illegal content.

What action does the bill require social media companies to take?

Under the bill, social media platforms will be expected to:

  • remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • prevent children from accessing harmful and age-inappropriate content
  • enforce age limits and age-checking measures
  • ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • provide parents and children with clear and accessible ways to report problems online when they do arise

What penalties will the bill enforce?

The bill provides a framework under which companies can be fined significant amounts if they do not act swiftly to prevent and remove illegal content and to protect minors from material that is harmful to them.

Ofcom is to be made the regulator for online safety and will have the power to fine companies up to £18 million or 10% of their global annual revenue, whichever is higher. This represents a significant change and reflects the accountability principle set out in current UK data protection legislation.

When the bill receives Royal Assent, Ofcom intends to consult on a set of standards for social media companies to meet in tackling online harms, including child sexual exploitation, fraud, and terrorism.

Further, the bill provides for the directors of these companies to be held directly liable, and potentially to face prison time, for these offences.

The bill also introduces provisions which will make it easier to convict individuals who share intimate images without consent, as well as new offences for cyber-flashing and the sharing of non-consensual deepfake pornography.


What are the difficulties with the law?

The bill has been, and remains, particularly controversial. At just over 300 pages it is very comprehensive, although critics have called it long and convoluted.

It is not yet clear how the bill will work alongside human rights protections such as freedom of speech, or how it will coexist with UK GDPR legislation protecting privacy and personal data. The messaging service WhatsApp, for example, has already threatened to refuse to comply with the powers in the bill, as it could be required to examine the contents of encrypted messages for illegal content.

Further, Wikipedia, the online encyclopaedia, has said that it will not comply with the requirement to conduct age checks on users, as doing so would violate its commitment to collect minimal data about readers and contributors.

It is not yet clear how this will play out in practice, but it seems likely that the bill will face legal scrutiny in the courts once it comes into force, and it will certainly have a significant impact on how social media companies operate in the UK.

About this article

Disclaimer
This information is for guidance purposes only and should not be regarded as a substitute for taking legal advice. Please refer to the full General Notices on our website.

Lucy Densham Brown

Solicitor

