UNITED KINGDOM: Online Safety Act

The United Kingdom Parliament passed the Online Safety Act 2023 to regulate media and expression on the internet. It was published as a draft bill on May 12, 2021, and received its third reading on September 19, 2023. The BBC reports that the Act, which grew out of the 2019 Online Harms White Paper, allows the relevant Secretary of State to designate which speech and media are considered "harmful" and to have them recorded or censored, subject to parliamentary approval. The Act received royal assent on October 26, 2023, and became UK law.

Despite expert concerns that such a scanning system cannot be deployed without jeopardizing users' privacy, the Act requires services, including end-to-end encrypted messengers, to scan for child sexual abuse material. The Act also confers new functions and powers on Ofcom, the regulator and competition authority for the UK communications industries. Ofcom regulates the TV and radio sectors, fixed-line telecoms, mobiles, postal services, and the airwaves over which wireless devices operate.


The Act covers user-to-user services, defined as internet services by means of which content that is generated directly on the service by a user, or uploaded to or shared on the service by a user, may be encountered by another user or users of the service. Examples include instant messaging and chat platforms, social media photo and video sharing services, online and mobile games, search services, and pornographic websites.

The Act excludes from its scope a part of a regulated search service if:

(a) the only user-generated content enabled by that part of the service is content of any of the following kinds:

(i) emails, SMS and MMS messages, one-to-one live aural communications and related identifying content;

(ii) content arising in connection with any of the activities described in paragraph 4(1) of Schedule 1 (comments etc. on provider content); and

(b) no regulated provider pornographic content is published or displayed on that part of the service.

The Act requires in-scope services to address concerns brought about by content and behaviour on their mobile applications and websites.


The Act imposes a new duty of care on online platforms, requiring them to protect their users from illegal content and from lawful but "harmful" material.

The goal is to prevent illicit content from appearing in the first place, not merely to remove content already on the web. Platforms will have to consider how they design their services in order to reduce the likelihood that they will be used for illegal purposes.

The following are examples of illegal content that platforms must remove:

  1. child sexual abuse;

  2. controlling or coercive behaviour;

  3. extreme sexual violence;

  4. fraud;

  5. hate crime;

  6. inciting violence;

  7. illegal immigration and people smuggling;

  8. promoting or facilitating suicide;

  9. promoting self harm;

  10. revenge porn;

  11. selling illegal drugs or weapons;

  12. sexual exploitation.

Platforms that disregard this obligation risk fines of up to £18 million or 10% of their annual revenue, whichever is greater. The Act also gives Ofcom the authority to block access to specific websites. It requires large social media companies to preserve access to journalistic or "democratically important" content, such as user comments on political parties and issues, rather than removing it.
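The penalty ceiling described above is a simple "greater of" rule. As a minimal sketch (the function name and the assumption that "revenue" means a single annual figure in pounds are illustrative, not taken from the Act's text):

```python
def max_fine(annual_revenue_gbp: float) -> float:
    """Ceiling on a fine under the Act: the greater of
    £18 million or 10% of annual revenue."""
    return max(18_000_000, 0.10 * annual_revenue_gbp)

# For a platform with £50m revenue, 10% is £5m, so the £18m floor applies.
print(max_fine(50_000_000))      # 18000000
# For a platform with £1bn revenue, 10% is £100m, which exceeds the floor.
print(max_fine(1_000_000_000))   # 100000000.0
```

This shows why the rule scales with company size: for small platforms the fixed £18 million figure dominates, while for the largest platforms the 10% term does.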

The Act faced criticism for its plans to limit the publication of "lawful but harmful" speech, which critics argued amounted to a new kind of restriction on otherwise lawful expression. Consequently, the provisions aimed at compelling large digital platforms to delete "legal but harmful" content were dropped in November 2022. Instead, digital platforms must implement mechanisms that let users more effectively screen out "harmful" content they do not wish to view.

The Act gives the Secretary of State broad authority to direct Ofcom, the media regulator, in carrying out its duties, including instructing Ofcom on the content of codes of practice. Concerns have been raised that the government could erode Ofcom's independence and authority by interfering with speech regulation through largely unrestricted emergency-style powers.
