
The Online Safety Code: Q&A

Executive Summary

Coimisiún na Meán (the “Commission”) recently published the final Online Safety Code (the “Code”), which sets out binding rules for video-sharing platform services (“VSPSs”) that have their EU headquarters established in Ireland.

The Code is part of the Commission’s overall Online Safety Framework and follows a public consultation period, which ran from December 2023 to February 2024. The purpose of the Code is to ensure that relevant online services (which include VSPSs) have adequate protections in place for their users pursuant to the Online Safety and Media Regulation Act 2022 (the “Act”) and the EU Audiovisual Media Services Directive, as revised in 2018 (the “AVMS Directive”).

When is the Code applicable?

The Code is divided into two parts. Part A sets out general obligations, which came into effect on 19 November 2024. Part B sets out more prescriptive obligations, which become fully effective on 21 July 2025, affording VSPSs an implementation window of up to nine months to design internal systems to ensure compliance with those obligations.

Who does the Code apply to?

The Code applies to VSPSs, meaning services which allow users to upload and view videos online, that have their EU headquarters established in Ireland. In January 2024, the Commission formally designated 10 VSPSs to which the Code may apply. The full list is available on the Commission’s website. The Commission has so far issued notices determining that the Code applies to nine of the 10 platforms.

What are the obligations?

Part A

Part A of the Code sets out general obligations for VSPSs to protect minors and the general public from certain video content. It requires VSPS providers to take measures, appropriate to the size of the platform and the nature of the service, to protect:

  1. children or minors from programmes, user-generated videos and audiovisual commercial communications which may impair their physical, mental, or moral development;
  2. the general public from programmes, user-generated videos and audiovisual commercial communications containing incitement to violence or hatred against a group or a member of a group based on any of the grounds referred to in Article 21 of the EU Charter of Fundamental Rights (ie, sex, race, colour, ethnic or social origin, genetic features, language, religion or belief, political or any other opinion, membership of a national minority, property, birth, disability, age or sexual orientation); and
  3. the general public from programmes, user-generated videos and audiovisual commercial communications containing content the dissemination of which constitutes an activity which is a criminal offence under Union law, namely public provocation to commit a terrorist offence, offences concerning child sexual abuse material, and offences concerning racism and xenophobia.

The Code provides a list of ten appropriate measures that platforms should implement to protect children and the general public in respect of the content listed at (1) to (3) above. These include: the implementation of terms and conditions which include requirements designed to protect the public from such content; user-friendly mechanisms to report or flag content; the implementation of age verification systems in respect of certain content; and easy-to-use content rating systems allowing users to rate content.

Compliance can be achieved independently of any one specific measure being taken, provided the VSPS can demonstrate that adequate protections are in place. However, the strictest measures should be employed to protect minors from the most harmful content. It will ultimately be for the Commission to determine whether the measures adopted by a VSPS are adequate to deliver the necessary protections, having regard to the size of the VSPS and the nature of the service provided.

Part B

Part B of the Code sets out more detailed obligations for VSPSs to protect minors and the general public from certain content, including (but not limited to) content constituting cyberbullying, the promotion or sharing of methods of self-harm or suicide (including dangerous challenges), the promotion of eating disorders, EU criminal content (child sexual abuse material, terrorism, and racism or xenophobia that promotes hate or violence), and adult-only content.

Among other obligations, Part B requires VSPSs to:

  • ensure terms and conditions contain restrictions which preclude users from uploading or sharing restricted video content or restricted indissociable user-generated content;
  • ensure terms and conditions either preclude users from uploading “adult-only” content or require users who upload such content to rate it as such, in which case the VSPS must implement effective age assurance measures;
  • where appropriate and where users have been given prior warning, suspend users who have infringed the terms and conditions of the platform;
  • provide for parental controls for users under the age of 16 in respect of content which may impair their physical, mental or moral development, and provide information as to how those controls operate;
  • establish and operate transparent and user-friendly mechanisms for users of a VSPS to report or flag infringing content; and
  • establish and operate transparent, easy-to-use and effective procedures for the handling and resolution of complaints made by users in relation to the above measures.

Each VSPS provider is required to report to the Commission every three months on its handling of communications from users raising complaints or other matters.

In addition, and notably from a data protection perspective, VSPS providers must ensure that the personal data of children collected or generated by them when implementing the Code’s obligations relating to age verification and parental controls is not processed for commercial purposes, such as direct marketing or targeted advertising.

Enforcement

The Commission is empowered to impose fines for failure to comply with the Code of up to €20 million or 10% of the VSPS provider’s turnover for the previous financial year, whichever is greater.
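By way of illustration, using a hypothetical figure: a provider with a turnover of €500 million in the previous financial year could face a fine of up to €50 million, since 10% of its turnover exceeds €20 million, whereas for smaller providers the €20 million figure sets the maximum.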

In determining whether a VSPS has failed to comply with the Code and any enforcement steps to be taken, the Commission will have regard to whether compliance with an obligation under the Code is proportionate and practicable for the VSPS, taking into account its size and the nature of the service provided.

DSA interaction

The Code will be enforced alongside the EU Digital Services Act (the “DSA”), which came into full effect on 17 February 2024. While the Code applies to VSPSs, the DSA applies to a wider range of online services, with differing levels of obligations depending on the size and nature of those services. As with the DSA, transparency is a key focus of the Code. Certain areas, including disinformation and recommender systems, are not specifically addressed by the Code but form part of the overall requirements of the DSA.

If you have any queries in relation to this update, please contact Sarah Jayne Hanna, Simon Shinkwin or any member of the Technology and Innovation Group or Competition and Regulation Group.