
23 February 2023 | 7 minute read

Ireland enacts its Online Safety and Media Regulation Act and appoints its first Media Commission

Change is Here

After many years of debate, Ireland has finally adopted bold new rules on online content which are expected to fundamentally shift how users, both in Ireland and elsewhere, consume, contribute to and complain about a vast range of media – from Netflix to newspapers, from TV to TikTok.

On 10 December 2022, the long-anticipated Online Safety and Media Regulation Act (Act) was signed into Irish law.  This development was closely followed on 17 January 2023 by the nomination of the new Media Commission, including the first Online Safety Commissioner.  The new Act goes beyond mere implementation of the revised Audiovisual Media Services Directive and provides for penalties and enforcement mechanisms on a scale comparable to those under the GDPR.  It is also inevitable that aspects of the Act will overlap with the EU’s Digital Services Act.  Given Ireland’s prominence as a location for tech firms to headquarter within the EU, it is anticipated that the Act will have a broad impact.

In this update, we share our thoughts on the following key changes:

1   Establishment of a Media Commission

2   New regulatory regime for audiovisual on-demand media services and video-sharing platform services

3   Regulation of content for online safety

4   An individual complaints mechanism

 

The Media Commission (Coimisiún na Meán)

The Act creates a new and potentially powerful Media Commission in place of the existing Broadcasting Authority of Ireland.  The Media Commission will be responsible for overseeing the updated regulatory regimes for broadcasting, audiovisual on-demand media services and video-sharing platform services.  It will establish and maintain a register of all audiovisual on-demand media services in Ireland.  Significantly, the Media Commission will have a broad range of investigative and enforcement powers, including:

  • the imposition of a financial sanction of up to EUR 20 million or 10% of turnover;
  • compelling a designated online service to take certain specified actions;
  • removing an on-demand audiovisual media service from the registry of such services; or
  • blocking access to a designated online service in Ireland.

The Irish Government has named four individuals to be appointed to the Media Commission, including the Online Safety Commissioner.  A Commissioner for Digital Services will also be appointed, as the Media Commission has been designated as Ireland’s Digital Services Coordinator for the purposes of implementing and enforcing the EU’s Digital Services Act, which seeks to modernise the e-Commerce Directive in relation to illegal content, transparent advertising and disinformation.

 

Regulation of Video-Sharing Platform Services and Audiovisual On-Demand Media Services

The Act will bring about a major overhaul of media and content regulation in Ireland by amending the Broadcasting Act 2009, implementing the revised Audiovisual Media Services Directive and aligning the regulation of traditional broadcasting services with that of audiovisual on-demand media services.  The new measures under the Act are wide-ranging, affecting not only traditional media broadcasters but also many audiovisual media service and internet service providers, including video-sharing platform services, which previously faced only limited regulatory oversight.

Audiovisual on-demand media service providers will be required to register their services and to comply with binding media codes and rules covering a range of matters. The Media Commission has the power to direct audiovisual on-demand media service providers to register their services, and failure to register may result in administrative financial sanctions.

The Act will bring video-sharing platform services, such as YouTube, within the scope of Irish regulation for the first time.  The Media Commission will have the power to designate a relevant video-sharing platform service, or a category of video-sharing platform services, as a service to which the online safety codes may apply. The Media Commission must ensure that video-sharing platform services take appropriate measures to protect their users and comply with the requirements of the revised Audiovisual Media Services Directive.

 

Online safety regime

A new online safety regime will apply to video-sharing platform services and other online services designated by the Media Commission.  These may range from social media and online gaming to private messaging services. The new regime will focus on the spread and amplification of harmful online content, which is to include:

  • certain content criminalised under Irish criminal law;
  • harmful content that meets a certain risk threshold (to be determined on the balance of probabilities), such as:
  1. cyberbullying/online humiliation;
  2. encouraging or promoting eating disorders;
  3. encouraging, promoting, or making available knowledge of self-harm or suicide;
  • a further category of specific harmful online content arising from the introduction of a new criminal offence of “flashing”.  Online service providers will have to remove content falling within this category upon request.

A binding online safety code will be created to deal with a range of matters, including standards and measures relating to content delivery and moderation, assessments by service providers of the availability of harmful online content, the making of reports and the handling of user complaints.  In addition, there will be non-binding online safety guidance materials and advisory notes in order to foster a safety-first culture of compliance.  Online safety will be regulated and overseen by the newly appointed Online Safety Commissioner.

 

New Individual Complaint Mechanisms

Systemic issues will be brought to the attention of the Media Commission through the creation of a “super-complaints” scheme working in tandem with nominated bodies, including expert NGOs in areas such as child protection.  In 2022, an expert group determined that an individual complaints mechanism was feasible but would require effective resourcing.  For now, the intention is to introduce such a mechanism on a phased basis, prioritising complaints where the online content relates to children (e.g. cyberbullying).  Resourcing will remain an issue and will require further refinement. Notably, the Department intends to cover resourcing costs through a levy on the providers of designated online services.

The expert group recommended that the Media Commission be enabled to (a) handle complaints relating to the non-offence-specific categories of harmful online content (i.e. content such as bullying, humiliation, eating disorders, suicide, self-harm and any other category of harmful online content specified by the Media Commission), and (b) triage complaints relating to the offence-specific categories of harmful online content in co-operation with law enforcement and other relevant bodies.  The report also recommended that the Media Commission be permitted to consider complaints relating to offence-specific content where law enforcement or another relevant body informs the Media Commission that it is not pursuing a complaint referred to it by the Media Commission.

 

Wider Regulatory Agenda

The developments in Ireland reflect similar developments in other EU jurisdictions and the UK, where national laws have been enacted, or are likely to be enacted, to regulate online safety over and above the existing frameworks.  These developments are likely to lead to significant tensions, and even conflict, between content regulation at EU level and at national level, creating a real challenge for organisations (see the related DLA Piper updates below).  It is inevitable that aspects of the Act will overlap with the EU’s Digital Services Act, although the precise nature of that overlap remains to be seen.  In any event, it is clear that there is a shift towards increased regulation of digital content and an acknowledgement, at both national and EU level, that such a shift is overdue.

 

Next steps

The Act has not yet been commenced, but commencement is expected very shortly.  We will be publishing further, more detailed analysis of the Act and of content regulation in Ireland more generally in the coming months.  You may also be interested in our related updates.

Max Mitscherlich helpfully contributed to the drafting of this article.
