
Online video-sharing and other platform providers with head offices in Ireland should take note: they may fall under the purview of the proposed Online Safety Act (Act).

Minister Richard Bruton TD announced that he will introduce the Act to improve online safety and ‘ensure that children can be protected online’. The Act also aims to implement changes required under the Audiovisual Media Services Directive (AVMSD). The deadline to implement changes under the AVMSD is 19 September 2020.

The Act would introduce online safety rules applicable to Irish residents. These include the regulation of video-sharing platforms like YouTube and on-demand services like RTE Player, Virgin Media Player and iTunes, as well as minor changes to the regulation of traditional TV.

Expansion of content regulation to all online platforms, not just video-sharing platform services

The revised AVMSD regulates video-sharing platform services, or VSPs, for the first time. The Act proposes to extend these rules beyond VSPs to platforms hosting other forms of user-generated content for Irish residents. User-generated content would include photos, comments and other material which is not audio-visual in nature.

Under the Act, VSPs and other online platforms would have to take measures, enforced through their terms with both users and advertisers, to protect:

  • Minors from potentially harmful content

  • The general public from content containing incitement to violence or hatred, and

  • The general public from content, the distribution of which constitutes a criminal offence under EU law, including provocation to commit a terrorist offence, child sexual abuse material, and content concerning racism and xenophobia

VSPs and online platforms may also be required to:

  • Operate an online safety code, which could be certified as ‘fit for purpose’ by a regulator or required to change
  • Build safety into the design of the platforms through technology and human intervention
  • Provide a transparent and easy-to-use system for users to flag content that potentially breaches the rules
  • Implement user-led content rating systems
  • Provide age verification and parental control systems to users
  • Provide a complaints mechanism

Harmful content

The Minister specifically identified three types of material which could be included in a definition of harmful content:

  • Serious cyber bullying, including content which is seriously threatening, seriously intimidating, seriously harassing or seriously humiliating

  • Material which promotes self-harm or suicide, and

  • Material designed to encourage prolonged nutritional deprivation that would have the effect of exposing a person to risk of death or endangering health

Proposed Online Safety Commissioner

The Act may also introduce a Commissioner with extensive powers, including the power to:

  • Require regular reports from industry

  • Carry out audits

  • Require content takedown within a set timeframe

  • Take enforcement action, including seeking court injunctions

  • Impose administrative fines

  • Publish the fact that a service has failed to comply or co-operate, and

  • Seek that criminal proceedings be brought against a service provider

Changes in regulation of linear and non-linear services - TV versus on-demand services

An additional important change for Ireland is the requirement to more closely monitor the activities of on-demand services like RTE Player, Virgin Media Player and iTunes. The Minister is considering what relationship an on-demand service established in Ireland should have with the Department, and whether the same content rules should apply to both traditional TV and on-demand services.

Proposed regulatory structure

The Act proposes to set up a regulatory structure to oversee the new rules in one of two ways:

  • A multi-person commission akin to the Competition and Consumer Protection Commission, or

  • Two regulatory bodies: one to oversee content under editorial control and the other to regulate online safety

Although the Act may propose to criminalise additional behaviour, the regulator would not have a role in prosecuting individuals for disseminating illegal content.

Consultation period

A short six-week consultation period on the options proposed by the Minister will end at close of business on 15 April 2019. After the consultation period, the Minister will bring a draft heads of bill to Government.

The EU and other jurisdictions

Other jurisdictions are taking similar action to prevent harmful content online, specifically to protect children. The Department cited the establishment of the eSafety Commissioner in Australia and the Harmful Digital Communications Act 2015 in New Zealand as comparisons.

The EU Commission recently established the Expert Group on Safer Internet for Children. The group will develop best practice principles to implement across EU member states to keep children safe when using the internet.

The introduction of the Act is in step with regulatory activity in other jurisdictions.

However, preventing the dissemination of harmful content while also respecting legitimate speech and the traditional immunities afforded to online service providers will take time to work through. The Act should be technology neutral, align with internationally accepted standards for child safety online, and not be reactionary. As the Department itself notes, for a regulator's powers and sanctions to be effective and proportionate, the obligations on service providers, and the bases for those obligations, must be clear. Fast-tracking the legislation to achieve these objectives may not be the best way forward.

Interested parties can make submissions to the consultation process here

For more information on this topic, contact a member of our Technology team.

The content of this article is provided for information purposes only and does not constitute legal or other advice.
