
WRC Guidance on the Use of AI to Prepare Submissions

In the recent highly publicised case before the WRC involving Ryanair and a former member of its cabin crew, an Adjudication Officer criticised the complainant’s use of AI in drafting his submissions. The WRC has since published guidance on the use of AI tools in preparing written submissions or documents for use as evidence. Our Employment Law & Benefits team reviews this guidance and provides key takeaways for litigants.


What you need to know

  • Recent caselaw: After criticising the complainant’s use of AI in Fernando Oliveira v Ryanair DAC, the Workplace Relations Commission (WRC) published guidance on the use of AI tools in the preparation of submissions.
  • Associated risks: AI tools are not trained on Irish law or procedures, and they sometimes fabricate caselaw and legislation. In addition, AI-generated arguments may not reflect the realities of Irish law. Importantly from a confidentiality perspective, AI tools can store or use personal data or commercially sensitive information provided to them.
  • Responsibility: The WRC treats all submissions as the relevant party’s own even if AI tools are used in their preparation.
  • Optional disclosure: The guidance note discusses the option of disclosing the use of AI in drafting submissions as a voluntary measure.
  • Misuse and consequences: If submissions are found to contain incorrect or misleading information, it may affect their admissibility, cause delays, undermine the party’s arguments and/or affect their credibility.

Recent caselaw

In the recent highly publicised case of Fernando Oliveira v Ryanair DAC[1], an Adjudication Officer emphasised the duty that each party to WRC proceedings has to:

“…ensure that their submissions are relevant and accurate and do not set out to mislead either the other party or the Adjudication Officer.”

The complainant in this case had used an Artificial Intelligence tool (AI tool) to draft his submissions and this led to inclusion of “citations that were not relevant, mis-quoted and in many instances, non-existent.”

The WRC has since published guidance on the use of AI tools in preparing written submissions or documents for use as evidence in employment and equality law cases.

Associated risks

  • AI tools are not trained on Irish employment and equality law or WRC procedures. Therefore, while it is acknowledged in the guidance note that AI tools may prove helpful in drafting documents, it warns that they should not be relied on as legal advice.
  • AI tools sometimes fabricate caselaw, legislation or WRC decisions that do not exist. In Fernando Oliveira v Ryanair DAC, at least two of the citations relied on, specifically ADJ-00039821 and ADJ-00040112, appeared to be AI hallucinations, as they did not refer to reported WRC caselaw.
  • AI-generated arguments may sound confident but may not reflect the realities of Irish employment or equality law.
  • AI tools may store or use personal data or commercially sensitive information provided to them.
  • The guidance document emphasises that the WRC treats all submissions as the party’s own even if AI tools are used in their preparation. This means that if any legal information is incorrect or misleading, this is taken to be the responsibility of the party who is relying on the submissions.

Optional disclosure of AI use

The guidance note discusses the option of disclosing the use of AI in drafting submissions as a voluntary measure. It recommends the use of the following statement:

“Parts of this submission were drafted using an AI writing tool. I have reviewed and confirmed the accuracy of all content.”

While this disclosure statement is optional, it promotes transparency and assists the WRC in understanding how a submission was prepared.

Misuse and consequences

In the case of Erdogan v Workplace Relations Commission[2], Mr Justice Simons stated that:

“an Adjudication Officer is entitled to ensure that a hearing progresses expeditiously by asking parties to confine themselves to the relevant issues”.

If submissions are found to contain incorrect or misleading information, whether produced by AI or otherwise, an Adjudication Officer may refuse to admit them as evidence.

Inaccuracy or irrelevance may also cause delays in the case. In Fernando Oliveira v Ryanair DAC, the Adjudication Officer noted that a considerable amount of Ryanair’s and the Adjudication Officer’s time was “wasted” trying to establish whether the citations provided by the complainant were legitimate or not.

In serious cases, the inclusion of misleading information may undermine the arguments of the party and affect their credibility.

Key takeaways for litigants

  1. Understand the content: Do not include material in your submissions that you do not fully understand, or cannot explain or stand over if questioned.
  2. Data sensitivity: Avoid inputting sensitive personal or commercial data into public AI tools.
  3. Legal strategy: AI cannot assess the specific strengths or weaknesses of a case and should not be relied upon for legal strategy.
  4. Use AI as an assistant, not a lawyer: AI can be helpful for organising thoughts and even producing first drafts of documents, but it is not a substitute for legal advice.
  5. Legal advice: Seek specific, expert legal advice to avoid the major risks associated with the use of AI in drafting submissions.

Comment

While AI tools can be used to support litigation preparation, they are not a substitute for understanding the law and applying it accurately. For expert assistance and advice in the areas of employment and equality law, please contact our Employment Law & Benefits team.

The content of this article is provided for information purposes only and does not constitute legal or other advice.

[1] ADJ-00055225.

[2] [2021] IEHC 348.
