
Artificial Intelligence is an exciting and rapidly evolving field with huge potential to transform our lives for the better, but it can also present challenges for professionals working with children and vulnerable adults. In this article, our Health & Prosecutions team looks at the EU's policy approach to these issues.


In the course of our work acting for healthcare providers and, in particular, professionals working with children and vulnerable adults, we are often told of risks and misgivings around AI algorithms and how they direct content to these groups. The European Commission’s Joint Research Centre (JRC) has reported on how these and other concerns might be addressed at a policy level.

The JRC employs scientists to provide independent scientific advice and support to EU policy. In early 2022, it published a report on Artificial Intelligence and the Rights of the Child. The report found that AI-based internet and digital technologies offer many opportunities but, if not properly designed and used, could affect children's rights to protection, participation, education, and privacy. It does not address risks to vulnerable adults, but we think the issues raised are applicable to both groups.

How was the research done?

The researchers used a mix of approaches including:

  1. Examining three AI applications for children through the lens of children’s rights, looking at privacy, possible algorithmic discrimination and lack of fairness
  2. Organising two workshops with children and young people, and three workshops with policymakers and researchers in AI and children’s rights, which revealed that each group prioritised different concerns
  3. Reviewing policy initiatives for AI and children’s rights of major international organisations and noting they identified similar risks and opportunities but had differing goals and priorities

Key requirements

The report found that meeting the key requirements for trustworthy AI will depend on engagement between stakeholders and on addressing knowledge gaps around how children interact with AI.

What makes AI trustworthy?

According to the JRC report, this requires:

  1. Empowering children and their carers to have control over the way their personal data is used by AI tech
  2. Explaining AI systems in child-friendly language and in a transparent manner
  3. Holding AI actors accountable for the proper functioning of the systems
  4. Ensuring the absence of discriminatory biases in the data or algorithms the systems rely on

How do you address this and what does effective engagement mean?

  1. Designing participation approaches at policy level that involve children, researchers, policymakers, industry, parents, and teachers, allowing them to define common goals and to build child-friendly AI by design
  2. Including under-represented groups, which would promote fairness and mitigate discrimination
  3. Creating frameworks and toolkits, incorporating aspects such as personal data protection and risk assessment, which would help guide design and evaluation of child-friendly systems

Knowledge gaps – How do you fix this?

There is limited scientific evidence on the impact of AI on children, and the report also identifies knowledge gaps that need to be addressed, such as:

  1. Research on the impact of AI tech on children’s cognitive and socio-emotional capacities
  2. Schools’ preparation of children for a world transformed by AI technology and the development of their competences and literacy, and
  3. Development of AI-based systems that are appropriate to children’s particular cognitive stage

Next steps

The report will be used to support the implementation of EU strategies such as the EU Strategy on the Rights of the Child, the EU Strategy for a Better Internet for Kids (BIK+), and the proposed EU AI Act.

Conclusion

The challenge for those working with children and vulnerable adults remains twofold: educating themselves and passing that education on to those they work with, and reporting and feeding back their experiences to industry and regulators.

For more information and expert advice, contact a member of our Health & Prosecutions Team.

The content of this article is provided for information purposes only and does not constitute legal or other advice.
