On 21 November 2019, the European Commission published a Report on Liability for Artificial Intelligence and Other Emerging Digital Technologies, which details the findings of its Expert Group on Liability and New Technologies (New Technologies Formation).

The Group considered “whether and to what extent existing liability schemes are adapted to the emerging market realities following the development of new technologies such as artificial intelligence, advanced robotics, the internet of things and cyber security issues”. The Group also analysed the current liability regimes across the Member States and assessed their suitability and adequacy to deal with damage resulting from the use of emerging digital technologies.

In summary, the Group found that while the laws of the Member States ensure at least basic protection of victims, primarily through damages in tort and contract, these laws are not specifically adapted to this dynamic, complex and fast-developing area. This is a result of the technical characteristics of emerging digital technologies, such as complexity, modification through updates, self-learning during operation, limited predictability and vulnerability to cybersecurity threats.

The Group made the following key findings on how liability regimes should be designed and, where necessary, changed to adapt to this evolving area of digital technology:

  1. A person operating a permissible technology that nevertheless carries an increased risk of harm to others, for example AI-driven robots in public spaces, should be subject to strict liability for damage resulting from its operation.
  2. In situations where a service provider ensuring the necessary technical framework has a higher degree of control than the owner or user of an actual product or service equipped with AI, this should be taken into account in determining who primarily operates the technology.
  3. A person using a technology that does not pose an increased risk of harm to others should still be required to abide by duties to properly select, operate, monitor and maintain the technology in use and, failing that, should be liable for breach of these duties if at fault.
  4. A person using a technology which has a certain degree of autonomy should not be less accountable for ensuing harm than if said harm had been caused by a human auxiliary.
  5. Manufacturers of products or digital content incorporating emerging digital technology should be liable for damage caused by defects in their products, even if the defect was caused by changes made to the product under the producer’s control after it had been placed on the market.
  6. For situations exposing third parties to an increased risk of harm, compulsory liability insurance could give victims better access to compensation and protect potential wrongdoers against the risk of liability.
  7. Where a particular technology increases the difficulties of proving the existence of an element of liability beyond what can be reasonably expected, victims should be entitled to facilitation of proof.
  8. Emerging digital technologies should come with logging features, where appropriate in the circumstances, and failure to log, or to provide reasonable access to logged data, should result in a reversal of the burden of proof so as not to adversely affect the victim.
  9. The destruction of the victim’s data should be regarded as damage, compensable under specific conditions.
  10. It is not necessary to give devices or autonomous systems a legal personality, as the harm these may cause can, and should, be attributable to existing persons or bodies.

For more information, contact a member of our Product Regulation team.

The content of this article is provided for information purposes only and does not constitute legal or other advice.
