
What data protection considerations are there when procuring, developing and deploying AI systems?

By Liz Smith, associate in the commercial team at independent UK law firm Burges Salmon

In the rapidly evolving landscape of AI technology, data protection remains a crucial area of concern for businesses. Here we summarise some of the key data protection considerations for businesses procuring, developing or deploying AI systems…

  • Purpose and lawful basis: Whenever personal data is processed within the AI value chain, whether the business is developing, deploying or procuring an AI system, there must be an appropriate lawful basis and such personal data must only be processed for the stated purpose.
  • Role: It is important to identify at an early stage the role of the business under data protection legislation (Data Controller, Data Processor or Joint Controller) in order to understand the applicable obligations. The role of the business is likely to change based on where it sits in the AI value chain and whether it is developing and deploying an AI model or procuring one. Where the business acts as a data controller and is procuring an AI model, it needs to be clear on what personal data is being processed by the supplier and for what purpose (for example, is personal data being used to train the model?).
  • Security: It is important to ensure appropriate levels of security against unauthorised or unlawful processing, accidental loss, destruction or damage. As AI is rapidly developing, the security risks are also changing at pace. Most businesses will likely procure an AI system rather than develop one in house. The integration of the AI system into the wider IT structure, reliance on third-party software and the intricacy of the AI value chain add an extra degree of complexity, which is likely to increase security risks. This complexity can make it more difficult to identify and manage security risks and to flow down and monitor compliance with security policies, so it is important that businesses undertake robust due diligence when engaging suppliers and pay special attention to the specific risks AI systems pose to their business. Given this is a rapidly developing area, businesses should actively monitor and take into account state-of-the-art security practices.
  • Data Protection Impact Assessments (DPIA): A DPIA is a critical process for organisations using AI to ensure that personal data is handled lawfully and transparently. A DPIA must be completed before an AI system is deployed if the processing is likely to result in a high risk to individuals. The meaning of ‘likely to result in high risk’ is not defined in the UK GDPR, but a key point to note is that it is the purpose of the DPIA to ascertain whether the processing is high risk, so whether or not a DPIA is required should be determined by an assessment of the potential for high risk. Some processing of personal data (for example, large-scale use of sensitive data) will always require a DPIA.
  • Transparency: Businesses must be transparent about their use of AI, providing clear information to individuals about how their data is being used and for what purposes. The ICO’s guidance focuses on ensuring AI systems are “explainable” to data subjects and emphasises the need to tailor explanations to affected individuals. Businesses should consider whether updates are required to their data protection policy and privacy notices to meet this requirement.
  • Automated decision making: The use of automated decision making which has legal or similarly significant effects on an individual triggers specific legal requirements under data protection legislation. If the decision affects an individual’s legal entitlements or their ability to obtain funding or secure a job, it is likely to fall within the scope of these requirements. Businesses can only carry out this type of decision making where the decision is:
  1. necessary for the entry into or performance of a contract;
  2. authorised by law that applies to the business; or
  3. based on the individual’s explicit consent.

If this is the case, the law requires businesses to give individuals specific information about the process (the logic involved, the significance and the envisaged consequences). Businesses will need to introduce methods for any relevant individuals to request human intervention or challenge a decision which impacts them, and will need to carry out regular checks to ensure the systems are working as intended.
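As a purely illustrative sketch of what this can look like in practice (the `AutomatedDecision` record, its fields and the review methods below are assumptions for the example, not anything prescribed by the legislation), a business might log each solely automated decision together with the information the individual must be given, and route any challenge to a human reviewer:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AutomatedDecision:
    """Record of a solely automated decision, kept so the individual can be
    given the required information and can ask for human intervention."""
    subject_id: str
    outcome: str              # e.g. "loan application declined"
    logic_summary: str        # plain-language description of the logic involved
    consequences: str         # the envisaged consequences for the individual
    made_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    human_review_requested: bool = False
    reviewer_outcome: Optional[str] = None

    def request_human_review(self) -> None:
        """Called when the individual asks for human intervention."""
        self.human_review_requested = True

    def record_human_review(self, reviewer_outcome: str) -> None:
        """A human reviewer records their own decision, which may differ."""
        self.reviewer_outcome = reviewer_outcome

# Hypothetical example: a declined application is challenged and reviewed by a person.
decision = AutomatedDecision(
    subject_id="applicant-123",
    outcome="loan application declined",
    logic_summary="Score below threshold, driven mainly by income-to-debt ratio",
    consequences="The applicant cannot access this credit product",
)
decision.request_human_review()
decision.record_human_review("Outcome overturned after manual assessment")
```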

  • Bias and discrimination: If left unchecked, AI systems can inadvertently lead to bias and discrimination. Bias can arise from the contents of the training data or the way it has been labelled by humans. Deploying an AI system with underlying bias increases the risk of it making a discriminatory decision, especially in the context of hiring, promotions or performance assessments. This exposes the business to claims of discrimination, could disrupt the day-to-day operations of the business if an AI system needs to be removed or fixed to resolve the issue, and may lead to internal or external reputational damage. Businesses deploying AI systems will benefit from testing the decisions made by the AI system for different groups to assess whether the outcomes are acceptable (a simple check of this kind is sketched after this list). It may also be appropriate to carry out or request an audit of the underlying data used, to obtain a clear understanding of how the AI system has been trained.
  • Supply chain due diligence: The majority of businesses will procure AI systems from a third party, and businesses that develop AI may obtain training data from an external source. Wherever a business sits within the AI value chain, carrying out due diligence checks on any third-party providers to ensure compliance with data protection legislation is key.
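As an illustration of the group-level testing mentioned under “Bias and discrimination”, the sketch below is a minimal, hypothetical example, assuming the business can export the AI system’s decisions together with a relevant characteristic for each individual; the function names and sample data are invented for the example. It compares the rate of favourable outcomes across groups and computes a simple disparity ratio.

```python
from collections import defaultdict

def favourable_outcome_rates(decisions):
    """Rate of favourable outcomes per group.

    `decisions` is an iterable of (group, favourable) pairs, for example
    ("female", True) for a candidate the system shortlisted.
    """
    totals = defaultdict(int)
    favourable = defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        if outcome:
            favourable[group] += 1
    return {group: favourable[group] / totals[group] for group in totals}

def disparity_ratio(rates):
    """Ratio of the lowest group rate to the highest (1.0 means parity)."""
    return min(rates.values()) / max(rates.values())

# Hypothetical shortlisting decisions exported from an AI hiring tool.
sample = [("female", True), ("female", False), ("female", True),
          ("male", True), ("male", True), ("male", True)]

rates = favourable_outcome_rates(sample)
print(rates)                   # e.g. {'female': 0.66..., 'male': 1.0}
print(disparity_ratio(rates))  # a ratio well below 1.0 may warrant further review
```

In practice such checks would be run on much larger samples and across several characteristics, and the results treated as a prompt for further investigation rather than a definitive finding.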

