Insights Generative AI: ICO publishes fifth and final consultation on allocating controllership


The Information Commissioner’s Office (“ICO”) has published the fifth and final consultation in its series on generative AI and data protection, focusing on “allocating controllership across the generative AI supply chain”.

The consultation document states that, in the context of generative AI, allocating accountability is not straightforward because of the “different ways in which generative AI models, applications and services are deployed, used and disseminated, [and] also the different levels of control and accountability that participating organisations may have”.

Nevertheless, the ICO stresses that accountability is a central principle of data protection law, both in terms of ensuring that organisations are responsible for complying with data protection law, and also so that they can demonstrate their compliance. As the ICO explains, demonstrating compliance “hinges on the accurate allocation of responsibility between the three roles an organisation can play when processing data”, namely as (1) a controller, (2) a joint controller, and (3) a processor.

The consultation document explains that “allocation of controller, joint controller or processor roles must reflect the actual levels of control and influence for each different processing activity taking place”. However, these designations do not map neatly onto the taxonomy commonly used for generative AI systems, namely that of ‘developers’ and ‘deployers’. The ICO states that it understands that “many players in the market have sought to frame their processing relationships as one of controller and processor, where in fact joint controllership may more accurately reflect the parties’ respective roles for particular processing activities”. As a result, it seeks evidence on the criteria organisations use to identify their role as controller, processor or joint controller, and how they separate out the different processing activities when assessing those roles.

The consultation also considers what it refers to as the ‘generative AI supply chain’, a term aimed at capturing both the development of the AI model and the processing operations necessary to maintain it (the so-called ‘AI lifecycle’), and also a range of further activities such as problem solving, model improvement, or the creation of new applications and services built on top of the model. The ICO states that “in generative AI supply chains, we see increasing interdependencies between various entities making decisions about purposes. These entities should carefully consider the nature of the processing activities and the level of control and influence they can exercise, to ensure they have correctly identified whether they are a controller, joint controller or processor, and therefore the responsibilities that they, and other organisations in the supply chain, have”.

The ICO proceeds to examine a number of scenarios in order to determine whether an entity will likely be deemed a controller, joint controller, or processor. For example, it states that where an organisation is developing a base generative AI model to provide as a product or service, it will be a controller “for a lot of the development-related processing where [it has] influence and control over the purposes and means”. However, if that organisation makes its model available to be deployed by a separate entity, the analysis would change. The ICO provides three possibilities in that scenario:

  • the developer and deployer may be joint controllers, if they are both involved in the decisions regarding how and why each processing operation is designed and executed;
  • the developer may be a processor, if it is acting on the instruction of a third party who is able to influence both the purposes and means of the processing. In practice, this may be possible where a developer provides sufficient information to the deployer about the processing so that the deployer can verify compliance with data protection legislation;
  • the developer may play no role in the processing, as it has merely produced an application, product or service that deployers decide to use in their processing activities. For example, the developer may not have trained the model using personal data. The deployers may be controllers or processors for the processing of personal data by the model, depending on the circumstances.

In the light of this analysis, the ICO seeks “tangible evidence on how organisations have undertaken, or have instructed other entities to undertake, fine-tuning of a generative AI model”. It is also interested in “the data used for the fine-tuning, the allocation of controllership for that processing along with whether or not it was possible to evaluate if that changed the behaviour of the model”.

The consultation then moves on to consider which entities in the supply chain take the “overarching decisions” relating to the nature, scope, context and purpose of the processing so as to be designated a controller, and how that analysis might change both as the model is deployed by a third party and also depending on whether the third parties’ access to the model is, as the ICO puts it, “open” or “closed”. As the ICO explains, in a “closed access model”, the third party deployer could face difficulties in understanding, controlling and influencing the processing of data, and is also unlikely to have sufficient influence or control to change or understand the decisions behind the processing. Similarly, those third party deployers who seek to deploy generative AI for their own purposes as controllers may “in practice face constraints (for example because of lack of in-house expertise or access to the necessary information) in terms of influencing overarching decisions”.

To that end, the consultation states that “the relationship between developers and third-party deployers in the context of generative AI will mean there are often shared objectives and influence from both parties for the processing, which means it is likely to be a joint controllership instead of a processor-controller arrangement…Determining with clarity the different processing activities will help all entities demarcate which processing they are controllers, joint controllers or processors for and justify why. Different processing activities should not be lumped together when they serve different objectives or have distinct data protection risks. For example, search engines built on top of a variety of algorithmic systems or lately LLMs can have different capabilities, functions and risks than ‘traditional’ search engines mainly using ranking systems. Distinct DPIAs may help demarcate the boundaries between them”.

The consultation concludes by providing further ‘Generative AI accountability scenarios’ and urging generative AI developers to “examine joint controllership when considering their relationship with third parties that deploy their models”. As the ICO argues, “joint controllership can be a useful approach for all parties (including data subjects) as it clarifies accountability and can mitigate compliance and reputational risks that could undermine trust in generative AI”.

The consultation is open until 18 September 2024 and can be found here.