Insights

Challenging the logic behind the UKGC’s requirement on operators to manually review automated processes identifying significant indicators of harm

The UKGC has recently published its guidance for SR Code 3.4.3. You can read our Betting & Gaming team’s thoughts on the guidance and the code generally here.

One aspect that operators may well be wondering about is how SR Code 3.4.3 (11) (Requirement 11) ties in with data protection, particularly given the UKGC’s view, set out in its accompanying guidance, that the requirement is consistent with data protection requirements. This assurance is perhaps intended to provide comfort to operators; however, it is difficult to take much comfort from the statement when it appears that the UKGC has misinterpreted the data protection provision from which Requirement 11 borrows so heavily: Article 22.

Article 22 of the GDPR relates to automated decision-making. In the UKGC’s defence, it is widely acknowledged that Article 22 is not clearly drafted. However, this does not help operators, who will now have to understand and implement exactly what Requirement 11 is asking of them and then ensure that the processes put in place comply with the GDPR.

In essence, if an operator undertakes automated decision-making (that is, decisions about individuals made solely by automated means, with no human intervention) and such decisions have a legal effect or a similarly significant effect on the individual, then this will only be permitted where:

  • the individual has given their consent;
  • the decision is necessary for the performance of a contract with the individual; or
  • the decision is authorised by law.

Where the decision is permitted under the first two grounds (consent or contract), the individual has the right to contest the decision and the right to have the decision subject to human intervention. It has never been entirely clear what is meant by a ‘legal effect’ or ‘similarly significant effect’, but the European Data Protection Board gives the following examples: denial of housing benefits; denial of citizenship; denial of credit; denial of employment; and cancellation of a contract. It is worth noting that Article 22 also, as a sort of cumbersome afterthought, extends its scope to profiling through the nifty use of parentheses.

Requirement 11 of SR Code 3.4.3 clearly envisages that operators implement automated processes early in players’ journeys, and it is difficult to imagine how these processes could identify significant indicators of harm without a degree of profiling. Article 22 should therefore be at the forefront of operators’ minds (it certainly will be in the minds of their DPOs).

The key step for operators will be to identify exactly how the automated processes (and any associated profiling) will be implemented. This may be done in-house, but operators may also turn to large data holders with expertise in credit analysis of individuals. Whichever path an operator chooses, it will be vital to conduct a data protection impact assessment, which should highlight and consider (inter alia): the suitability of the chosen provider; data minimisation; data security; transparency; and the appropriate lawful basis. It will also be important for the impact assessment to consider whether the consequential measures operators propose where significant indicators of harm have been identified will have a ‘legal effect’ or ‘similarly significant effect’ on players.

In addition to a data protection impact assessment, operators will also need to consider updating their privacy notices and data subject access request (DSAR) response forms to satisfy the GDPR’s requirement to provide relevant information about Article 22 decision-making / profiling, as well as “meaningful information” about the logic involved and the consequences of such processing.

Before we leave you, however, there are some curiosities when looking at Requirement 11 alongside Article 22, which we would be remiss not to mention.

Automated decision-making under the GDPR only exists where there is no human intervention. It is therefore interesting that the UKGC appears to indicate that all applied automated processes must be manually reviewed in respect of each player. It is also curious that the UKGC specifically calls out the right for individuals to contest the automated process, because: (a) Requirement 11 already requires the automated process to be manually reviewed (query how this is true automation); and (b) players do not have a right to contest automated decisions under Article 22 where the processing is authorised by law, as is the case with Requirement 11.

To summarise our view, if an operator does implement automated processes to identify significant indicators of harm, that operator will need to do the following in respect of data protection:

  • Identify a lawful basis (necessary to comply with a legal obligation will work here).
  • Update privacy notice to inform individuals of the automated process, the consequence of the process and the logic behind the decision-making.
  • Update DSAR response forms to notify individuals of the automated process, the consequence of the process and the logic behind the decision-making.
  • Carry out a data protection impact assessment if any automated decision-making is being put in place, particularly if it is being outsourced to a third party.

As discussed above, there are several contradictions between Article 22 and Requirement 11 and, in our view, the two do not fully align. We would therefore advise against taking the UKGC’s assertion that Requirement 11 is compliant with data protection at face value.