
How much information needs to be provided to data subjects about automated decisions?
The CJEU has ruled on how data subjects’ access to information about automated decision making should be balanced against the protection of trade secrets.
Earlier this year, we gave an overview of the Advocate General’s Opinion in Case C-203/22 (Dun & Bradstreet Austria) regarding the information that must be provided to data subjects about automated decisions. The CJEU affirmed the Advocate General’s Opinion in its recent judgment, finding that where a data subject requests access to information about an automated decision that the controller considers to be protected as a trade secret, the controller must disclose the allegedly protected information to the competent court or supervisory authority for assessment. In this briefing, we discuss the practical implications of this decision for organisations that use automated decision making processes (including for credit scoring decisions).
Background
A mobile phone operator refused to enter into a contract with a data subject due to insufficient creditworthiness, based on an automated credit assessment by Dun & Bradstreet. The data subject requested “meaningful information” (pursuant to Article 15(1)(h) GDPR) about the logic involved in Dun & Bradstreet’s automated decision. In response, Dun & Bradstreet withheld certain information on the basis that the algorithm involved was protected as a trade secret. The question of what constitutes sufficient “meaningful information” was subsequently referred to the CJEU in the proceedings brought by the data subject.
Article 22(1) GDPR gives individuals the right not to be subject to decisions based solely on automated processing, including profiling, that have legal effects or similarly significant impacts on them, unless an exception under Article 22(2) applies.
In its 2023 judgment in SCHUFA Holding (Scoring), the CJEU held that Article 22(1) should be interpreted broadly, finding that automated credit scoring will constitute an automated decision for the purposes of Article 22(1) GDPR where a third party “draws strongly” on the probability value produced by that scoring to establish, implement or terminate a contractual relationship with a person (as in the present Dun & Bradstreet case). However, that judgment did not determine the scope of data subjects’ right of access to “meaningful information about the logic involved” in automated decision making pursuant to Article 15(1)(h) GDPR.
The Dun & Bradstreet case builds on the SCHUFA decision, with the CJEU considering the extent and level of detail of the information controllers must provide about automated decision making, and how this interacts with controllers’ interests in protecting trade secrets and copyright in software.
Key findings by CJEU
What constitutes “meaningful information”?
The Court held that, in order to meet the requirement of “meaningful information”, controllers must give concise, transparent, intelligible and easily accessible information about the “procedures and principles” applied to obtain the automated decision, in line with the transparency requirements under the GDPR. Such information should be sufficient to enable data subjects to effectively exercise their rights to express their view on the automated decision and to contest it (per Article 22(3) GDPR).
Providing an algorithm will not be sufficient
The CJEU emphasised that mere communication of an algorithm or a “detailed description of all the steps in automated decision making” will not be sufficient. Rather, the rationale or criteria relied on in making the automated decision should be communicated in a simple way that enables the data subject to understand which of their personal data was used and how. In the context of profiling (such as credit scoring), it may be sufficient to explain how “a variation in the personal data taken into account would have led to a different result”. The controller may also be required to provide access to personal data it has created itself, such as the actual credit profile generated about the data subject.
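To make the Court’s “variation” example concrete, the sketch below shows one way such an explanation might be generated. It is purely a hypothetical illustration built on a deliberately simplified weighted scoring model; the feature names, weights and threshold are invented for demonstration and do not reflect Dun & Bradstreet’s (or any real provider’s) methodology. For each item of personal data used, it states what change in that value alone would have led to a different result.

```python
# Purely illustrative: a hypothetical, simplified credit-scoring model.
# All feature names, weights and thresholds below are invented.

WEIGHTS = {
    "years_at_current_address": 4.0,
    "open_credit_lines": -6.0,
    "missed_payments_last_year": -25.0,
}
BASE_SCORE = 70.0
APPROVAL_THRESHOLD = 60.0

def credit_score(applicant: dict) -> float:
    """Hypothetical score: a base value plus weighted items of personal data."""
    return BASE_SCORE + sum(w * applicant[k] for k, w in WEIGHTS.items())

def explain_decision(applicant: dict) -> str:
    """Plain-language, counterfactual-style explanation: for each item of
    personal data used, state what variation in that value alone would
    have changed the outcome."""
    score = credit_score(applicant)
    outcome = "approved" if score >= APPROVAL_THRESHOLD else "declined"
    gap = APPROVAL_THRESHOLD - score  # score change needed to flip the outcome
    lines = [f"Outcome: {outcome} (score {score:.0f}, threshold {APPROVAL_THRESHOLD:.0f})."]
    for feature, weight in WEIGHTS.items():
        delta = gap / weight  # variation in this one feature that flips the decision
        lines.append(
            f"- '{feature}' was {applicant[feature]}; a change of {delta:+.1f} "
            f"in this value alone would have led to a different result."
        )
    return "\n".join(lines)

applicant = {
    "years_at_current_address": 1,
    "open_credit_lines": 3,
    "missed_payments_last_year": 2,
}
print(explain_decision(applicant))
```

The point of the sketch is that the data subject receives an account of which of their personal data was used and how a variation would have mattered, without being handed the raw algorithm or a mathematical formula; the same counterfactual approach can, in principle, be applied to more complex models.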
Trade secrets
The CJEU held that a controller cannot rely on the existence of a trade secret to refuse outright to provide information to a data subject under Article 15(1)(h) GDPR. Where a controller considers that the information to be provided contains trade secrets, it must instead provide the allegedly protected information to the competent court or supervisory authority, which will balance the data subject’s right of access against the controller’s interest in protecting its trade secret. The court or supervisory authority will then determine the extent of the information that must be provided to the data subject.
What does this mean for companies using automated decision making?
Controllers must be prepared to give data subjects an explanation of how automated decisions are made in a clear, accessible format. The data subject must be able to understand from the information which of their personal data was used and in what way. Controllers will not automatically be expected to provide the algorithms involved – and indeed, providing algorithms or complex mathematical formulas alone would be insufficient to meet the controller’s obligations.
For automated decisions used in the context of profiling to assess creditworthiness, controllers should be prepared to explain the extent to which a variation in the personal data taken into account would have led to a different result and to provide access to the data subject’s credit profile.
While the prospect of being required to provide trade or business secrets to a competent court or supervisory authority is concerning, controllers can mitigate this risk by ensuring they are prepared to give data subjects a simple explanation of the processes and personal data involved in an automated decision – avoiding the need to include technical information such as algorithms in the explanation.
The authors would like to thank Emily Birchall for her contribution to this briefing.