Key Points

  • Since approval of the European Union General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that a ‘right to explanation’ of all decisions made by automated or artificially intelligent algorithmic systems will be legally mandated by the GDPR once it comes into force in 2018.

  • However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive meaningful, but properly limited, information (Articles 13–15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’.

  • The ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects.

  • These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless.

  • We propose a number of legislative steps that, if implemented, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

Introduction

In recent months, researchers,1 government bodies,2 and the media3 have claimed that a ‘right to explanation’ of decisions made by automated and artificially intelligent algorithmic systems is legally mandated by the forthcoming European Union General Data Protection Regulation4 2016/679 (GDPR). The right to explanation is viewed as a promising mechanism in the broader pursuit, by government and industry, of accountability and transparency in algorithms, artificial intelligence, robotics, and other automated systems.5 Automated systems can have many unintended and unexpected effects.6 Public assessment of the extent and source of these problems is often difficult,7 owing to the use of complex and opaque algorithmic mechanisms.8 The alleged right to explanation would require data controllers to explain how such mechanisms reach decisions. Significant hype has been mounting over the empowering effects of such a legally enforceable right for data subjects, and over its potential to disrupt data-intensive industries, which would be forced to explain how complex and perhaps inscrutable automated methods work in practice.

However, there are several reasons to doubt the existence, scope, and feasibility of a ‘right to explanation’ of automated decisions. In this article, we examine the legal status of the ‘right to explanation’ in the GDPR, and identify several barriers undermining its implementation. We argue that the GDPR does not, in its current form, implement a right to explanation, but rather what we term a limited ‘right to be informed’. The article proceeds as follows.

In section ‘What is meant by a right to explanation?’, we disentangle the types and timing of explanations that can be offered of automated decision-making. The right to explanation, as popularly proposed, is thought to grant an explanation of specific automated decisions, after such a decision has been made.9

In section ‘Why there is no ‘right to explanation’ in the GDPR’, we assess three possible legal bases for a right to explanation in the GDPR: the right not to be subject to automated decision-making and safeguards enacted thereof (Article 22 and Recital 71); notification duties of data controllers (Articles 13–14 and Recitals 60–62); and the right to access (Article 15 and Recital 63).

The aforementioned claim for a right to explanation10 muddles the first and second legal bases. It conflates (i) legally binding requirements of Article 22 and non-binding provisions of Recital 71 and (ii) notification duties (Articles 13–14) that require data subjects to be provided with information about "the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject" (emphasis added).

Having challenged the legal basis for a right to explanation, we then consider whether the right of access in Article 15 provides a stronger legal basis. Following our analysis of the implementation and jurisprudence of the 1995 Data Protection Directive (95/46/EC), we argue that the GDPR’s right of access allows for a limited right to explanation of the functionality of automated decision-making systems—what we refer to as the ‘right to be informed’. However, the right of access does not establish a right to explanation of specific automated decisions of the type currently imagined elsewhere in public discourse. Not only is a right to explanation of specific decisions not granted by the GDPR, it also appears to have been intentionally not adopted in the final text of the GDPR after appearing in an earlier draft.

In section ‘What if a right to explanation were granted?’, we consider the limitations of scope and applicability, if a right to explanation were to exist. We show that a ‘general’ right to explanation, applicable to all automated decisions, would not exist even if Recital 71 were legally binding. A right to explanation, derived from the right of access (Article 15) or safeguards described in Article 22(3), would only apply to a narrow range of decisions ‘solely based on automated processing’ and with ‘legal’ or ‘similarly significant’ effects for the data subject (Article 22(1) GDPR). We examine the limited cases in which the right would apply, including the impact of a critical ambiguity of language that allows the broader ‘right not to be subject to automated decision-making’ (Article 22 GDPR) to be interpreted either as a prohibition, or right to object.

The last section concludes the article with recommendations for a number of legislative and policy steps that, if implemented, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

What is meant by a right to explanation?

Before examining whether the GDPR specifies a right to explanation, it is necessary to examine what one may mean by an ‘explanation’ of automated decision-making. Two kinds of explanations may be in question, depending on whether one refers to:

  • system functionality, ie the logic, significance, envisaged consequences, and general functionality of an automated decision-making system, eg the system’s requirements specification, decision trees, pre-defined models, criteria, and classification structures; or to

  • specific decisions, ie the rationale, reasons, and individual circumstances of a specific automated decision, eg the weighting of features, machine-defined case-specific decision rules, information about reference or profile groups.11

Furthermore, one can also distinguish between explanations in terms of their timing in relation to the decision-making process:

  • an ex ante explanation occurs prior to automated decision-making taking place. Note that an ex ante explanation can logically address only system functionality, as the rationale of a specific decision cannot be known before the decision is made;

  • an ex post explanation occurs after an automated decision has taken place. Note that an ex post explanation can address both system functionality and the rationale of a specific decision.

An example may help clarify how these distinctions interact. Take an automated credit scoring system. Prior to a decision being made (ex ante), the system provider can inform the data subject about the system functionality, including the general logic (such as the types of data and features considered, or the categories in the decision tree), purpose or significance (in this case, to assign a credit score), and envisaged consequences (eg the credit score can be used by lenders to assess creditworthiness, affecting the terms of credit such as the interest rate). After a decision has been made (ex post), an explanation of system functionality can still be provided to the data subject. However, the provider can also explain to the data subject the logic and individual circumstances of her specific decision, such as her credit score, the data or features that were considered in her particular case, and their weighting within the decision tree or model. In other words, the provider can explain how a particular score was assigned. Further, when pre-defined simple or linear models are used and fully disclosed, predictions about the rationale of a specific decision are possible in principle ex ante. However, in both cases the provider’s ability to offer an explanation of the rationale of a specific decision may be limited by several legal (see section ‘What if a right to explanation were granted?’) and technical factors, including the use of complex probabilistic analytics and decision-making methods.12
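To make the distinction more tangible, consider a minimal sketch in Python of such a credit scoring system. All feature names, thresholds, and score values are hypothetical, chosen purely for illustration; the point is only that an explanation of system functionality can be produced at any time, while an explanation of a specific decision presupposes a concrete case.

```python
# A toy rule-based credit scorer; features, thresholds, and score values
# are hypothetical and for illustration only.

FEATURES = ["income", "years_employed", "existing_debt"]

def explain_system() -> str:
    # Ex ante explanation of system functionality: available before
    # (and after) any decision is made.
    return ("Inputs considered: " + ", ".join(FEATURES) +
            ". Output: a credit score used by lenders to set terms "
            "of credit such as the interest rate.")

def score(applicant: dict) -> int:
    # The decision logic itself (a stand-in for a decision tree or model).
    s = 50
    if applicant["income"] > 40_000:
        s += 20
    if applicant["years_employed"] >= 3:
        s += 15
    if applicant["existing_debt"] > 10_000:
        s -= 25
    return s

def explain_decision(applicant: dict) -> str:
    # Ex post explanation of a specific decision: the score actually
    # assigned and the case-specific factors behind it. Logically
    # possible only once a decision has been made.
    reasons = []
    if applicant["income"] > 40_000:
        reasons.append("income above 40,000 (+20)")
    if applicant["years_employed"] >= 3:
        reasons.append("3+ years employed (+15)")
    if applicant["existing_debt"] > 10_000:
        reasons.append("existing debt above 10,000 (-25)")
    return f"Score {score(applicant)}; factors: {'; '.join(reasons) or 'none'}."

applicant = {"income": 52_000, "years_employed": 1, "existing_debt": 12_000}
print(explain_system())             # possible ex ante and ex post
print(explain_decision(applicant))  # possible ex post only
```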

These distinctions between two kinds and two different timings of explanations are implicit in the GDPR. Their importance will be highlighted as we examine the possible legal bases for a right to explanation.

Why there is no ‘right to explanation’ in the GDPR

Three distinct possible legal bases for a right to explanation of automated decision-making can be found in the GDPR. A right to explanation can possibly be derived from: safeguards against automated decision-making as required under Article 22(3), and commented upon by Recital 71; notification duties under Articles 13–14 commented upon by Recitals 60–62; or the right of access under Article 15, and commented upon by Recital 63.

These bases are respectively referred to as a right to explanation derived from (i) safeguards, (ii) notification duties, and (iii) the right of access. We will assess each in turn. On the whole, the claim that a right is granted by the GDPR to an ex post explanation of specific decisions (at a minimum) that seemingly applies to any instance of automated decision-making is based on a combination of safeguards and notification duties. It combines non-binding Recital 71 with binding provisions of Articles 13–14 and 22 to argue that "The law will […] effectively create a "right to explanation," whereby a user can ask for an explanation of an algorithmic decision that was made about them."13 This claim is incorrect for several reasons, explained below.

A right to explanation derived from safeguards against automated decision-making

Starting with the claim14 for a right to explanation derived from safeguards, Article 22 (see Figure 1) and Recital 71 of the GDPR address a data subject’s right not to be subject to automated decision-making. Article 22(3), which addresses safeguards against automated decision-making, states that:

the data controller shall implement suitable measures to safeguard the data subject's rights and freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision. (emphasis added)

Figure 1. Article 22 of the General Data Protection Regulation.

Critically, a right to explanation is not mentioned. Rather, after a decision has been made, and assuming the decision meets a condition specified in Article 22(2)a (necessary for entering into or performing a contract) or Article 22(2)c (based on explicit consent), data subjects are granted additional safeguards to obtain human intervention, express views, or contest a decision (Article 22(3)), but not to obtain an explanation of the decision reached.

In the entire GDPR, a right to explanation is only explicitly mentioned in Recital 71, which states that a person who has been subject to automated decision-making:

should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision. (emphasis added)

If legally binding, this provision would require an ex post explanation of specific decisions, as Recital 71 addresses safeguards to be in place once a decision has been reached. To show why Recital 71 does not establish a legally binding right, a brief aside into the legal status of Recitals is required.

Recitals provide guidance15 on how to interpret the Articles, but are not themselves legally binding.16 As Klimas and Vaiciukaite explain, "Recitals have no positive operation of their own" and "cannot cause legitimate expectations to arise."17 Baratta further expands:

In principle the ECJ does not give effect to recitals that are drafted in normative terms. Recitals can help to explain the purpose and intent behind a normative instrument. They can also be taken into account to resolve ambiguities in the legislative provisions to which they relate, but they do not have any autonomous legal effect.18

Jurisprudence of the European Court of Justice (ECJ) shows that the role of Recitals is to resolve ambiguity in the operative text of a framework. The ECJ has commented directly on the legal status of Recitals, clarifying that: "Whilst a recital in the preamble to a regulation may cast light on the interpretation to be given to a legal rule, it cannot in itself constitute such a rule."19

Returning to the GDPR, Article 22(3) lists the minimum requirements that have to be met for lawful automated decision-making. There are no ambiguities in the language that would require further interpretation with regard to the minimum requirements that must be met by data controllers. As long as these requirements are met, automated decision-making is lawful and in compliance with the GDPR. With this said, future jurisprudence (see section ‘What if a right to explanation were granted?’) can still interpret the meaning of ‘suitable measures to safeguard’, and establish mandatory or case-by-case requirements to be met by data controllers, including a right to explanation. This is, however, only one possible future. A right to explanation is thus not currently legally mandated by the requirements set in Article 22(3).

In addition, rights have to be explicitly legally established prior to their enforcement. This idea stems from the relationship between legal rights and duties. The scope of a right can be subject to interpretation; a legal basis for its existence must, however, first be beyond doubt. Rights of data subjects typically correspond with a duty on the side of the data controller.20 Failure to meet legal duties can be punished through fines and other procedures. It would be highly controversial to impose fines on data controllers without having previously clarified explicitly and beyond doubt what duties must be met. Doing otherwise would conflict with the principles of fair trial (Article 6 of the European Convention on Human Rights and Article 47 of the Charter of Fundamental Rights of the European Union) and the rule of law.21 Criminal and administrative procedures have to be laid down precisely.

It can be concluded that data subjects will not be granted a legally binding ex post right to explanation of specific automated decisions on the basis of the safeguards in Article 22 as it currently stands. This does not appear to be the result of an oversight or a matter of subtle interpretation (eg the meaning of ‘suitable measures to safeguard’ in Article 22(3)). On the contrary, the omission of a right to explanation from Article 22 appears to be intentional. The safeguards specified in Recital 71 are almost identical to those in Article 22(3), with the significant difference of the further inclusion of a right ‘to obtain an explanation of the decision reached after such assessment’ in Recital 71. The omission of this text from Article 22 suggests that legislators did not intend to implement a right to explanation of specific decisions in the GDPR. What happened?

Looking at previous drafts of the GDPR and commentary from the trilogue negotiations,22 one can see that earlier drafts contained stricter safeguards on automated decision-making and profiling, including a legally binding right to explanation of specific decisions, but that these were eventually dropped.23 An early indication of the debate around the right to explanation can be seen in the November 2013 report of the European Parliament (EP)24 and the December 2014 report of the European Council in response to the original GDPR text proposed by the European Commission (EC)25 in 2012.

The EC’s proposed text did not contain a right to explanation. The EP proposed the following amendment to Article 20 (now Article 22 in the adopted version of the GDPR), paragraph 5:

Profiling which leads to measures producing legal effects concerning the data subject or does similarly significantly affect the interests, rights or freedoms of the concerned data subject shall not be based solely or predominantly on automated processing and shall include human assessment, including an explanation of the decision reached after such an assessment. The suitable measures to safeguard the data subject's legitimate interests referred to in paragraph 2 shall include the right to obtain human assessment and an explanation of the decision reached after such assessment …. (emphasis added)

The EP’s preferred text mandated a ‘right to obtain human assessment and an explanation of the decision reached after such assessment’. These safeguards would have been part of Article 20, meaning that they would have been legally binding. However, the proposed safeguards were not adopted in trilogue. This change suggests that legislators intentionally chose to make the right to explanation non-binding by placing it in Recital 71.

The European Council’s 2014 draft,26 on the other hand, only required that:

the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, such as the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision. (emphasis added)

The Council suggested adding the text:

to express his or her point of view, to get an explanation of the decision reached after such assessment and the right to contest the decision, (emphasis added)

to Recital 58 (equivalent to Recital 71 GDPR).27 The Council thus suggested placing the right to explanation from the EP’s draft in a Recital. This approach was eventually taken in the final text adopted in 2016.

Interestingly, despite years of negotiations, the final wording of the GDPR concerning protections against profiling and automated decision-making hardly changed from the relevant Articles and Recitals of the Data Protection Directive 1995. As with the GDPR, a ‘right to explanation’ does not appear in Article 15 of the Directive (see Figure 2), which addresses automated individual decisions.

Figure 2. Article 15 of the 1995 Directive.

Although Article 22 GDPR has not greatly changed from Article 15 of the Directive, a few changes are still noteworthy. First, the only safeguard against automated decision-making mentioned in the Directive is the opportunity to express one’s views. Article 22(3) additionally names contesting the decision and the right to obtain human intervention as suitable measures. Secondly, explicit consent is included as a case in which automated decision-making is allowed (Article 22(2)c). Finally, as opposed to the provisions in Article 15 of the Directive, it is no longer necessary that the data subject requests the contract in order for automated decision-making to be lawful.

A right to explanation derived from notification duties

Articles 13 and 14 GDPR specify notification duties for data controllers concerning the processing of data collected from the data subject (Article 13) or from a third party (Article 14). In the aforementioned claim, these Articles are cited as a basis for a right to an ex post explanation of specific decisions. The claim starts with Articles 13(2) and 14(2), which state that data controllers need to:

provide the data subject with the following information necessary to ensure fair and transparent processing.

According to Articles 13(2)f and 14(2)g, this information includes:

the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject. (emphasis added)

This duty applies in cases of automated processing meeting the requirements of Article 22(1) or 22(4) (more on this later).

It has been suggested that the notification duties in Articles 13–14, in combination with the safeguards defined in Article 22(3), grant an ex post right to explanation of the ‘existence of … logic involved … significance … and envisaged consequences’ of automated decision-making.28 This claim is mistaken for two reasons.

First, only an ex ante explanation of system functionality is explicitly required by Articles 13(2)f and 14(2)g. These notification duties precede decision-making. Notification occurs before a decision is made, at the point when data are collected for processing. This holds true even if Article 14 introduces some ambiguities when data are collected from third parties rather than data subjects (insofar as the controller needs only to notify the data subject within a reasonable period, at the latest one month after obtaining the data). As explained in section ‘What is meant by a right to explanation?’, only an explanation of system functionality is logically possible prior to decision-making. Therefore, Articles 13–14 cannot be used as evidence of an ex post right to explanation of specific decisions that can logically only be given once a decision has been made (timeline problem).29

Secondly, the claim links Articles 13(2)f and 14(2)g to the safeguards in Article 22(3). This link is not made in the GDPR. Articles 13(2)f and 14(2)g apply only to Articles 22(1) and 22(4), which do not address safeguards against automated decision-making. The supposed link—between notification about the logic involved, significance, and envisaged consequences of automated decision-making in Articles 13–14, and the ex post right to explanation incorrectly attributed to Article 22(3) (which only features in Recital 71)—is therefore untenable and can be dismissed. The claim also conflates the legally binding notification duties, specified in Articles 13–14, and the non-binding right, specified in Recital 71.

It follows that the claim for an ex post right to explanation of specific decisions30 is not correct. Any suggestion to the contrary fails to distinguish between (i) the legally binding duty to notify the data subject of the logic involved, significance, and envisaged consequences of an automated decision-making system before decision-making occurs (timeline problem) (Articles 13–14), and (ii) the data subject’s non-binding right to an explanation of specific decisions (Recital 71) after decision-making occurs.

The language used in Articles 13(2)f and 14(2)g also supports the interpretation that only an ex ante explanation is required. Data controllers must inform the data subject about the

existence of automated decision-making, including profiling … [and provide data subjects with] meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing.

The language used suggests that data subjects must be provided with information about how an automated decision-making system works in general, for which purposes, and with what predicted impact, before automated decisions are made. Notably, this cannot include any information about how a specific decision was made or reached, but rather addresses how the system itself functions, eg its decision tree or rules, or predictions about how inputs will be processed. For fully disclosed simple or linear models, this may show how specific decisions would be reached in the future.31
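To illustrate the caveat about fully disclosed simple models, the following sketch (with hypothetical weights and threshold) shows how complete ex ante disclosure of a linear scoring rule lets anyone compute how any future specific decision would be reached, before the controller makes it.

```python
# Hypothetical fully disclosed linear scoring rule: weights, bias, and
# threshold are all public, so outcomes are predictable ex ante.

WEIGHTS = {"income": 0.6, "existing_debt": -0.4}
BIAS, THRESHOLD = 10.0, 50.0

def predicted_decision(applicant: dict) -> bool:
    # With the full model disclosed, anyone can reproduce the decision
    # for any input before the controller ever makes it.
    s = BIAS + sum(WEIGHTS[f] * applicant[f] for f in WEIGHTS)
    return s >= THRESHOLD

# A prospective applicant checks the outcome in advance:
print(predicted_decision({"income": 90, "existing_debt": 30}))
# True: 10 + 0.6*90 - 0.4*30 = 52 >= 50
```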

A right to explanation derived from the right of access

In contrast to prior claims (see ’Introduction’ section), it may also be possible to derive a right to explanation from the right of access established in Article 15 GDPR. Article 15(1)h is identical to Articles 13(2)f and 14(2)g: data subjects are granted a right to be informed about the existence of automated decision-making and to obtain meaningful information about the significance, envisaged consequences, and logic involved. Specifically, the subject should be informed about the existence, purposes, and logic of data processing, and the intentions and legal consequences of such processing. By having this information, the data subject should be able to examine the lawfulness of data processing and invoke legal remedies.32

Together, Articles 13–15 form what has been called the ‘Magna Carta’ of data subjects’ rights to obtain information about the data held about them, and to scrutinize the legitimacy of data processing.33 Articles 13–14 create notification duties for data controllers, while Article 15 establishes a corresponding right of access for data subjects.34 In contrast to the notification duties of data controllers in Articles 13–14, the right of access has to be invoked by the data subject. The Articles form a unit, insofar as they provide the data subject access to identical information and use the same language.

Although seemingly insignificant, the change from a notification duty to an access right has important consequences for the timing of explanations required from the data controller. Given that the phrasing of Article 15(1)h is identical to Articles 13(2)f and 14(2)g, one could assume that the right of access similarly only grants access to an ex ante explanation of system functionality. However, the right of access is dependent upon the request of the data subject and has no deadline; the ‘timeline problem’ of Articles 13(2)f and 14(2)g does not apply. At first glance, the data subject can request this information at any time, including after an automated decision has been made, making an ex post explanation of the rationale of specific decisions plausible.

Nonetheless, it is reasonable to doubt that the right of access grants a right to ex post explanations of specific decisions already reached. Consider the semantics of Article 15(1)h. The phrase ‘envisaged consequences’ is future oriented, suggesting that the data controller must inform the data subject of possible consequences of the automated decision-making before such processing occurs. This interpretation follows the timeline constraints of the identical provisions in Articles 13(2)f and 14(2)g discussed above, which only allow for ex ante explanations. Data controllers are required to predict the possible consequences of their automated decision-making methods. The term ‘envisaged’ limits these predictions to ex ante explanations of system functionality, for instance, concerning the general purpose of the system, or the type of impact to be expected from the type of decision it makes. For example, a credit agency could predict that the scores it produces will affect creditworthiness assessments (eg interest rates). If applied to decisions already made, the phrasing becomes incoherent.35 It would seem to require data controllers to predict the personal consequences of decision-making for individual data subjects after an automated decision has been made, including how the decision could be used by other data controllers and processors.

The semantics of the German translation of Article 15(1)h GDPR provides further support. The German Article 15(1)h states:

Tragweite und angestrebten Auswirkungen einer derartigen Verarbeitung für die betroffene Person. (emphasis added)

This sentence translates to ‘the scope and intended consequences of such processing for the person concerned’ (authors’ translation, emphasis added). This indicates that the data controller must inform the data subject about the consequences the controller wishes to achieve with automated decision-making. According to this phrasing, the data controller is not asked to predict consequences but rather explain the scope, intention, and the purpose of such processing. This suggests that the right of access is not addressing how an individual decision was reached, but rather the duty of the data controller to provide information about the existence, aims and consequences of such processing. This equates to an explanation of system functionality.36

There are similar reasons to doubt that Article 15(1)h grants an ex post right to explanation of specific decisions. Data controllers are required to provide information about the ‘existence of automated decision-making’ (emphasis added). This phrase does not suggest an explanation of how a decision was reached. Rather, the data controller is only required to inform the data subject that automated decision-making methods are being used to process her data.

The phrasing of Article 15(1)h, as with Articles 13–14, points to an explanation of system functionality. However, data controllers are also required to provide ‘meaningful information about the logic involved’ in automated decision-making. As noted in section ‘A right to explanation derived from notification duties’, this phrase, as used in Articles 13–14, has been argued by others to grant an ex post right to explanation. If correct, Article 15(1)h would grant a right to explanation of specific decisions, not only system functionality, as the data subject can request the relevant information both before and after a decision has been made. However, there are further reasons to doubt that this is the case.

For Article 15(1)h to be coherent as a whole, ‘meaningful information about the logic involved’ must be interpreted in connection with the other terms (existence of, meaningful information about significance and envisaged consequences) used in Article 15(1)h, which are limited to explanations of system functionality. Interpreting ‘logic involved’ to grant an ex post explanation of specific decisions would render the other terms of Article 15(1)h incoherent if the right of access were invoked after a decision was made. This interpretation is further supported by a comparison of the language used in Article 15(1)h and Recital 71. Data controllers are obligated to provide information about the

existence of automated decision-making … meaningful information about the logic involved, as well as the significance and envisaged consequences of such processing (Article 15(1)h), [as opposed to] an explanation of the decision reached (Recital 71).

The phrasing of Article 15(1)h is future oriented, and appears to refer to the existence and planned scope of decision-making itself, rather than to the circumstances of a specific decision as suggested in Recital 71. If an explanation of specific automated decisions were intended to be granted by Article 15(1)h, as in Recital 71, the use of different language between the two would be odd.

Nevertheless, given the lack of an explicit deadline for invoking the right of access, one cannot be certain, on the basis of semantics alone, that the right of access is limited to explanations of system functionality. Despite this, we argue that, as with notification duties in Articles 13–14, and regardless of when it is invoked by the data subject, the GDPR’s right of access only grants an explanation of automated decision-making addressing system functionality, not the rationale and circumstances of specific decisions. This conclusion is supported by implementation of the 1995 Directive’s right of access by Member States, which has mostly limited informational obligations to system functionality. If interpretation of the GDPR follows historical precedent, its right of access will be similarly limited. To articulate this claim further, it is necessary to examine in detail Member State implementations and interpretations of the Directive’s right of access.

Right of access in the 1995 Data Protection Directive 95/46/EC

It is important to note that a right of access that grants data subjects some explanation of automated decision-making is not new, and has not proven an effective transparency mechanism.37 Rather, this right has existed since the 1995 Data Protection Directive, and has been implemented in national law by most European Member States.38 Similar to the scope of the GDPR’s right of access, the Directive’s right of access provides means for data subjects to discover whether a controller is processing personal data. If so, the data subject is then entitled to know the extent of data being processed. This is meant to enable the data subject to scrutinize what data are used and take appropriate action, such as requesting rectification or erasure.39 Notably, the Directive’s right of access has generally not been interpreted as granting a right to explanation of specific decisions already reached, as it is not among the safeguards applicable at the time automated decisions are made (Article 15(2)a of the Directive); this distinction is comparable to the difference between Articles 15 and 22 of the GDPR. The Directive names only one safeguard against automated decision-making, namely the right for the data subject to ‘put his point of view’. A right to explanation of specific decisions as a safeguard to ensure lawful automated decision-making was not envisaged.

The implementation and interpretation of the Directive’s right of access varied across the Member States. Despite much debate,40 consensus has not emerged concerning the type of information data controllers must disclose to provide data subjects with ‘knowledge of the logic involved in any automatic processing of data’ per Article 12(a).41 A report published in 2010 on the implementation of the Directive across Member States suggested that it was left to the Member States to define the scope and requirements of the right of access. The report urged clarification of the requirements and limitations on the right of access concerning information about the ‘logic involved’, owing to the growing importance of automated decisions.42 In part, the lack of consensus over the meaning and requirements of ‘logic involved’ owes to the relative lack of jurisprudence on the right of access. Despite the Directive having been in force for over 20 years, the requirements and limitations of the right of access applied to automated decision-making have not been extensively clarified or tested in courts across Europe.43

The limited jurisprudence available reveals limitations on the Directive’s right of access. Several overriding interests and exceptions have been identified that significantly limit both the scope of applicability and the content of the explanation. In general, data subjects are entitled to receive some information about the general functionality of an automated decision-making system, but little to no information about the rationale or circumstances of a specific decision. The 2010 report reflects this, noting that the language used in the Directive allows only a very narrow scope of applicability for the right of access, owing to a number of exceptions and limiting or overriding interests.44 Recital 41 of the Directive clarifies that the right of access can be limited by trade secrets and intellectual property, especially relating to software.45 These interests have proven strong limiting factors on the right of access as implemented and tested by Member States.

Several examples can be offered. French data protection law46 grants data subjects a right to receive information about the ‘logic involved’ as long as it does not contravene copyright regulations. To allow data subjects to challenge decisions, information must be provided about the general logic and types of data taken into account, ‘but not (or at least not fully) of the weight that is attached’ to specific features.47 The full code of the automated decision-making system or algorithm does not need to be revealed.48 A similar approach is taken in the UK’s Data Protection Act 1998, which also limits the right of access to protect trade secrets.49 As with French law, data controllers

must inform data subjects of the factors which they take into account in the "evaluation" underlying the decision, but without having to reveal the exact weight given to each of these factors (i.e. the copyright-protected algorithm used in the automated decision-taking process).50

German data protection law has similarly recognized a distinction between explanations of system functionality and specific automated decisions in section 6(a) of the Bundesdatenschutzgesetz, which jointly implements Articles 12 (right of access) and 15 (safeguards for automated individual decisions) of the Directive.51 Notably, Germany implemented a right allowing data subjects to request an explanation of automated decisions that are not made in their favour. The right is implemented as an explicit safeguard against automated decisions in section 6(a)2(2) of the Bundesdatenschutzgesetz. The right was voluntarily enacted as a safeguard beyond the requirements set in Article 15 of the Directive, which grants the right to express views as the only safeguard against automated individual decisions. Interestingly, the right to an explanation as an extra safeguard provides some insight into how the Directive’s requirement to explain the ‘logic involved’ was interpreted by German legislators. Section 6(a)3 of the German Data Protection Act separately extends the right of access enshrined in sections 19 and 34, allowing data subjects to obtain information about the ‘logical structure’ of automated processing, which refers back to Article 12(a) of the Directive.52 If ‘knowledge of logic involved’ in Article 12(a) was intended to establish a right to obtain an explanation about decisions reached, it would not have been necessary for German legislators to enact separately a right to explanation (section 6(a)2(2)) in the same Article containing the extended right of access (section 6(a)3), especially considering both rights must be invoked by the data subject. Even if one wishes to argue that sections 6(a)2(2) and 6(a)3 refer to the same type of explanation (ie of specific decisions), the use of different wording across the sections—‘main reasons for the decision and have it explained’ in section 6(a)2(2), ‘logical structure of the automated processing of the data that concerns [the data subject]’ (authors’ translation) in section 6(a)3—suggests that the two mechanisms entitle the data subject to different types of information.53

Following this, German legal commentary and jurisprudence54 addressing the extended right of access (section 6(a)3) suggest that the information it requires is limited mostly to system functionality. The data controller does not need to disclose the software used, as the software is considered to be a trade secret.55 Some German commentators believe that some (or the ‘top four’) features factored into a decision have to be disclosed, but not the algorithm used, due to trade secrets.56 Data controllers are not obligated to explain how the software is working or, especially, to give any details about its code. The data controller is only obligated to explain the logic of the ‘decision tree’. The ‘weighting’ (authors’ translation) of specific features and the parameters used to make the decision do not have to be disclosed. This is meant to protect trade secrets and to prevent manipulation of the decision-making system.57
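The disclosure boundary described by these commentators can be pictured schematically. In the following sketch (with hypothetical features, weights, and labels; an illustration of one reading, not a statement of legal requirements), an access request returns the features considered and the general decision logic, while the weighting and scoring formula remain withheld as trade secrets.

```python
# Schematic only: which parts of a scoring model a controller might
# disclose under this reading, versus withhold as trade secrets.
# All names and values are hypothetical.

model = {
    "features": ["income", "existing_debt", "payment_history"],
    "logic": "features feed a decision tree; score >= threshold -> approve",
    "weights": {"income": 0.53, "existing_debt": -0.31, "payment_history": 0.16},
    "formula": "proprietary scoring function",
}

DISCLOSABLE = {"features", "logic"}   # system functionality
WITHHELD = {"weights", "formula"}     # trade secrets / anti-gaming

def access_request(model: dict) -> dict:
    # What a data subject would receive: the logic of the 'decision
    # tree' and the features considered, but not weights or formula.
    return {k: v for k, v in model.items() if k in DISCLOSABLE}

print(access_request(model))
```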

This interpretation of the right of access as being limited to system functionality in order not to contravene trade secrets is also reflected in German jurisprudence. According to several commentators,58 the German SCHUFA59 judgments60 show that data subjects do not have a right to investigate fully the accuracy of automated processing systems (in this case, credit scoring), as the underlying formulas are protected as trade secrets. The protected formula would consist of, for example, statistical values, weighting of certain elements to calculate probabilities (eg the likelihood of loan repayment), and reference or comparison groups.

The judgments indicate that all three elements of the right of access enshrined in Article 12(a) of the Directive aim to provide general information about the usage and purpose of data processing. Concrete elements of the screening procedures do not have to be disclosed.61 The data subject is entitled to know which data and features were taken into account when the decision was made, in order to be able to contest the decision or demand that inaccurate or incomplete data be rectified. However, the weighting of these elements, the method (scoring formula), the statistical values, and the information about the reference groups62 used do not have to be disclosed.63 The judgments state that jurisprudence, academic literature, and legal commentary commonly agree that the abstract methods used to define credit scores do not have to be disclosed, and that this position is in accordance with the intention of German data protection legislation.64

It is worth noting that the SCHUFA judgments do not explicitly address automated decision-making, as the court decided an automated decision was not made because automated processing was only used for preparation of evidence, while the actual decision was made by a human being.65 The judgments are nonetheless insightful insofar as they demonstrate a strong tendency to protect trade secrets in relation to the right of access. As discussed below, this case provides an example of an important limitation on a right to explanation established on any of the three legal bases in the GDPR identified above. Automated decision-making is defined in both the Directive and GDPR as decision-making based solely on automated processes.66 Quite crucially, this creates a loophole whereby even nominal involvement of a human in the decision-making process allows an otherwise automated mechanism to avoid invoking elements of the right of access (in both the Directive and GDPR) addressing automated decisions.

Finally, Austrian legislators similarly implemented the requirements of Articles 12(a) and 15 of the Directive in section 49(3)67 of the Austrian Data Protection Act. As opposed to German law, the right to obtain an explanation of how an individual decision was reached was not implemented as a safeguard. Only the right to express one’s view is named as a mandatory safeguard, as mandated by Article 15 of the Directive. Section 49(3) establishes an extended right of access (section 26), namely the data subject’s right to know, upon request, about the logic of the process of automated decision-making.68

Austrian jurisprudence69 is very vague on the right of access and automated decision-making. Existing decisions do not fully explain how much the data controller is obligated to disclose under the right of access, and are in some sense contradictory. In most decisions, an obligation was recognized to explain how the system in question functions.70 In contrast, one decision stated that the right of access according to section 26 and the right to know about the logic of the process (section 49(3)) also include the criteria and the weighting of the criteria, which would allow the data subject to understand how a decision was reached. However, the Austrian Data Protection Commission simultaneously acknowledged that trade secrets can limit this right. The Commission concluded that the extent to which the data controller needs to disclose decision criteria and weighting must be determined on a case by case basis.71 In another case, the Commission denied the existence of an individual automated decision because the criteria used were based on a large group rather than on the individual. The rights of access and to know about the logic of automated processing therefore do not apply if the basis of the decision is a group (‘peer group’ [authors’ translation]) rather than (data about) the individual.72 This distinction highlights a tension in the definition of automated decision-making and profiling in the Directive, insofar as automated processing of data describing groups, rather than individuals, does not allow for invocation of the right of access.73

From the Directive to the GDPR: the right to be informed

The Directive’s right of access provides an explanation of system functionality that has been heavily limited by trade secrets. The loophole—through which automated processes that merely produce evidence for decision-making (rather than actually making decisions) are not subject to the right of access (specifically, the provision to disclose information about the ‘logic involved’)—has also proven to be a significant limiting factor. A relative lack of jurisprudence across Member States has not helped clarify and unify the requirements. This is problematic given the current and expected growth in automated decision-making and data processing.

The GDPR appears to offer less protection to data subjects concerning explanations of automated decision-making than some current data protection laws in Europe based on the Directive.74 In particular, the GDPR’s right of access appears not to offer more protection for data subjects’ interests than the Directive’s right of access.75 The use of future-oriented semantics in the GDPR (unlike the Directive, which did not explicitly acknowledge a decision-making timeline), as well as its terminological overlap with notification duties, suggests that the GDPR intends to further limit the right of access regarding automated decision-making to explanations of system functionality.76 The phrasing of Article 15 GDPR in particular points towards a general explanation of the existence and functionality of automated decision-making systems. Article 12(a) of the Directive grants data subjects a right to obtain ‘knowledge of the logic involved in any automatic processing of data concerning him at least in the case of the automated decisions referred to in Article 15(1)’.77 This phrase is open to greater interpretation than Article 15 GDPR,78 which requires only information about ‘the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject’. As argued above, the GDPR’s phrasing requires merely that the data subject be informed about the usage and functionality of automated decision-making methods. This change of wording suggests an even stronger intention to limit the right of access to explanations of system functionality, not the rationale and circumstances of specific decisions.

Legal scholars are already debating the scope of the right of access in the GDPR. According to German commentary79 on the GDPR, it is sufficient to be informed about the envisaged consequences in a very simple manner. For instance, an explanation of how a low rating of creditworthiness can affect the choice of payment options would be sufficient.80 The type of explanation recognized in prior German jurisprudence81 and German commentary82 on the GDPR is limited by overriding interests of the data controller, eg protection of trade secrets, or prevention of ‘gaming the system’ by users. The process that the algorithms use does not have to be disclosed.83 Furthermore, the rating of similar groups has historically not needed to be disclosed.84

These recent commentaries on the GDPR follow the general interpretation and prior jurisprudence on the right of access in the 1995 Directive. According to commentators, data controllers do not need to explain fully the rationale and circumstances of a specific decision to provide data subjects with ‘meaningful information about the logic involved’ (Article 15(1)h GDPR). Rather, the information offered by data controllers will address general system functionality, and could be heavily curtailed to protect the controller’s interests (eg trade secrets, intellectual property; see Recital 63).85 It is worth noting that additional limitations can also be imposed to protect the interests of other parties via Union or Member State law.86 Paal also notes that the purpose of Article 15 GDPR is to allow data subjects to be informed about the usage and functionality of automated decision-making. As the scope of information data controllers are required to disclose under Article 15 is the same as under Article 13, Article 15 similarly requires only limited information about the functionality of the automated decision-making system. Paal further notes that ‘meaningful information’ does not create an obligation to disclose the algorithm, but only to provide basic information about its logic. Schmidt-Wudy argues that, if necessary to assess the accuracy of data processing, information about the algorithm could be given, with appropriate limitations to protect trade secrets.87 However, the type of information to be provided is not specified.

As with the Directive, the practical requirements and utility of the GDPR’s right of access will only be revealed through testing and clarification via jurisprudence and expert opinion, such as from the Article 29 Working Party, the new European Data Protection Board established by Article 68 GDPR,88 the European Data Protection Supervisor, or its Ethics Advisory Group89 (see ‘Conclusion’ section). However, the implementation of the Directive’s right of access strongly suggests that the GDPR’s right of access will be far from the ex post ‘general’ right to explanation of system functionality and specific decisions which, we have argued, is mistakenly attributed to the GDPR. Rather, through the right of access, the GDPR will grant a ‘right to be informed’ about the existence of automated decision-making and system functionality, limited in applicability along the lines above and those described in the following section.

What if a right to explanation were granted?

Although a meaningful right to explanation of specific automated decisions will not be introduced by the GDPR, the contribution of such a right to the accountability and transparency of automated decision-making may provide compelling reasons for legislators or data controllers to introduce one in the future. It is possible to envisage at least four main scenarios that may lead to a right to explanation of specific automated decisions in practice:

  • An additional legal requirement is enacted by Member States, separate from the GDPR, granting a right of explanation of specific decisions (similar to actions taken by German legislators under the 1995 Directive) (see also ‘Conclusion’ section).

  • Based on GDPR Article 22 and Recital 71, data controllers voluntarily choose to offer a right to explanation of specific decisions as a ‘suitable … safeguard’. The right would be an additional and voluntary safeguard to those already required by Article 22(3). Controllers could do this on the basis that an explanation is required to invoke one of the three legally required Article 22(3) safeguards, ie to express views, obtain human intervention, or contest a decision.

  • Future jurisprudence broadly interprets the safeguards against automated decision-making (Article 22(3)) to establish a right to explanation of specific decisions. This could occur, for example, on the basis that an explanation of the rationale of an automated decision is required in order to contest it or express views. Future guidelines of the European Data Protection Board could support this interpretation.

  • Future jurisprudence establishes that the right of access (Article 15 GDPR) provides a basis for explanations of specific automated decisions, as a requirement to provide information about the ‘existence of … logic involved … significance … [or] envisaged consequences’ of automated decision-making (Article 15(1)h). This interpretation could also be supported in future guidelines of the European Data Protection Board.

Of these scenarios, the third and fourth seem to be the most plausible at the moment. Concerning the third, Article 22(3) guarantees that human intervention is available for automated decisions rendered in fulfilment of a contract or with explicit consent (see below). On this basis, one may argue that, although it is certainly not explicit in the phrasing of Article 22(3), the right to obtain human intervention, express views or contest a decision is meaningless if the data subject cannot understand how the contested decision was taken. The right to contest has already been interpreted by Member States, in enacting the 1995 Directive, as merely a right to force a controller to make a new decision. This interpretation is found in the UK Data Protection Act 1998 (section 12(2)b): subjects can demand that a new decision be made, albeit without any way to assess the reliability of the old decision. A broad reading of Article 22(3), according to which an explanation is required to contest a decision, would strengthen the right to contest. In this case, the argument for a right to explanation of specific decisions could be further buttressed by drawing on the rights to fair trial and effective remedy enshrined in Articles 6 and 13 of the European Convention on Human Rights and Article 47 of the Charter of Fundamental Rights of the European Union. Without an explanation of how the algorithm works, both rights are hard to enforce, because the decisions/evidence used will be impossible to contest in court.90

Concerning the fourth option, implementation of the right of access in the 1995 Directive has shown the need for interpretation of vague provisions by Member States and national courts. As noted above, consensus has not emerged over the meaning or requirements implied for data controllers when explaining the ‘logic involved’ in automated individual decisions. Austrian jurisprudence has demonstrated that the scope of ‘logic involved’ is sufficiently broad to require that some elements of the rationale or circumstances of a specific decision be explained along with system functionality, albeit severely limited by data controllers’ interests (eg trade secrets). Despite aiming to unify data protection law across the Member States, the GDPR’s right of access will need to be similarly interpreted and tested. Given that the reference to ‘logic involved’ occurs in both the Directive and GDPR, it is plausible (but unlikely) that future legal interpretation of the right of access could establish a right to explanation of specific decisions.

Limitations on a right to explanation derived from the right of access (Article 15) or safeguards against automated decision-making (Article 22(3))

Assuming one or indeed a combination of the previous four scenarios occurs, and hence that a right to explanation of specific decisions is granted, other provisions in the GDPR may still limit its scope significantly. A ‘general’ right to explanation as proposed elsewhere (see ‘Introduction’ section), seemingly applicable to all types of automated processing, would not exist. A primary limitation is the narrow definition of automated decision-making in Article 22(1),91 which covers only:

a decision based solely on automated processing, including profiling, which produces legal effects concerning [the data subject] or similarly significantly affects him or her.

An automated process must meet this definition for Articles 15(1)h (right of access) or 22(3) to apply, and thus for a future right to explanation established on either basis to be invoked.

Automated decision-making must have ‘legal’ or ‘similarly significant’ effects, and the decision must be based ‘solely’ on automated processing (Article 22(1)). The latter requirement opens a loophole whereby any human involvement in a decision-making process could mean it is not ‘automated decision-making’.92 While the required level of human involvement is not clear in practice, the word ‘solely’ suggests that even nominal human involvement may be sufficient to escape the definition. There is still uncertainty as to whether the usage of automated processing for the preparation of a decision ultimately acted upon by a human constitutes a decision ‘solely based on automated processing’, if the human does not interfere with, verify, or modify the decision or decision-making rationale.93 Preparation of evidence for a decision, and making the decision itself, are not necessarily equivalent acts.94 Martini believes that automated processing of data for ‘assistance to make a decision’ or ‘preparation of a decision’ is not within the scope of Article 22.95 Decisions based predominantly on automated processes, but with nominal human involvement, would thus not invoke Article 15(1)h (right of access) or Article 22(3) (safeguards against automated decision-making), and so would not require an explanation of system functionality or the rationale of specific decisions, assuming that such a right to explanation of specific decisions was established on either basis.96

Interpretation of Article 15 of the Directive, which was also limited to decisions ‘based solely on automated data processing’, does not provide clarification. The strict reading of ‘solely’ by Martini was reflected in the SCHUFA judgements already discussed (see section ‘Right of access in the 1995 Data Protection Directive 95/46/EC’). In contrast, Bygrave argues that a relative notion of ‘solely’ is required for the phrase to be meaningful. According to this position, decisions formally attributed to humans, but originating ‘from an automated data-processing operation the result of which is not actively assessed by either that person or other persons before being formalised as a decision’, would fall under the scope of ‘automated decision-making’.97 It is not clear how this provision in the GDPR will be interpreted in the future.

The scope of data processing to which Article 22 (and Recital 71) applies was narrowed in the adopted version of the GDPR compared to prior drafts. The phrase ‘a decision based solely on automated processing’ proved a point of contention between the EC and EP drafts. Article 20(5) of the EP’s proposed amendments98 to the EC’s draft99 adds the word ‘predominantly’ to the measures to which the Article would apply (‘Profiling which leads to measures producing legal effects concerning the data subject or does similarly significantly affect the interests, rights or freedoms of the concerned data subject shall not be based solely or predominantly on automated processing and shall include human assessment …’ (emphasis added)). The EP thus wanted to restrict automated decisions on a broader basis than the EC, ie decisions based predominantly, and not only solely, on automated processes. As ‘predominantly’ was not adopted in the final text of the GDPR, it would appear the strict reading of ‘solely’ was intended.

Questions can also be raised over what constitutes ‘legal effects’ or ‘similarly significant effects’100 required for Article 22 to apply. Recital 71 provides some guidance, as it describes certain situations of significance, eg automatic refusal of an online credit application and e-recruiting practices. Where a decision has no legal or similarly significant effect, Article 22 does not apply. For an automated decision to have legal effects on the data subject, it would need to affect their legal status.101 Since in most cases the data subject has no legal right to be hired or to be approved for a credit application, cases of being denied an interview or credit by an automated process would not fall under these categories.102 Admittedly, such cases could be considered to have ‘similarly significant’ effects. However, the term ‘similarly significant’ is itself vague and requires interpretation; significance varies with the perception of the data subject (the effect of receiving a rejection letter will depend on the economic situation of the data subject, for instance), whereas impacts on legal status can be determined according to the letter of the law.103 Further, in practice it may burden the data subject to prove that processing affects them significantly.104 Alternatively, an external standard for what constitutes significant effects could be defined.

As these constraints demonstrate, the definition of automated decision-making in Article 22(1) significantly narrows the scope of any future right to explanation. Automated decision-making that does not meet the definition provided in Article 22(1) would not be constrained by the provisions of Article 22, or by the additional measures required as part of notification duties (Articles 13(2)f and 14(2)g) or the right of access (Article 15(1)h), including information regarding the ‘logic involved’ (see section ‘A right to explanation derived from the right of access’). A right to explanation implemented through any of the four paths specified above would similarly not apply, restricting the right’s potential applicability to the narrow range of cases meeting all of the requirements in Article 22(1) discussed in this section.

A further factor would constrain the information offered as part of an explanation. As indicated in the discussion of the right of access in the 1995 Directive, any future right to explanation would likely also be limited by overriding interests of the data controller. Recital 63 of the GDPR similarly establishes that the right of access should not infringe upon the rights and freedoms of others, including data controllers. The right can be limited for the sake of trade secrets or intellectual property rights, especially regarding copyright of software. As with the right of access itself, the specific disclosure requirements of Recital 63 require interpretation.105 The Recital notes that:

the result of those considerations should not be a refusal to provide all information to the data subject.

Jurisprudence and legal commentary concerning the Directive’s right of access (see section ‘Right of access in the 1995 Data Protection Directive 95/46/EC’) suggest that the balance between the data subject’s right of access and data controllers’ rights and freedoms will require limited disclosures of the ‘logic involved’ in automated decision-making, primarily concerning system functionality rather than the rationale and circumstances of specific decisions.

Limitations exclusive to a right to explanation derived from safeguards against automated decision-making (Article 22(3))

In addition to the above limitations on a future right to explanation, a number of further limitations are exclusive to a right derived from Article 22(3). In the first instance, Article 22(2) states three conditions that, if met by an automated decision-making process, cause Article 22(1) not to apply:

(a) is necessary for entering into, or performance of, a contract between the data subject and a data controller;

(b) is authorised by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject's rights and freedoms and legitimate interests; or

(c) is based on the data subject's explicit consent.

Article 22(3) specifies that the safeguards (ie the rights to human intervention, expression, and contest) only apply when automated decision-making meets Article 22(2)a or c. The scope of any future right to explanation enacted in relation to the safeguards specified in Article 22(3) is therefore limited to cases meeting clause (a) or (c), ie those necessary for entering into or performing a contract,106 or those with the subject’s explicit consent. It is worth noting that the safeguards in Article 22(3) do not apply when a decision is made in accordance with Union or Member State law (Article 22(2)b). In the latter case, explicit and specific safeguards are not described. Rather, ‘suitable measures to safeguard the data subject’ must be laid down in the relevant Union or Member State law. This clause potentially excludes a significant range of cases of automated decision-making from the safeguards in Article 22(3) and any right to explanation derived therefrom. German commentary on the GDPR has suggested that the ‘suitable measures’ called for in Article 22(2)b do not include disclosure of the algorithm used, due to the risk posed to trade secrets; however, measures to minimize and correct discrimination and biases should be implemented.107

The exemption for automated decisions related to contracts raises a further limitation. Article 22 does not define when automated decision-making is ‘necessary’ for entering into or performing a contract, which runs the risk of ‘necessity’ being defined solely by the data controller. Additionally, it is important to note that Article 22(2)a envisions a situation different from explicit consent (which is listed as a separate exception in Article 22(2)c). Legislators were contemplating a situation where data controllers make automated decisions that are necessary for a contract, but without seeking consent first; had consent been intended as a requirement, the consent exception in Article 22(2)c alone would have sufficed. This structure suggests that there can be situations in which the data subject does not consent to an automated decision and, apart from the general notification requirements and right of access in Articles 13–15, does not know about the decision. Data controllers are therefore allowed to decide that automated decision-making is necessary for contractual obligations, while the data subject is unable to object to it. In this case, the data subject retains the rights to contest, express views, or obtain human intervention for a decision reached under Article 22(3), but not to object to the decision being made in the first place.

Two interpretations of Article 22

Several other restrictions on Article 22(3), and on any future right to explanation derived therefrom, depend upon whether Article 22 is interpreted as a prohibition or a right to object. Article 22(1) GDPR states that ‘the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her’. Due to its language (‘a right not to’), Article 22(1) can be interpreted in two ways: as a prohibition108 or as a right to object to automated decision-making. The two interpretations offer very different protection to the interests of data subjects and data controllers.

The first interpretation reads Article 22(1) as a prohibition, meaning that data controllers would be obligated not to engage in automated decision-making prior to showing that a condition in Article 22(2)a-c is met. The second interpretation reads Article 22(1) as establishing for data subjects a right to object to automated decision-making, which will not apply if one of the requirements in Article 22(2)a-c is met. These interpretations are differentiated by whether action is required by the data subject to restrict automated decision-making. The action in question, a formal objection by the data subject, requires both awareness of the existence of automated decision-making and a willingness to intercede, both of which demand intentional effort on the part of the data subject.

Notably, this ambiguity has existed since the Data Protection Directive 1995.109 The wording of Article 15 of the Directive110 allowed the ‘right not to be subject to an automated decision’ referred to in Section 1 (‘Member States shall grant the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing …’ (emphasis added)) to be interpreted as a prohibition or a right to object.111 The ambiguity led Member States to implement this right and associated protections differently.

Article 15 of the Directive has been implemented by Austria, Belgium, Germany, Finland, the Netherlands, Portugal, Sweden and Ireland as a general prohibition,112 with some exceptions. The UK has a different model: data subjects are entitled to request that no automated decision is made about them, but not in the case of so-called ‘exempt decisions’. In cases where data subjects have not lodged such a request, data controllers have to inform them about the fact that an automated decision has been made as well as about the outcome.113

Due to the similarities of language and content between Article 15 of the Directive and Article 22 GDPR, the varying implementation of Article 15 as a prohibition or right to object by Member States supports the interpretation that Article 22 is ambiguous and can be read as a prohibition or right to object. Resolving the ambiguity prior to 2018 is critical, as the two interpretations have very different consequences for data subjects and data controllers.

Impact of the interpretation of Article 22 on a right to explanation

If Article 22 is interpreted as a prohibition, data controllers will not be allowed to make automated decisions about a data subject until one of the three requirements specified in Article 22(2) (necessary to enter or to perform a contract, authorized by law, or explicit consent) is met. Data subjects do not need to act to prevent automated decision-making, but are rather protected by default. Supervisory Authorities would shoulder the burden of enforcing Article 22 by ensuring automated decision-making is carried out legally, and could levy penalties and fines in cases of illegal decision-making. Data controllers, when making automated decisions under Article 22(2)a or c, would need to enact safeguards as specified in Article 22(3). As explained above (see section ‘What if a right to explanation were granted?’), these safeguards could be voluntarily or legally extended to include a right to explanation.

If Article 22 is interpreted as a right to object, automated decision-making is restricted only in cases where the data subject actively objects. When an objection is entered, decision-making must be shown to meet Article 22(2)a-c. For automated decisions that meet a requirement of Article 22(2), the data subject cannot object. However, when Article 22(2)a or c is met (meaning the decision is made under contract or with consent), the safeguards specified in Article 22(3) would also apply. In these cases, the data subject would be able to request human intervention, express her views, and contest the decision and, if enacted in the future, demand a right to explanation (see section ‘A right to explanation derived from safeguards against automated decision-making’). Critically, if Article 22 grants a right to object, automated decision-making is legally unchallenged by default, even if it does not meet any of the requirements set out in Article 22(2), so long as the data subject does not enter an objection. This limitation increases the burden on data subjects to actively protect their interests relating to profiling and automated decision-making, by monitoring for such processing and objecting where necessary.

With this comparison in mind, interpreting Article 22 as a prohibition grants greater protection by default to data subjects’ interests, at least in the cases in which Article 22(3) would apply. As a prohibition, data controllers would be legally obliged to limit automated decision-making meeting the definition in Article 22(1) to the three cases identified in Article 22(2) (contract, Union or Member State law, consent).

In contrast, a right to object would not pre-emptively restrict the types of automated decision-making undertaken by data controllers to the three cases defined in Article 22(2). Rather, these restrictions would only apply when a data subject lodged an objection against a specific instance of decision-making. At that point, processes not meeting a requirement of Article 22(2) would need to stop, and the safeguards specified in Articles 22(2)b or 22(3) would never be triggered. Article 22 as a right to object would thus circumvent a right to explanation introduced through Article 22(3) by allowing automated decision-making not meeting a requirement in Article 22(2) to occur until the data subject enters an objection. Such ‘legal’ but pre-objection decision-making would not be subject to a right to explanation derived from Article 22(3). With that said, a right to explanation derived from the right of access would not be similarly circumvented. In this case a data subject’s right to explanation would apply to any decision-making meeting the definition provided in Article 22(1), even if the decision-making proved to not meet a requirement of Article 22(2) following the data subject’s objection.

To summarize, if a right to explanation is enacted in the future, at best data subjects will be entitled to an explanation only when automated decisions (i) have legal or similarly significant effects, and (ii) are based solely on automated processing. Further, if a right to explanation is derived specifically from Article 22(3), explanations will be required only if automated decision-making is (iii) carried out to enter into or perform a contract, or with explicit consent; and (iv) when overriding interests of the data controller (eg trade secrets) do not exist, as specified in Recital 63. Further restrictions on a right to explanation derived from Article 22(3) depend upon the prevailing interpretation of Article 22 as a prohibition or a right to object.

To disambiguate this limited type of right to explanation from the ‘general’ right to explanation in future discussion of the impact of the GDPR on automated processing of data, and to reflect accurately the scope of limitations on any such right, we recommend addressing instead a ‘right to be informed’ about the existence of automated decision-making and system functionality. The right to be informed addresses the information provided to data subjects about automated decision-making, taking into account all of the limitations on the scope of applicability and type of information to be provided by data controllers as described in the preceding two sections. The right to be informed further accounts for precedents set in the 1995 Directive, and the impact these precedents will likely have on future interpretation of the GDPR’s notification duties (Articles 13–14), right of access (Article 15), and right not to be subject to automated decision-making (Article 22).

Conclusion: the future of accountable automated decision-making

Despite claims to the contrary, a meaningful right to explanation is not legally mandated by the GDPR. Given the proliferation of automated decision-making and automated processing of data to support human decision-making (ie ‘not solely’), this is a critical gap in transparency and accountability. The GDPR appears to give strong protection against automated decision-making but, as it stands, the protections may prove ineffectual. However, transparent and accountable automated decision-making can still be achieved before the GDPR comes into force in 2018.

A right to explanation of specific decisions is not legally mandated by the safeguards contained in Article 22(3), or by the notification duties in Articles 13 and 14. As experience with the 1995 Directive shows, the right of access is ambiguous. However, the GDPR’s right of access provides a right to explanation of system functionality, what we call a ‘right to be informed’, restricted by the interests of data controllers and future interpretations of Article 15. Any future right to explanation will further be constrained by the definition of ‘automated decision-making’ in Article 22(1), which is limited to decisions based solely on automated processing with legal or similarly significant effects for the data subject. As it stands, a meaningful right to explanation of the rationale and circumstances of specific automated decisions is not forthcoming.

Analysis of prior drafts of the GDPR has revealed several tensions between the EP, Commission, and Council. The placement of the right to explanation in non-binding Recital 71 appears to be a purposeful change deliberated in trilogue. The EP generally sought stronger protections for data subjects against automated decision-making than the EC or Council. Specifically, the EP wanted to include a right to explanation in Article 20,114 whereas the Council would have preferred to place the right to explanation in Recital 58.115 The EC did not include such a right at all. Further, the EP wanted to protect citizens from automated decisions that have legal or significant effects when predominantly,116 and not just solely,117 based on automated processes. Human assessment would also have been required.118

As the GDPR is intended to unify data protection law across all European Member States, the interpretation of Article 22 as a prohibition or right to object is critically important. Which interpretation will win out in the implementation of the GDPR in 2018 is not yet clear. Both are viable, as suggested by the split among Member States in implementing Article 15 of the Data Protection Directive 1995. Without clarification prior to enforcement, Article 22 will allow for conflicting interpretations of the rights of data subjects and controllers concerning automated decision-making across Member States. Conflicts may soon become inevitable because the two interpretations protect very different interests.

Article 22 interpreted as a prohibition offers greater protection to the interests of data subjects by prohibiting all automated decision-making not meeting a requirement of Article 22(2). In contrast, when interpreted as a right, Article 22 creates a loophole that allows data controllers to undertake automated decision-making without meeting a requirement in Article 22(2), unless the data subject objects. Once an objection is entered, decision-making must be shown to meet one of these requirements or must stop altogether. As a right, the data subject’s interests in not being subjected to automated decision-making are undermined, insofar as significant effort (ie entering an objection) is required from the subject to protect her interests. Article 22 therefore roughly favours the interests of data subjects when interpreted as a prohibition, and the interests of data controllers when interpreted as a right.

The ambiguity of the right not to be subject to automated decision-making (Article 22), and the loopholes and weaknesses it creates, shows that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards, and therefore runs the risk of being toothless. Several actions may be recommended to correct some of the weaknesses identified in our analysis. The following recommendations are intended as guidance for legislative and policy steps to correct the deficiencies we have identified in the protections afforded to data subjects against automated decision-making.

Legislative progress can be achieved by modifying the GDPR prior to its enforcement, or through the passage of additional laws by Member States. Additional legislative steps by Member States are highly likely, as seen with the UK House of Commons Science and Technology Committee’s recent inquiry on ‘algorithms in decision-making’,119 which was inspired in part by informal consultations by ‘Sense about Science’ (a UK-based charitable trust) with the authors of this article. The inquiry gathers expert opinions on how to achieve accountability and transparency in algorithmic decision-making, including identification of barriers (eg trade secrets), mechanisms for oversight, and requirements to make decisions explainable. As evidence that the recommendations made here can be the starting point for new laws, the inquiry explicitly refers to the rights and duties laid out in the GDPR. On the policy side, the recommendations can influence future guidelines issued by bodies such as the Article 29 Working Party, the European Data Protection Board, the European Data Protection Supervisor, and its Ethics Advisory Group.

We make the following recommendations:

1) Add a right to explanation to legally binding Article 22(3)

If a right to explanation is intended as suggested in Recital 71, it should be explicitly added to a legally binding Article of the GDPR. Such an implementation should clarify the scope of applicability of the right with regard to the impact of Article 22 interpreted as a prohibition or right to object. Alternatively, Member States can be encouraged to implement law on top of the GDPR that requires an explanation of specific decisions. A right to explanation of specific decisions could be considered a suitable safeguard necessitated by Article 22(2)b and 22(3) if an explanation is necessary to contest a decision, as already prescribed in 22(3). The rights to contest a decision, to obtain human intervention or to express views granted in Article 22(3) may be meaningless if the data subject cannot understand how the contested decision was made. To this end, a right to explanation can be introduced requiring data controllers to provide information about the rationale of the contested decision. Clear requirements should be introduced stating the evidence to be supplied by the data controller. Evidence regarding the weighting of features, decision tree or classification structure, and general logic of the decision-making system may be sufficient. However, the risks for innovation and beneficial processing posed by a right to explanation that requires automated decision-making methods to be human interpretable should be seriously considered.120
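To illustrate the kind of evidence such requirements might demand, the following sketch (in Python, using the scikit-learn library) shows how a controller could disclose both the general weighting of features and the thresholds applied in one specific decision from a decision-tree classifier. The model, feature names, and applicant data are hypothetical assumptions introduced purely for illustration; the GDPR prescribes no such format.

```python
# Hypothetical sketch only: the model, features, and applicant are assumed
# for illustration and do not reflect any mandated GDPR disclosure format.
from sklearn.tree import DecisionTreeClassifier

features = ["income", "existing_debt", "years_employed"]
X_train = [[30000, 5000, 1], [55000, 2000, 6], [42000, 9000, 3], [75000, 1000, 10]]
y_train = [0, 1, 0, 1]  # 0 = credit refused, 1 = credit granted

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X_train, y_train)

# 'System functionality': the general weighting of features across decisions.
print("feature weighting:", dict(zip(features, model.feature_importances_)))

# 'Rationale of a specific decision': the tests applied to this applicant.
applicant = [[48000, 8000, 2]]
tree = model.tree_
for node in model.decision_path(applicant).indices:
    if tree.children_left[node] != tree.children_right[node]:  # internal node
        name = features[tree.feature[node]]
        print(f"tested {name} <= {tree.threshold[node]:.1f} "
              f"(applicant's value: {applicant[0][tree.feature[node]]})")
print("decision:", "granted" if model.predict(applicant)[0] == 1 else "refused")
```

Notably, disclosure at this level of granularity (exact thresholds and weightings) is precisely the kind of detail controllers may resist on trade secret grounds, which is why the clear evidentiary requirements called for above would be needed.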

2) Clarify the meaning of the ‘existence of … significance … envisaged consequences … [and] logic involved’ in Article 15(1)h.

The language and meaning of core concepts in Article 15 are ambiguous. This leaves open the possibility of a right to explanation of the rationale of specific decisions (see section ‘From the Directive to the GDPR: the right to be informed’). However, this interpretation is implausible for a number of reasons. As explained in section ‘A right to explanation derived from the right of access’, the semantics and history of the right of access, and the duplication of provisions in Articles 13–14, suggest that the right of access is intended merely as a counterweight to the notification duties of data controllers, and not as a means to introduce a new right (ie a right to explanation of specific decisions) beyond the scope of Articles 13–14.121 Critically, interpreting Article 15 to introduce a right to explanation of specific decisions would not match the intended purpose of the right of access, which according to Recital 63 GDPR is meant to allow the data subject ‘to be aware of, and verify, the lawfulness of processing’. Language should be added to clarify whether Article 15 is intended as a counterweight to Articles 13–14, and thus provides a ‘right to be informed’ about the existence of automated decision-making as well as system functionality, or as a right to explanation of specific decisions. The intended meaning of the five core concepts of Article 15(1)h should be made explicit, along with their impact on the information data controllers are required to communicate to data subjects under the right of access (and, similarly, under the notification duties of Articles 13–14).

3) Clarify the language of Article 22(1) to indicate when decisions are based solely on automated processing

Article 22 is limited in applicability to decisions based solely on automated processing. However, it is unclear what this phrase means in practice. The potential loophole (similarly seen in the German SCHUFA judgements), by which nominal involvement of a human at any stage of the automated process means the process is not solely automated, should be closed. It remains uncertain whether the use of automated processes to prepare a decision constitutes a solely automated process if the human who takes the final decision does not interfere with it and simply adopts it. Clarification can be offered by returning to the phrasing ‘solely or predominantly based on’ proposed by the EP122 in Article 20(5), or by providing specific examples of decision-making based solely and predominantly on automated processing of data.

4) Clarify the language of Article 22(1) to indicate what counts as a legal or significant effect of an automated decision, including profiling

Article 22 only applies to automated decision-making with ‘legal effects’ or ‘similarly significant effects’.123 Recital 71 names only two examples of such effects: automatic refusal of an online credit application and e-recruiting practices. The scope of these phrases should be made explicit: do they, for instance, refer only to effects identified in the Articles of the GDPR, or to some broader definition? At a minimum, the perspective to be taken in defining ‘significant effects’ should be identified. Do effects need to be significant from the subjective perspective of the data subject, or according to some external standard?

5) Clarify the language of Article 22(2)a, ‘necessary for entering into, or performance of, a contract’

Article 22(2)a names this case as an exception to either the prohibition of automated decision-making or the right to object to automated decision-making. Since the necessity of such measures will likely be defined by the data controller, and lit (a) does not require the consent of the data subject (consent being a separate exception listed under lit (c)), this exemption runs the risk of weakening the rights of data subjects.

6) Clarify the language of Article 22 to indicate a prohibition

Ideally, the language that allows for two plausible interpretations should be clarified prior to 2018, when the GDPR comes into force. Due to the number of loopholes and the weakening of Article 22(3) safeguards introduced if Article 22 is interpreted as a right to object, as well as the wide implementation of Article 15 of the 1995 Directive as a prohibition, we recommend that the language used in Article 22(2) (‘Paragraph 1 shall not apply if the decision:’) be revised to indicate clearly and explicitly that Article 22 is intended as a prohibition against automated decision-making.

7) As a counterweight to trade secrets, introduce an external auditing mechanism for automated decision-making, or set internal auditing requirements for data controllers

Both the right of access and any future right to explanation will face significant limitations due to the sensitivity of trade secrets and intellectual property rights. As our examination of the 1995 Directive shows, explanations granted under the right of access are normally limited to system functionality, and are curtailed further to protect the interests of data controllers. An ideal solution would allow for examination of automated decision-making systems, including the rationale and circumstances of specific decisions, by a trusted third party. This approach limits the risk to data controllers of exposing trade secrets, while also providing an oversight mechanism for data subjects that can operate when explanations are infeasible or too complex for lay comprehension. The powers of Supervisory Authorities could be expanded in this regard. Alternatively, a European regulator could be created specifically for auditing algorithms, before (via certification) and/or after they are deployed.124
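As a purely illustrative sketch of what third-party auditing without disclosure of trade secrets could look like, the following Python snippet treats the controller’s system as a black box exposed through an assumed decide() interface, and compares outcomes across groups of test applicants. All names, data, and the interface itself are hypothetical assumptions, not a prescribed audit methodology.

```python
import random

# Assumed, hypothetical interface: the auditor can query decisions but never
# sees the model's code or parameters, so trade secrets stay protected.
def decide(applicant):
    return applicant["income"] > 40000  # stand-in for the controller's system

def audit_outcome_rates(decide, applicants, group_key):
    """Approval rate per group; large gaps would flag the system for review."""
    rates = {}
    for group in {a[group_key] for a in applicants}:
        members = [a for a in applicants if a[group_key] == group]
        rates[group] = sum(decide(a) for a in members) / len(members)
    return rates

random.seed(0)
test_set = [{"income": random.randint(20000, 80000),
             "group": random.choice(["A", "B"])} for _ in range(1000)]
print(audit_outcome_rates(decide, test_set, "group"))
```

An auditor with only this level of access could detect disparate outcomes without the controller ever disclosing the algorithm itself, which is the balance this recommendation aims to strike.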

8) Support further research into the feasibility of explanations and alternative accountability mechanisms

Even if a right to explanation is legally granted in the future, the feasibility and practical requirements to offer explanations to data subjects remain unclear. In line with current work on interpretable automated decision-making and machine learning methods,125 research needs to be conducted in parallel to determine whether and how explanations can and should be offered to data subjects (or proxies thereof) with differing levels of expertise and interests. What counts as a meaningful explanation for one individual or group may not be meaningful for another; requirements for ‘meaningful explanations’ must be set if a legal right to explanation is to be practically useful. The right to explanation is also not the only way to achieve accountability and transparency in automated decision-making.126 Further attention should be given to the development and deployment of alternative legal safeguards that can supplement the protections offered by the GDPR. Data controllers working in highly sensitive or risky sectors could, for instance, be required to use human interpretable decision-making methods.127 Methods and (ethical) requirements for auditing algorithms128 should also be further developed, both as standalone accountability tools and as mechanisms to provide an evidence trail for providing explanations of automated decisions.
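As one hedged illustration of what a ‘human interpretable decision-making method’ might look like in practice, the sketch below fits a logistic regression (again in Python with scikit-learn, on invented data) whose coefficients can be read directly as each feature’s contribution to the outcome. Whether such output counts as a ‘meaningful explanation’ for a lay data subject is exactly the open research question raised above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented data for illustration only.
features = ["income_k", "debt_k", "years_employed"]
X = np.array([[30, 5, 1], [55, 2, 6], [42, 9, 3],
              [75, 1, 10], [28, 7, 0], [60, 3, 8]])
y = np.array([0, 1, 0, 1, 0, 1])  # 0 = refused, 1 = granted

model = LogisticRegression().fit(X, y)

# Each coefficient is the change in approval log-odds per unit of the feature;
# an expert can read this directly, but a lay subject may need more context.
for name, coef in zip(features, model.coef_[0]):
    direction = "raises" if coef > 0 else "lowers"
    print(f"each unit of {name} {direction} the approval log-odds by {abs(coef):.3f}")
```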

As the ambiguities highlighted in these recommendations indicate, the GDPR can be a toothless or powerful mechanism to protect data subjects depending on its eventual legal interpretation. The effectiveness of the new framework will largely be determined by Supervisory Authorities, the Article 29 Working Party, the European Data Protection Board, the European Data Protection Supervisor, its Ethics Advisory Group,129 as well as national courts and their future judgments.130 As it stands, transparent and accountable automated decision-making is not yet guaranteed by the GDPR; nor is a right to explanation of specific decisions forthcoming. At best, data subjects will be granted a ‘right to be informed’ about the existence of automated decision-making and system functionality. These shortcomings should be addressed before the GDPR comes into force in 2018.

Footnotes

1

See eg Bryce Goodman and Seth Flaxman, ‘EU Regulations on Algorithmic Decision-Making and a “right to Explanation”’ [2016] arXiv:1606.08813 [cs, stat] <http://arxiv.org/abs/1606.08813> accessed 30 June 2016; Francesca Rossi, ‘Artificial Intelligence: Potential Benefits and Ethical Considerations’ (European Parliament: Policy Department C: Citizens’ Rights and Constitutional Affairs 2016) Briefing PE 571.380 <http://www.europarl.europa.eu/RegData/etudes/BRIE/2016/571380/IPOL_BRI(2016)571380_EN.pdf> accessed 20 April 2017; Mireille Hildebrandt, ‘The New Imbroglio - Living with Machine Algorithms’, The Art of Ethics in the Information Society (2016) <https://works.bepress.com/mireille_hildebrandt/75/> accessed 28 December 2016; IEEE Global Initiative, ‘Ethically Aligned Designed - A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems’ (IEEE 2016) Version 1 <http://standards.ieee.org/develop/indconn/ec/ead_v1.pdf> accessed 19 January 2017; Ben Wagner, ‘Efficiency vs. Accountability? – Algorithms, Big Data and Public Administration’ <https://cihr.eu/efficiency-vs-accountability-algorithms-big-data-and-public-administration/> accessed 14 January 2017; Fusion, ‘EU Introduces “right to Explanation” on Algorithms | Fusion’ (2016), quoting Ryan Calo <http://fusion.net/story/321178/european-union-right-to-algorithmic-explanation/> accessed 10 November 2016.

2

See eg Information Commissioner’s Office, ‘Overview of the General Data Protection Regulation (GDPR)’ (Information Commissioner’s Office 2016) 1.1.1 <https://ico.org.uk/for-organisations/data-protection-reform/overview-of-the-gdpr/individuals-rights/rights-related-to-automated-decision-making-and-profiling/> accessed 10 November 2016; House of Commons Science and Technology Committee, ‘Robotics and Artificial Intelligence’ (House of Commons 2016) HC 145 <http://www.publications.parliament.uk/pa/cm201617/cmselect/cmsctech/145/145.pdf> accessed 10 November 2016; European Parliament Committee on Legal Affairs, ‘Report with Recommendations to the Commission on Civil Law Rules on Robotics’ (European Parliament 2017) 2015/2103(INL) <http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-//EP//NONSGML+REPORT+A8-2017-0005+0+DOC+PDF+V0//EN> accessed 11 November 2016.

3

See eg Joon Ian Wong, ‘The UK Could Become a Leader in AI Ethics—if This EU Data Law Survives Brexit’ <http://qz.com/807303/uk-parliament-ai-and-robotics-report-brexit-could-affect-eu-gdpr-right-to-explanation-law/> accessed 10 November 2016; Cade Metz, ‘Artificial Intelligence Is Setting Up the Internet for a Huge Clash With Europe’ WIRED (2016) <https://www.wired.com/2016/07/artificial-intelligence-setting-internet-huge-clash-europe/> accessed 10 November 2016; Fusion (n 1); Bernard Marr, ‘New Report: Revealing the Secrets of AI Or Killing Machine Learning?’ <http://www.forbes.com/sites/bernardmarr/2017/01/12/new-report-revealing-the-secrets-of-ai-or-killing-machine-learning/#258189058e56> accessed 14 January 2017; Liisa Jaakonsaari, ‘Who Sets the Agenda on Algorithmic Accountability?’ (EURACTIV.com, 26 October 2016) <https://www.euractiv.com/section/digital/opinion/who-sets-the-agenda-on-algorithmic-accountability/> accessed 3 March 2017; Nick Wallace, ‘EU’s Right to Explanation: A Harmful Restriction on Artificial Intelligence’ <https://www.datainnovation.org/2017/01/eus-right-to-explanation-a-harmful-restriction-on-artificial-intelligence/> accessed 3 March 2017.

4

Reg (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Dir 95/46/EC (General Data Protection Regulation) 2016.

5

The proliferation of unaccountable and inscrutable automated systems has proven a major concern among government bodies, as reflected in numerous recent reports on the future ethical and social impacts of automated systems. See, for instance, Catherine Stupp, ‘Commission to Open Probe into Tech Companies’ Algorithms next Year’ (EurActiv.com, 8 November 2016) <http://www.euractiv.com/section/digital/news/commission-to-open-probe-into-tech-companies-algorithms-next-year/> accessed 11 November 2016; Partnership on AI, ‘Partnership on Artificial Intelligence to Benefit People and Society’ (Partnership on Artificial Intelligence to Benefit People and Society, 2016) <https://www.partnershiponai.org/> accessed 11 November 2016; National Science and Technology Council Committee on Technology, ‘Preparing for the Future of Artificial Intelligence’ (Executive Office of the President 2016) <https://www.whitehouse.gov/sites/default/files/whitehouse_files/microsites/ostp/NSTC/preparing_for_the_future_of_ai.pdf> accessed 11 November 2016; European Parliament Committee on Legal Affairs (n 2); House of Commons Science and Technology Committee (n 2); Government Office for Science, ‘Artificial Intelligence: An Overview for Policy-Makers’ (Government Office for Science 2016) <https://www.gov.uk/government/publications/artificial-intelligence-an-overview-for-policy-makers> accessed 11 November 2016.

6

Brent Mittelstadt and others, ‘The Ethics of Algorithms: Mapping the Debate’ [2016] 3 Big Data & Society 2.

7

Christian Sandvig and others, ‘Auditing Algorithms: Research Methods for Detecting Discrimination on Internet Platforms’ [2014] Data and Discrimination: Converting Critical Concerns into Productive Inquiry <http://social.cs.uiuc.edu/papers/pdfs/ICA2014-Sandvig.pdf> accessed 13 February 2016.

8

Mike Ananny, ‘Toward an Ethics of Algorithms Convening, Observation, Probability, and Timeliness’ (2016) 41 Science, Technology & Human Values 93.

9

This is the type of explanation of automated decision-making imagined in Recital 71 GDPR, which states ‘In any case, such processing should be subject to suitable safeguards, which should include specific information to the data subject and the right to obtain human intervention, to express his or her point of view, to obtain an explanation of the decision reached after such assessment and to challenge the decision.’

10

Goodman and Flaxman (n 1).

11

This is specifically a kind of explanation possible only once a decision has been taken. It refers to a particular decision, not the decision-making method or system itself. This is the type of explanation imagined in Recital 71 GDPR, which calls for ‘an explanation of the decision reached after such assessment’. The Recital explicitly refers to a singular decision that has been reached.

12

Jenna Burrell, ‘How the Machine “Thinks:” Understanding Opacity in Machine Learning Algorithms’ [2016] 3 Big Data & Society 1.

13

Goodman and Flaxman (n 1). The ‘right to explanation’ is proposed as follows. ‘The law will also effectively create a “right to explanation,” whereby a user can ask for an explanation of an algorithmic decision that was made about them’ (1). Further, ‘Paragraph 71 of the recitals (the preamble to the GDPR, which explains the rationale behind it but is not itself law) explicitly requires data controllers to “implement appropriate technical and organizational measures” that “prevents, inter alia, discriminatory effects” on the basis of processing sensitive data’ (3). Further, ‘The provisions outlined in Articles 13-15 specify that data subjects have the right to access information collected about them, and also requires data processors to ensure data subjects are notified about the data collected. However, it is important to distinguish between these rights, which may be termed the right to access and notification, and additional “safeguards for the rights and freedoms of the data subject” required under Article 22 when profiling takes place. Although the Article does not elaborate what these safeguards are beyond “the right to obtain human intervention”, Articles 13 and 14 state that, when profiling takes place, a data subject has the right to “meaningful information about the logic involved.” This requirement prompts the question: what does it mean, and what is required, to explain an algorithm’s decision?’ (6).

14

Ibid; Rossi (n 1).

15

‘Recitals explain the background to the legislation and the aims and objectives of the legislation. They are, therefore, important to an understanding of the legislation which follows.’ EUROPA, ‘Guide to the Approximation of EU Environmental Legislation ANNEX I’ (Environment, 2015) <http://ec.europa.eu/environment/archives/guide/annex1.htm> accessed 3 March 2017. See also Case C-355/95 P Textilwerke Deggendorf GmbH (TWD) v Commission of the European Communities and Federal Republic of Germany [1997] ECR I (15 May 1997), p. I-02549 para 21: ‘In that regard, it should be stated that the operative part of an act is indissociably linked to the statement of reasons for it, so that, when it has to be interpreted, account must be taken of the reasons which led to its adoption.’

16

For a detailed overview of the jurisprudence of the European Court of Justice on the limited role of Recitals in EU law, see Roberto Baratta, ‘Complexity of EU Law in the Domestic Implementing Process’ (2014) 2 The Theory and Practice of Legislation 293. An opposing view is offered by Pagallo, who claims that secondary rules of law (eg Recitals) can alter primary rules of law. Ugo Pagallo, ‘Three Lessons Learned for Intelligent Transport Systems That Abide by the Law’ (2016) Jusletter IT RZ 13, November 2016 <http://jusletter-it.weblaw.ch/issues/2016/24-November-2016/three-lessons-learne_9251e5d324.html> accessed 14 March 2017.

17

Tadas Klimas and Jurate Vaiciukaite, ‘The Law of Recitals in European Community Legislation’ (2008) 15 ILSA Journal of International & Comparative Law 32 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1159604> accessed 22 January 2017. The paper discusses in detail the legal status of Recitals in European law.

18

Baratta (n 16) 17.

19

Case 215/88 Casa Fleischhandels [1989] ECR-2789, para 31; See also Baratta (n 16) 13.

20

Peter Jones, ‘Group Rights’, The Stanford Encyclopedia of Philosophy (Summer 2016 edn (2016, forthcoming)) <http://plato.stanford.edu/archives/sum2016/entries/rights-group/>.

21

Christoph Grabenwarter, The European Convention for the Protection of Human Rights and Fundamental Freedoms: A Commentary (01 edn, Verlag C.H. Beck 2014).

22

The ‘trilogue negotiations’ describe a series of meetings between the European Commission, Council, and Parliament to adopt a final text for the GDPR. For an introduction and discussion of the legal basis of trilogue, see Oliver Proust, ‘Unravelling the Mysteries of the GDPR Trilogues’ (Privacy, Security and Information Law, 2015) <http://privacylawblog.fieldfisher.com/2015/unravelling-the-mysteries-of-the-gdpr-trilogues/> accessed 16 December 2016.

23

Rita Heimes, ‘Top 10 Operational Impacts of the GDPR: Part 5 - Profiling’ <https://iapp.org/news/a/top-10-operational-impacts-of-the-gdpr-part-5-profiling/> accessed 10 November 2016: ‘A hotly contested provision of the GDPR, the “profiling” restrictions ultimately adopted were narrower than initially proposed.’

24

European Parliament Committee on Civil Liberties, Justice and Home Affairs, ‘Report on the Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation) - A7-0402/2013’ (European Parliament 2013) A7–0402/2013 <http://www.europarl.europa.eu/sides/getDoc.do?type=REPORT&reference=A7-2013-0402&language=EN> accessed 10 November 2016.

25

European Commission, ‘Regulation of the European Parliament and the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation)’ (European Commission 2012) 2012/0011 (COD) <http://ec.europa.eu/justice/data-protection/document/review2012/com_2012_11_en.pdf> accessed 10 November 2016.

26

European Digital Rights, ‘Comparison of the Parliament and Council Text on the General Data Protection Regulation’ (European Digital Rights International 2016) 140 <https://edri.org/files/EP_Council_Comparison.pdf> accessed 20 November 2016. This source provides a side-by-side comparison of the aforementioned drafts from the European Parliament (n 24) and European Commission (n 25), as well as amendments to the Commission’s text proposed by the European Council.

27

European Digital Rights, ibid 40.

28

Goodman and Flaxman (n 1).

29

See also Suzanne Rodway, ‘Just How Fair Will Processing Notices Need to be under the GDPR’ (2016) 16 Privacy & Data Protection 16. Note the paper focused on the EC draft but talks in general about the aim and purpose of notification duties. The author explains that these provisions mainly mean that data controllers have to update their privacy notices. Further: ‘whether any automated decisions will be made using the data (including for profiling purposes) and, if so, a meaningful explanation about the logic used in those decisions and the possible consequences of those decisions for the data subject. Examples include whether a credit card application might be declined or a job application rejected’. This suggests that Articles 13–14 only create a notification duty to inform about the general usage of automated decision-making before a decision has been made, and to inform about the possible future consequences. Further support for this argument can be found in Recitals 60–62 GDPR.

30

Goodman and Flaxman (n 1).

31

Burrell (n 12).

32

Boris P. Paal, ‘DS-GVO Art. 15 Auskunftsrecht der betroffenen Person’ in Boris P. Paal and Daniel Pauly (eds), Datenschutz-Grundverordnung (1st edn, beck-online 2017) Rn 3. Recital 63 GDPR also supports this interpretation in stating that ‘A data subject should have the right of access to personal data … and to exercise that right … in order to be aware of, and verify, the lawfulness of processing’ (emphasis added).

33

Florian Schmidt-Wudy, ‘DS-GVO Art. 15 Auskunftsrecht der betroffenen Person’ in Heinrich A. Wolff and Stefan Brink (eds), Datenschutz-Grundverordnung (18th edn, beck-online 2016) Rn 2.

34

Mario Martini, ‘DS-GVO Art. 22 Automatisierte Entscheidungen im Einzelfall einschließlich Profiling’ in Boris P. Paal and Daniel Pauly (n 32) Rn 4–6.

35

Peter Bräutigam and Florian Schmidt-Wudy, ‘Das geplante Auskunfts- und Herausgaberecht des Betroffenen nach Art. 15 Der EU-Datenschutzgrundverordnung’ (2015) 31 Computer und Recht 56, 62 supports this interpretation in commenting on the EP’s draft of the GDPR. The EP’s draft contains the same phrasing as the final adopted text: Art 15(h) requires information about ‘the significance and envisaged consequences of such processing’. The authors note that the phrasing is very imprecise. An example is given of an Internet provider being obligated to inform that automated processing methods are being used to determine creditworthiness, which could lead to the consequence that the person has to pay in advance (rather than being offered credit). This example suggests that the authors believe that art 15(h) aims to inform about system functionality rather than to provide information about how an individual decision was reached.

36

Prior drafts of art 15 also support this view. The German translation of the EC draft stated in art 15(h) ‘die Tragweite der Verarbeitung und die mit ihr angestrebten Auswirkungen, zumindest im Fall der Maßnahmen gemäß Artikel 20’, which translates to ‘the scope [rather than significance] of the data processing and its intended consequences’. In addition, the EP draft stated in art 15(h) ‘die Tragweite der Verarbeitung und die mit ihr angestrebten Wirkungen’. The phrase ‘angestrebten Wirkungen’ translates to ‘intended effects’, not consequences. Even though the adopted language in the GDPR is vaguer, prior drafts demonstrate art 15 was intended to inform data subjects about the data processor’s ‘intended effects’ for the data subject through the use of automated decision-making methods. For further discussion, see: ibid 61ff.

37

C-141/12 and C-372/12 [2014] European Court of Justice ECLI:EU:C:2014:2081 <http://curia.europa.eu/juris/document/document.jsf?text=&docid=155114&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=420117> accessed 12 January 2017.

38

Douwe Korff, ‘New Challenges to Data Protection Study - Working Paper No. 2: Data Protection Laws in the EU: The Difficulties in Meeting the Challenges Posed by Global Social and Technical Developments’ (European Commission DG Justice, Freedom and Security 2010) <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1638949> accessed 8 December 2016.

39

See Recital 41 of the Directive: ‘Whereas any person must be able to exercise the right of access to data relating to him which are being processed, in order to verify in particular the accuracy of the data and the lawfulness of the processing’. See also Paal (n 32) Rn 19–22, who notes that the general purpose of the right of access under art 15 GDPR is the realization of the so-called ‘two-step model’: in a first step, data subjects have the right to (i) know if data is being processed and (ii) if so, what data is used; in some cases data controllers also have to provide additional information (such as the logic involved in automated processing).

40

See, for instance, debate in the UK House of Lords concerning the meaning of ‘logic involved’ and ‘trade secrets’ in the 1998 Data Protection Act: Grand Committee on the Data Protection Bill, ‘Official Report of the Grand Committee on the Data Protection Bill [HL] (Hansard, 23 February 1998)’ (UK Parliament - House of Lords 1998) <http://hansard.millbanksystems.com/grand_committee_report/1998/feb/23/official-report-of-the-grand-committee#S5LV0586P0_19980223_GCR_1> accessed 15 December 2016. See also Philip Coppel, Information Rights: Law and Practice (Bloomsbury Publishing 2014) ch 5, s 3, which discusses how trade secrets limit the right of access and to know about the logic involved in automated processing, and provides an overview of the right of access as implemented by Member States.

41

As an example, Council of Europe, ‘The Protection of Individuals with Regard to Automatic Processing of Personal Data in the Context of Profiling’ (Council of Europe 2010) Recommendation CM/Rec(2010)13, 138 argues that the right of access in art 12 of the Directive equates to a right to be informed, not a right to an explanation of a decision reached: ‘Principle 5.1 states that the data subject should be entitled to know about the personal data concerning him or her and the logic which served as a basis for the profiling. It is indeed essential that a data subject exercising the right of access should be informed of the statistical method and inferences used for his or her profiling, the logic underpinning the processing and the envisaged consequences of the profile’s attribution’ (emphasis added).

42

Korff (n 38) 86.

43

Ibid 85.

44

Ibid 86.

45

Note that Recital 41 of the Directive also states in relation to trade secrets that ‘these considerations must not, however, result in the data subject being refused all information’ (emphasis added). See also Lee A Bygrave, ‘Automated Profiling: Minding the Machine: Article 15 of the EC Data Protection Directive and Automated Profiling’ (2001) 17 Computer Law & Security Review 17. The author notes that arts 12 and 15(1) considered together suggest that the data controller must understand and document the logic involved in an automated decision, including the categories of data considered, and their role in the decision-making process. However, the extent to which this information must be given to the data subject can be limited by overriding interests of the controller, including trade secrets.

46

Korff (n 38) 86.

47

Douwe Korff, ‘New Challenges to Data Protection Study - Country Report: France’ (European Commission DG Justice, Freedom and Security 2010) 27 <https://papers.ssrn.com/abstract=1638955> accessed 15 December 2016.

48

Ibid.

49

Art 8(5) of the UK Data Protection Act 1998 states that ‘Section 7(1)(d) is not to be regarded as requiring the provision of information as to the logic involved in any decision-taking if, and to the extent that, the information constitutes a trade secret’ (emphasis added).

50

Douwe Korff, ‘New Challenges to Data Protection Study - Country Report: United Kingdom’ (European Commission DG Justice, Freedom and Security 2010) 48 <http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1638938> accessed 15 December 2016.

51

Concerning how an explanation is required both as a safeguard against automated decision-making and through the right of access, see: Douwe Korff, ‘New Challenges to Data Protection Study - Country Report: Germany’ (European Commission DG Justice, Freedom and Security 2010) 27 <http://ec.europa.eu/justice/data-protection/document/studies/files/new_privacy_challenges/final_report_country_report_a4_germany.pdf> accessed 15 December 2016. Concerning the s 6(a)2(2) right to explanation: Kai von Lewinski, ‘BDSG § 6a Automatisierte Einzelentscheidung’ in Wolff and Brink (eds), Beck’scher Online-Kommentar Datenschutzrecht (17th edn, beck-online 2016) Rn 45–49; ibid Rn 47–48.1 states that the required explanation can be short and must only include the main reason for the decision. The data subject must be able to understand why a decision has not been made in her favour.

For discussion of s 6(a)3 (the extended right of access and its limitations due to trade secrets), see Peter Gola, Christoph Klug, and Barbara Körffer, ‘BDSG § 6a Automatisierte Einzelentscheidung’ in Peter Gola and Rudolf Schomerus (eds), Bundesdatenschutzgesetz (12th edn, Verlag C.H. Beck 2015) Rn 18–19. Lewinski, ibid Rn 50–53, commenting on s 6(a)3 (the extended right of access), explains that the data subject needs to have a basis to evaluate whether an automated decision is accurate. This suggests that there is at least some basis to obtain an explanation after the decision has been made under the extended right of access. However, it is noted that trade secrets restrict this right: only the basis of the decision parameters has to be disclosed, but not the details of the parameters. The ‘logical structure’ must be disclosed, which refers to the ‘decision tree’, but not the software or the code. Lewinski, ibid Rn 47–48.1 also notes that the scope (the extent to which information must be disclosed) of the right of access and the safeguards in s 6(a)2 are comparable.

On safeguards in s 6, see Gola, Klug and Körffer, ibid Rn 1–20, who, commenting on s 6(a), explain that a right to explanation is granted under s 6(a)2(2), which is one of the safeguards relating to the second exemption of the general prohibition of automated decision-making. Safeguards in this article require the data controller to inform the data subject about how the decision was reached (three-step model: to be informed about the fact that such a decision was taken and, upon request of the data subject, to receive an explanation of the decision reached and the right to contest the decision). For a discussion, see Gola, Klug and Körffer, ibid, Rn 12–14c. The first exception under s 6(a)2(1) (performance of a contract and if the decision has been made in favour of the data subject) does not explicitly require an explanation (unlike s 6(a)2(2)). Rather, the right of access in s 6(a)3 will apply in these cases which, per above, could be interpreted as a right to obtain an explanation after the decision has been made. The phrasing of s 6(a)3 (extended right of access) can be interpreted both ways: as granting an explanation both before and after a decision has been made; see Lewinski, ibid Rn 1–4. Further, the SCHUFA judgments (see section ‘Right of access in the 1995 Data Protection Directive 95/46/EC’) show that judges interpreted the right of access to grant a limited right to obtain an explanation after a decision has been made; see Lewinski, Rn 50–51.

For further discussion of the overlap between automated decision-making under s 6(a) and the scoring provisions under s 28(b), see Gola, Klug and Körffer, ibid Rn 6–7, 15–17. Note that the German commentators mentioned do not distinguish between the s 6(a)2 right to explanation and the s 6(a)3 right of access when discussing the limitations that trade secrets impose on the information given to the data subject. See also Lewinski, ibid Rn 1–4; Gola, Klug and Körffer, ibid Rn 14–14a.

52

Philip Scholz, ‘BDSG § 6a Automatisierte Einzelentscheidung’ in Spiros Simitis (ed), Bundesdatenschutzgesetz (8th edn, Nomos 2014) Rn 38.

53

Lewinski (n 51) Rn 50–52 states that the wording of the German Data Protection Act is not clear regarding whether the right of access refers to information about the ‘process’ (meaning the system) or ‘a decision made’.

54

BGH: kein umfassender Auskunftsanspruch gegen SCHUFA, 2014 (VI ZR 156/13), BDSG s 34(4); Mario Martini, ‘Big Data als Herausforderung für den Persönlichkeitsschutz und das Datenschutzrecht’ [2014] DVBl 1481.

55

Gola, Klug and Körffer (n 51) Rn 18–19 [our translation]: ‘Beyond the general rights of access under §§ 19 and 34, under subsection 3 information must also be provided on the logical structure of the automated processing. The data subject is primarily to be shown what happens with his or her data, and is to be put in a position to raise points that enable a substantive review of the automated assessment. In view of the protection of trade secrets and copyright, however, the duty to provide information does not cover the software used (on the so-called score formula as a trade secret, cf. BGH, NJW 2014, 1235, which left open the question of the scope of the right of access to the logical structure of automated processing, as no automated individual decision was present).’

56

Korff (n 51) 27 ff.

57

Lewinski (n 51) Rn 50–53.

58

Bräutigam and Schmidt-Wudy (n 35) 62; Jens Hammersen and Ulrich Eisenried, ‘Ist „Redlining” in Deutschland erlaubt? Plädoyer für eine weite Auslegung des Auskunftsanspruchs’ [2014] ZD Zeitschrift für Datenschutz 342.

59

Amongst others, see judgment of the German Federal Court BGH, ZD 2014, 306. It is important to note that the German court declined to address the extent to which the data subject is entitled to know about the logic involved, as the Court ruled that in this case there was no automated decision, as explained in: Gola, Klug and Körffer (n 51) Rn 18–19.

60

Judgment of the German Federal Court (Bundesgerichtshof) 28 January 2014 – VI ZR 156/13. See also LG Gießen 6 March 2013 – 1 S 301/12 and AG Gießen 11 October 2014 – 47 C 206/12.

61

Judgment of the German Federal Court (BGH), Scoring und Datenschutz, 28 January 2014 – VI ZR 156/13 (LG Gießen, AG Gießen) 169.

62

‘Reference groups’ refer to profiles or classifications that inform the assessment of creditworthiness. For a discussion, see for instance: Mittelstadt and others (n 6); Mireille Hildebrandt and Serge Gutwirth, Profiling the European Citizen (Springer 2008).

63

Judgment of the German Federal Court (BGH), Umfang einer von der SCHUFA zu erteilenden Auskunft, 28 January 2014 – VI ZR 156/13 (LG Gießen, AG Gießen) 490. The judgments show that the right of access is very limited.

64

The court, however, acknowledged that there is discussion about whether or not information about the weighting of features and reference groups should be included in disclosures, and to what extent.

65

Reflecting this, the court subsequently refused to discuss the extent to which the logic involved needed to be disclosed by the data controller. Rather, it addressed the general obligation of data controllers to provide information about the data being processed, derived from the right of access.

66

Art 15(1) of the Directive defines ‘automated individual decisions’ as ‘a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.’ Similarly, Art 22(1) GDPR defines ‘automated decision-making’ as ‘a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.’

67

Bundesgesetz über den Schutz personenbezogener Daten (Datenschutzgesetz 2000 - DSG 2000) s 49(3) [our translation]: ‘In the case of automated individual decisions, the logical course of the automated decision-making shall, upon request, be explained to the data subject in a generally understandable form. § 26(2) to (10) applies accordingly.’

68

Decision of the Austrian Data Protection Commission 24 April 2009, App no K121.461/0003-DSK/2009.

69

Amongst others, Decisions of the Austrian Data Protection Commission: 24 April 2009 App no K121.461/0003-DSK/2009, addressing the need to explain the system used; 27 August 2010 App no K121.599/0014-DSK/2010; 22 May 2013 App no K121.935/0006-DSK/2013; 25 April 2008 App no K121.348/0007-DSK/2008, addressing the need to explain the system used; 8 May 2009 App no K121.470/0007-DSK/2009, addressing whether a process counts as an automated decision; 20 March 2009 App no K121.467/0007-DSK/2009; 25 May 2012 App no K121.791/0008-DSK/2012; 9 June 2009 App no K121.460/0008-DSK/2009; 19 June 2009 App no K121.494/0013-DSK/2009; 2 February 2007 App no K121.238/0006-DSK/2007; Austrian Administrative Court judgments 11 December 2009 App no 2009/17/0223; 15 November 2012 App no 2008/17/0096; 20 February 2008 App no 2005/15/0161.

70

According to a decision of the Austrian Data Protection Commission 25 April 2008 App no K121.348/0007-DSK/2008, the obligation rests with the data controller to inform about the procedure of automated decision-making in an understandable manner: [our translation] ‘the duty to explain to the data subject the course of the automated decision-making in a generally understandable form’.

71

Decision of the Austrian Data Protection Commission 12 December 2007 App no K121.313/0016-DSK/2007. It is important to note that in this decision the Commission spoke only of a hypothetical obligation of the data controller, since the applicant did not lodge a request under s 49(3) but instead invoked his general right of access under s 26. The Commission stated that if the data subject had lodged a complaint under s 49(3), the data controller would have needed to disclose this information; however, how far trade secrets would limit the disclosure would need to be examined on a case-by-case basis, so there is as yet no precedent.

72

In this decision it was found that there was no automated decision, because the decision was based on a group (‘peer group’) rather than on the individual; it was further stated that such an automated decision (for marketing purposes) would not have sufficiently significant effects and consequences for s 49(3) to apply; see: Decision of the Austrian Data Protection Commission 10 March 2016 App no DSB-D122.322/0001-DSB/2016.

73

This loophole points towards the need to recognise some type of group privacy right in data protection law, as processing of identifiable data is not required to learn about and take actions towards an individual. For further discussion, see: Mittelstadt and others (n 6); Brent Mittelstadt, ‘From Individual to Group Privacy in Big Data Analytics’ [2017] Philosophy & Technology 1; Linnet Taylor, Luciano Floridi and Bart van der Sloot (eds), Group Privacy: New Challenges of Data Technologies (1st edn, Springer 2017); Alessandro Mantelero, ‘Personal Data for Decisional Purposes in the Age of Analytics: From an Individual to a Collective Dimension of Data Protection’ (2016) 32 Computer Law & Security Review 238; Lee A Bygrave, Data Protection Law: Approaching Its Rationale, Logic and Limits (Kluwer Law Intl 2002) ch 15.

74

Martini (n 34) Rn 42–44 explains how other provisions of the GDPR fall behind and weaken current data protection standards, e.g. with regard to contractual relations as a legitimate ground for automated decisions. In that regard, see also Alexander Roßnagel, Philipp Richter and Maxi Nebel, ‘Besserer Internetdatenschutz für Europa. Vorschläge zur Spezifizierung der DS-GVO’ (2013) 3 Zeitschrift für Datenschutz 103.

75

Hammersen and Eisenried (n 58), commenting on the EC’s original 2012 draft, note that the interpretation of the Directive’s right of access through jurisprudence suggests that the right grants only a very weak type of explanation of automated processing of data. The data subject is not provided with a basis to scrutinise the outcome of automated processing, including the method or algorithm used or the reference groups involved. The GDPR has not strengthened the right of access compared to the Directive in any notable way, meaning similar limitations are likely to apply.

76

The Directive’s right of access does not refer to the future, or use identical language to notification duties. The latter point is unremarkable, as the Directive did not contain notification duties. We can thus only discuss whether a right to explanation of system functionality or specific decisions was derived by Member States from the right of access in art 12 of the Directive, as opposed to ex ante or ex post explanations.

77

Recital 41 of the Directive makes a similar claim.

78

The right of access only exists if the data controller has personal data of the data subjects, see: Mireille Hildebrandt, ‘The Dawn of a Critical Transparency Right for the Profiling Era’, Digital Enlightenment Yearbook 2012 (IOS Press 2012). Further, the right of access is limited as far as data of other data subjects are concerned, see Hildebrandt and Gutwirth (n 62).

79

Paal, ‘DS-GVO Art. 13 Informationspflicht bei Erhebung von personenbezogenen Daten bei der betroffenen Person’ in Paal and Pauly (n 32); Martini (n 34) Rn 42–44.

80

Paal (n 79).

81

BGH, 28 January 2014 – VI ZR 156/13 (Umfang einer von der SCHUFA zu erteilenden Auskunft) [2014] MMR Rn 489–494; Bräutigam and Schmidt-Wudy (n 35) 61.

82

Paal (n 79).

83

Ibid.

84

BGH (n 54).

85

Recitals 47 and 63 GDPR address protection of the interests of data controllers. Recital 63 notes, in relation to the right of access, ‘That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software.’

86

Art 23(1) GDPR addresses possible further limitations on obligations and rights under arts 12–22, including the right of access; art 89(2) similarly allows for limitations on rights and obligations for processing for scientific or historical research or statistical purposes. Finally, art 89(3) addresses limitations for processing for archiving purposes or in the public interest.

87

On ‘meaningful information’ and Article 13, see Paal (n 32) Rn 19–22. The general purpose of the right of access under art 15 GDPR is the realisation of the so-called ‘two-step model’. In a first step, data subjects have the right to (i) know whether their data are being processed and (ii) if so, which data are used. In some cases data controllers have to provide additional information (such as the logic involved in the processing). Further, ibid Rn 31, the author suggests that the content and scope of the disclosure under art 13 are the same as under art 15. The author cites Bräutigam and Schmidt-Wudy (n 35), in discussing the scope of art 13 GDPR, as one of the sources limiting the data controller’s obligations under art 15. This suggests that arts 13 and 15 GDPR do not differ in the data controller’s obligation to disclose information. See also Paal (n 79) Rn 31–33. On disclosure of the algorithm, see Paal (n 79) Rn 31–32. On the necessity of disclosure to verify accuracy, see Schmidt-Wudy, ‘DS-GVO Art. 15 Auskunftsrecht der betroffenen Person’ in Wolff and Brink (eds), Beck’scher Online-Kommentar Datenschutzrecht (19th edn, beck-online) Rn 76–80.

88

The Board fills a similar role to the Article 29 Working Party established by the Directive. Interestingly, the Board has explicitly been called upon in art 70(1)f to ‘issue guidelines, recommendations and best practices … for further specifying the criteria and conditions for decisions based on profiling pursuant to Article 22(2).’ In doing so, the GDPR is implicitly acknowledging that the applicability of the three cases specified in art 22(2) (contract, Union or Member State law, or consent) remains an open issue.

89

European Data Protection Supervisor, ‘Ethics Advisory Group’ (2015) <https://secure.edps.europa.eu/EDPSWEB/edps/EDPS/Ethics> accessed 8 March 2017.

90

Tal Zarsky, ‘Transparent Predictions’ (2013) 2013 University of Illinois Law Review <http://papers.ssrn.com/sol3/Papers.cfm?abstract_id=2324240> accessed 17 June 2016. A right to contest realised through expert human intervention may be the most pragmatic safeguard against automated decisions. Elsewhere it has been argued that transparency disclosures prove more impactful if tailored towards trained third parties or regulators as opposed to data subjects themselves.

91

Bygrave (n 45) discusses comparable limitations on the definition of ‘automated individual decisions’ in the 1995 Directive.

92

This position is also adopted in Fusion (n 1) and Bygrave (n 45); Hildebrandt (n 78) 51, in reference to the EC’s 2012 draft, explains that human intervention will render art 20 inapplicable.

93

Martini (n 34) Rn 16–19.

94

Bygrave (n 45).

95

Martini (n 34) Rn 20.

96

Possible grounds for opposing views to Martini can be found in Dimitrios Pachtitis v European Commission F-35/08 [2010] European Civil Service Tribunal ECLI:EU:F:2010:51 [63] <http://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A62008FJ0035> accessed 22 February 2017: ‘Furthermore, although it is true that, as the Commission observes, the correction of the admission tests was carried out by computer and that, therefore, it is based on an automated procedure with no subjective discretion, the fact remains that the conduct of that automated procedure involved a decision on the merits, in so far as the ‘advisory committee’ … first, determined the level of difficulty of the multiple choice questions set during the admission tests and, second, cancelled certain questions, as recounted in paragraph 26 of this judgment. Those are evidently tasks to be carried out by a competition selection board.’

97

Bygrave (n 45).

98

European Parliament Committee on Civil Liberties, Justice and Home Affairs (n 24).

99

European Commission (n 25).

100

Martini (n 34) Rn 25–28. Legal effects must influence the legal status of the data subject, whereas significant effects could mean [our translation of Rn 27] ‘being denied participation in a contract or being denied the choice of a payment method, e.g. PayPal’.

101

Bygrave (n 45).

102

Lewinski (n 51) Rn 28–31; ibid Rn 32–37.

103

Bygrave (n 45).

104

Hajar Malekian, ‘Profiling under General Data Protection Regulation (GDPR): Stricter Regime?’ [2016] ResearchGate <https://www.researchgate.net/publication/304102392_Profiling_under_General_Data_Protection_Regulation_GDPR_Stricter_Regime> accessed 20 November 2016.

105

Recital 63 suggests the right of access should allow a data subject to ‘know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing and, at least when based on profiling, the consequences of such processing’.

106

Martini (n 34) Rn 31–32, according to whom the necessity for the performance of a contract hinges on the agreed goals of the contract between the data controller and the data subject. However, the author does not consider the vagueness of the passage and fails to address the lack of consent, which is not a precondition here since consent is listed separately under lit c.

107

Ibid Rn 33–37.

108

Ibid Rn 1–7, 15 argues that it is a prohibition; however, it is acknowledged that the placement of art 22 in the section on the rights of data subjects causes confusion. The argument is based on the German implementation of the 1995 Directive into national law, which was in fact phrased as a prohibition. However, the author also states that the legal status (right to object or prohibition) of art 15 of the Directive and art 22 GDPR is disputed: see ibid Rn 14a, 29.

109

Bird & Bird, ‘Profiling and Automated Decision-Taking’ <http://www.twobirds.com/∼/media/pdfs/gdpr-pdfs/35–guide-to-the-gdpr–profiling-and-automated-decisiontaking.pdf?la=en> accessed 10 November 2016 explains how different countries either have prohibitions or rights to object: ‘This could either be read as a prohibition on such processing or that the processing may take place but that individuals have a right to object to it. This ambiguity is also present in the Data Protection Directive and Member States differ in their approaches to the point.’

110

Martini (n 34) Rn 42–44 explains how Germany made use of the margin of appreciation under art 15 of the Directive and phrased it as a prohibition.

111

Hildebrandt (n 78) 50 hints towards this but does not make it explicit: ‘it may be that if I don't exercise the right, the automated decision is not a violation of the directive’. Additionally: ‘The draft Regulation, however, stipulates that a person may only be subjected to automated decisions under specified conditions, implying that this right is not merely a right to object.’ She further explains that the same can be true for the original draft art 20 GDPR proposal of 25 January 2012. Bygrave (n 45) 3 sees art 15 of the Directive as sufficiently ambiguous to be interpreted as both a prohibition and a right to object.

112

Korff (n 38) 84; see also 84 ff for further details on how other Member States implemented the Directive.

113

Korff, ‘New Challenges to Data Protection Study-Country Report’ (n 50) 37 ff.

114

European Digital Rights (n 26) 140.

115

Ibid 40.

116

Ibid 140.

117

Both the EC and European Council only sought protections for decisions solely based on automated processing, see ibid 139.

118

Ibid 140.

119

Commons Select Committee, ‘Algorithms in Decision-Making Inquiry Launched’ (UK Parliament, 2017) <http://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/news-parliament-2015/algorithms-in-decision-making-inquiry-launch-16-17/> accessed 8 March 2017.

120

Mittelstadt and others (n 6).

121

Paal (n 32) Rn 19–22.

122

European Parliament Committee on Civil Liberties, Justice and Home Affairs (n 24).

123

As already proposed by the Article 29 Working Party, see ‘Advice Paper on Essential Elements of a Definition and a Provision on Profiling within the EU General Data Protection Regulation’ (2013) 29 <http://ec.europa.eu/justice/data-protection/article-29/documentation/other-document/files/2013/20130513_advice-paper-on-profiling_en.pdf> accessed 10 March 2017.

124

Comparable approaches to regulating Big Data and algorithms have been suggested by: Viktor Mayer-Schönberger and Kenneth Cukier, Big Data: A Revolution That Will Transform How We Live, Work and Think (John Murray 2013); Andrew Tutt, ‘An FDA for Algorithms’ (Social Science Research Network 2016) SSRN Scholarly Paper ID 2747994 <http://papers.ssrn.com/abstract=2747994> accessed 13 April 2016.

125

For a detailed discussion on regulatory and interpretability issues related to algorithms, see: Danielle Keats Citron and Frank A Pasquale, ‘The Scored Society: Due Process for Automated Predictions’ (Social Science Research Network 2014) SSRN Scholarly Paper ID 2376209 <https://papers.ssrn.com/abstract=2376209> accessed 4 March 2017; Alfredo Vellido, José David Martín-Guerrero and Paulo JG Lisboa, ‘Making Machine Learning Models Interpretable’, ESANN (Citeseer 2012).

126

For additional discussion of transparency and the GDPR, see: Dimitra Kamarinou, Christopher Millard and Jatinder Singh, ‘Machine Learning with Personal Data’ <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2865811> accessed 8 March 2017.

127

Burrell (n 12).

128

Sandvig and others (n 7); Mittelstadt and others (n 6); Brent Mittelstadt, ‘Auditing for Transparency in Content Personalization Systems’ (2016) 10 International Journal of Communication 12. See also the discussion of ex post tests of outcomes, and ex ante acceptability of errors in Zarsky (n 90).

129

Disclosure: Luciano Floridi is a member of the European Data Protection Supervisors’ Ethics Advisory Group.

130

Art 83(5)b invests supervisory authorities with the power to impose fines up to 4 percent of the total worldwide annual turnover in cases where rights of the data subjects (arts 12–22) have been infringed. This lever can be used to enforce compliance and to enhance data protection.

Author notes

*

Sandra Wachter, Oxford Internet Institute, University of Oxford, Oxford, UK; The Alan Turing Institute, British Library, London, UK.

**

Brent Mittelstadt, The Alan Turing Institute, British Library, London, UK; Department of Science and Technology Studies, University College London, London, UK; Oxford Internet Institute, University of Oxford, Oxford, UK.

***

Luciano Floridi, Oxford Internet Institute, University of Oxford, Oxford, UK; The Alan Turing Institute, British Library, London, UK.

We are deeply indebted to Prof Peggy Valcke, Prof Massimo Durante, Prof Ugo Pagallo, Dr Natascha Scherzer and Mag Priska Lueger for their invaluable comments and insightful feedback, from which the paper greatly benefitted. We want to especially thank Dr Alessandro Spina, whose intensive review and in-depth comments strengthened the arguments in the paper. Further, we are greatly thankful to Dr Joris van Hoboken for the inspiring conversation as well as written feedback on the draft that significantly improved the quality of the paper. We also want to thank Prof Tal Zarsky and Prof Lee Bygrave not only for their pioneering and groundbreaking work that inspired this paper, but also for their positive feedback, in-depth review, and invaluable comments. Last but not least, we want to thank the anonymous reviewer for the time spent reading and commenting so thoroughly on the paper. This study was funded by the Alan Turing Institute (Luciano Floridi, Brent Mittelstadt, and Sandra Wachter), the PETRAS IoT Hub, an EPSRC project (Luciano Floridi, Brent Mittelstadt, and Sandra Wachter), and a research grant from the University of Oxford’s John Fell Fund (Brent Mittelstadt). The authors declare that they do not have any conflicts of interest.