
Enhancing Compliance under the General Data Protection Regulation: The Risky Upshot of the Accountability- and Risk-based Approach

Published online by Cambridge University Press:  19 November 2018

Abstract

The risk-based approach has been introduced into the General Data Protection Regulation (GDPR) to make the rules and principles of data protection law “work better”. Organisations are required to calibrate the legal norms in the GDPR with an eye to the risks posed to the rights and freedoms of individuals. This article analyses how this new approach relates to “tick-box” compliance. How can the law enhance itself? If handled properly by controllers and supervisory authorities, the risk-based approach can bring about a valuable shift in data protection towards substantive protection of fundamental rights and freedoms. Yet while the risk-based approach has considerable potential, it also carries a risk of its own: it relies on controllers to improve compliance, formulating what it means to attain compliance 2.0.

Type
Articles
Copyright
© Cambridge University Press 

Footnotes

*

Claudia Quelle is a PhD researcher at the Tilburg Institute for Law, Technology, and Society (TILT), Tilburg University. Please send comments to c.quelle@tilburguniversity.edu. This is a republication of C Quelle, “The risk revolution in EU data protection law: We can’t have our cake and eat it, too” in R Leenes et al (eds), Data Protection and Privacy: The Age of Intelligent Machines (Hart Publishing 2017).

References

1 See in a similar vein B Koops, “The trouble with European data protection law” (2014) 4(4) International Data Privacy Law 250.

2 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L 281/31 (Data Protection Directive).

3 See eg Data Protection Directive, Arts 18–19.

4 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119/1 (GDPR).

5 Article 29 Data Protection Working Party, “Statement of the Working Party on current discussions regarding the data protection reform package” (2013) at 2–3.

6 See especially Article 29 Data Protection Working Party and Working Party on Police and Justice, “The Future of Privacy. Joint Contribution to the Consultation of the European Commission on the legal framework for the fundamental right to protection of personal data” WP 168 (2009) at 20 (where accountability was still about taking necessary measures to ensure that the rules are observed); Article 29 Data Protection Working Party, “Opinion 3/2010 on the principle of accountability” WP 173 (2010) at 13 (where accountability is about taking appropriate and effective measures); Article 29 Data Protection Working Party, “Statement on the role of a risk-based approach in data protection legal frameworks” WP 218 (2014) at 3 (on the risk-based approach in relation to accountability obligations).

7 Supra, note 6 (2010), at 3.

8 Supra, note 6 (2010), at 5.

9 C Kuner, “The European Commission’s Proposed Data Protection Regulation: A Copernican Revolution in European Data Protection Law” [2012] Bloomberg BNA Privacy & Security Law Report 1, at 1.

10 GDPR, Art 24, Art 35 and Art 37; Commission (EC), “A comprehensive approach on personal data protection in the European Union” COM (2010) 609 final section 2.2.4.

11 See GDPR, recitals 84 and 100, speaking of enhancement with respect to data protection impact assessments, certification and data protection seals and marks.

12 Recital 74 clarifies that it is in relation to the “appropriate and effective measures” that the risks to the rights and freedoms of natural persons should be taken into account. See also R Gellert, “Data protection: a risk regulation? Between the risk management of everything and the precautionary alternative” (2015) 5(1) International Data Privacy Law 3 at 16.

13 M Macenaite, “The ‘Riskification’ of European Data Protection Law through a two-fold Shift” (2017) 8(3) EJRR 506 at 524–525.

14 Centre for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice” (19 June 2014) at 4.

15 See in this vein M von Grafenstein, The Principle of Purpose Limitation in Data Protection Laws. The Risk-based Approach, Principles, and Private Standards as Elements for Regulating Innovation (Nomos 2018) 599–601.

16 For the link between risk management and privacy by design, see eg Information and Privacy Commissioner of Ontario, Canada, “Privacy Risk Management. Building privacy protection into a Risk Management Framework to ensure that privacy risks are managed, by default” (April 2010).

17 Supra, note 6 (2014), at 3. See also Article 29 Data Protection Working Party, “Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is ‘likely to result in a high risk’ for the purposes of Regulation 2016/679” WP 248 rev.01 (2017) at 22. These Guidelines were endorsed by the European Data Protection Board, which replaced the WP29 on 25 May 2018.

18 See eg GDPR, Arts 24–25, Art 35(1) and Art 35(7)(c), recitals 74–76, 78, 84, 89, 91 and 94.

19 GDPR, recital 75.

20 Supra, note 6 (2014), at 3; supra, note 17 (2017), at 6.

21 GDPR, recital 4.

22 See eg GDPR, recital 75, mentioning the harms of identity theft, fraud, financial loss, damage to the reputation, loss of confidentiality of personal data protected by professional secrecy, and the unauthorised reversal of pseudonymisation, and recital 78, mentioning measures such as pseudonymisation and security features.

23 Supra, note 17 (2017), at 22. The WP29 also focuses on the individual harm that may be caused by security breaches, see ibid, at 18–19.

24 GDPR, recitals 75 and 76.

25 See eg S Barocas and A Selbst, “Big Data’s Disparate Impact” (2016) 104 California Law Review 671 at 729–730.

26 GDPR, Art 1(2). The WP29 emphasised, in this vein, that the aim of fundamental rights protection “may caution against any interpretation of the same rules that would leave individuals deprived of protection of their rights”, see Article 29 Data Protection Working Party, “Opinion 4/2007 on the concept of personal data” WP 136 (2007) at 4.

27 GDPR, recital 78, regarding data protection by design.

28 See especially supra, note 6 (2010), at 13; European Data Protection Supervisor, “Opinion of the European Data Protection Supervisor on the Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the Regions – ‘A comprehensive approach on personal data protection in the European Union’” (2011) at 22; supra, note 5, at 3; European Data Protection Supervisor, “EDPS Opinion on coherent enforcement of fundamental rights in the age of big data” (Opinion 8/2016) at 7.

29 GDPR, Art 33(1), Art 34(1), Art 35(1) and Art 36(1).

30 See also Centre for Information Policy Leadership, “Risk, High Risk, Risk Assessments and Data Protection Impact Assessments under the GDPR” (21 December 2016) at 3.

31 Supra, note 26 (2007), at 4. See also N Purtova, “The law of everything. Broad concept of personal data and future of EU data protection law” (2018) 10(1) Law, Innovation and Technology 40, making the case that a non-scalable approach to data protection is not feasible and that a “proportionate application” of particular provisions is the way of the CJEU.

32 Treaty on European Union, Art 5(4).

33 Case C-13/16 Rīgas satiksme [2017] ECLI:EU:C:2017:43, Opinion of AG Bobek, para 92. See in this vein E Moerel and J Prins, “Privacy voor de homo digitalis” (2016) 146(1) Handelingen Nederlandse Juristen-Vereniging 9.

34 Commission (EC), “Impact Assessment Accompanying the GDPR” SEC (2012) 72 final, at 70, discussing the results of a stakeholder consultation with data protection authorities.

35 Data Protection Directive, Art 13(2) and Art 18(2). Macenaite describes the evolution of the role of “risk” in data protection law in more detail, see supra, note 13, at 522.

36 European Data Protection Supervisor, “EU Data Protection Law: The Review of Directive 95/46 EC and the Proposed General Data Protection Regulation – Peter Hustinx” (15 September 2014) at 38; O Lynskey, The Foundations of EU Data Protection Law (Oxford University Press 2015) at 85. The risk-based approach discussed in this article does not concern the question whether data protection law applies, eg whether or not the data at hand constitutes “personal data”. On this, see eg H Pearce, “Big data and the reform of the European data protection framework: an overview of potential concerns associated with proposals for risk management-based approaches to the concept of personal data” (2017) 26(3) Information & Communications Technology Law 312.

37 Supra, note 12, at 13.

38 Supra, note 14; Centre for Information Policy Leadership, “The Role of Risk Management in Data Protection” (23 November 2014).

40 C Hood, H Rothstein and R Baldwin, The Government of Risk: Understanding Risk Regulation Regimes (Oxford University Press 2001) at 3.

41 See eg supra, note 13, at 508–509.

42 See eg J Black, “The Emergence of Risk-Based Regulation and the New Public Risk Management in the United Kingdom” (2005) 3 Public Law 510 at 514.

43 Supra, note 12, at 3.

44 Supra, note 15, at 484. See in this vein C Quelle, “Not Just User Control in the General Data Protection Regulation. On the Problems with Choice and Paternalism, and on the Point of Data Protection” in A Lehmann et al (eds), Privacy and Identity Management – Facing up to Next Steps (Springer 2017). See also L Jasmontaite and V Verdoodt, “Accountability in the EU Data Protection Reform: Balancing Citizens’ and Business’ Rights” in D Aspinall et al (eds), Privacy and Identity Management. Time for a Revolution? (Springer 2016) at 163.

45 Data Protection Directive, Art 8; GDPR, Art 9; CJEU, Opinion 1/15 EU-Canada PNR [2017] ECLI:EU:C:2017:592, para. 165, discussing the risk that sensitive data is processed contrary to Art 21 of the Charter. See also supra, note 6 (2014), at 2; supra, note 17 (2017), at 6.

46 Data Protection Directive, Art 3(1); GDPR, Art 2(1) and recital 15; Case C-25/17 Tietosuojavaltuutettu [2018] ECLI:EU:C:2018:551, paras 57–62. See also supra, note 26 (2007), at 5.

47 K Irion and G Luchetta, “Online Personal Data Processing and EU Data Protection Reform: Report of the CEPS Digital Forum” (Centre for European Policy Studies Brussels 2013) at 23.

48 Grafenstein proposes that personal data cannot be used for a secondary purpose if this would pose additional risks; given the risk-oriented aim of data protection, he relies on the notion of risk to curtail the repurposing of personal data in this manner, see supra, note 15, at 491.

49 R Baldwin, M Cave and M Lodge, Understanding Regulation: Theory, Strategy, and Practice (Oxford University Press 2012) 281–283.

50 J Black and R Baldwin, “Really Responsive Risk-Based Regulation” (2010) 32(2) Law & Policy 181 at 188–189.

51 Supra, note 36 (2015), at 84; supra, note 12, at 13; C Kuner et al, “Risk management in data protection” (2015) 5(2) International Data Privacy Law 95 at 96.

52 Supra, note 6 (2014), at 4.

53 GDPR, Art 39(2); Article 29 Data Protection Working Party, “Guidelines on Data Protection Officers (‘DPOs’)” WP 243 rev.01 (2017) at 18.

54 See eg B Hutter, “The Attractions of Risk-based Regulation: accounting for the emergence of risk ideas in regulation” (2005) 33 ESRC Centre for Analysis of Risk and Regulation Discussion Paper 4–6.

55 See on a decentred understanding of regulation J Black, “Decentring Regulation: Understanding the Role of Regulation and Self-Regulation in a ‘Post-Regulatory’ World” (2001) 54(1) Current Legal Problems 103.

56 Hood, Rothstein and Baldwin do note that risk regulation regimes can be conceived of at different levels, see supra, note 40, at 10.

57 C Quelle, “The data protection impact assessment: what can it contribute to data protection?” (LLM thesis, Tilburg University 2015), available at <arno.uvt.nl/show.cgi?fid=139503> at 112 and 127.

58 GDPR, recital 89.

59 GDPR, Art 36, recitals 89–90 and recital 94.

60 I discuss whether Parker’s model of meta-regulation applies to data protection impact assessments in supra, note 57 (2015), see eg at 103–105 and at 110. See also R Binns, “Data protection impact assessments: a meta-regulatory approach” (2017) 7(1) International Data Privacy Law 22.

61 M Gonçalves, “The EU data protection reform and the challenges of big data: remaining uncertainties and ways forward” (2017) 26(2) Information & Communications Technology Law 90 at 99 and 114.

62 R Gellert, “Rights-Based and the Risk-Based Approaches to Data Protection” (2016) 4 European Data Protection Law Review 482 at 490 and 482.

63 Supra, note 12, at 6–7.

64 Supra, note 62.

65 R Gellert, “Understanding the notion of risk in the General Data Protection Regulation” (2018) 34 Computer Law & Security Review 279 at 281–282.

66 Clarke, amongst others, emphasises that impact assessments involve “a much broader study than merely compliance with a specific law”, see R Clarke, “Privacy impact assessment: Its origins and development” (2009) 25 Computer Law & Security Review 123 at 125.

67 See supra, note 65, at 284.

68 GDPR, Art 35(1) and Art 35(7)(b)–(c).

69 See on design, output and outcome obligations, supra, note 49, at 297–298.

70 DIGITALEUROPE, “DIGITALEUROPE comments on the risk-based approach” at 3–4.

71 N Robinson et al, “Review of the European Data Protection Directive” (Rand Corporation 2009) 48–49 and 51.

72 Supra, note 6 (2010), at 17.

73 GDPR, Art 82 and Art 83(2)(a).

74 Supra, note 6 (2014), at 3.

75 GDPR, Art 5(1)(a); L Bygrave, Data Protection Law: Approaching Its Rationale, Logic and Limits (Kluwer 2002) at 58.

76 Recital 71 concerns automated decision-making and holds that the controller should, in order to ensure fair and transparent processing, “secure personal data in a manner that takes account of the potential risks involved … and that prevents, inter alia, discriminatory effects”.

77 GDPR, Art 5(1)(b), Art 6(4)(d) and recital 50.

78 Article 29 Data Protection Working Party, “Opinion 03/2013 on purpose limitation” WP 203 (2013) at 25–26.

79 GDPR, Art 6(4)(d) and recital 50.

80 GDPR, Art 6(1)(f).

81 Article 29 Data Protection Working Party, “Opinion 06/2014 on the Notion of Legitimate Interests of the Data Controller under Article 7 of Directive 95/46/EC” WP 217 (2014) at 37.

82 Charter of Fundamental Rights of the European Union [2000] OJ C 364/1; Joined Cases C-468/10 and C-469/10 ASNEF and FECEMD [2011] ECLI:EU:C:2011:777, para 40.

83 The control of the data subject is a relevant consideration under the risk-based approach, see GDPR, recital 75.

84 Some obligations in the GDPR are formulated as obligations de résultat, specifying an outcome that the controller is obligated to attain no matter the circumstances. Other provisions impose an obligation to make reasonable efforts (an obligation de moyens), see B Van Alsenoy, “Liability under EU Data Protection Law: From Directive 95/46 to the General Data Protection Regulation” (2016) 7 JIPITEC 271 at 273.

85 The WP29 does not acknowledge the tension between the scalability of compliance measures and the right of access, see supra, note 6 (2014), at 2–3.

86 GDPR, Art 14(5) and Art 17(2).

87 Supra, note 6 (2014), at 2.

88 Supra, note 6 (2010), at 5.

89 Supra, note 30, at 20. See also supra, note 14, at 4.

90 Supra, note 60 (2017), at 33.

91 See in this vein supra, note 12, at 16.

92 Supra, note 61, at 101.

93 Supra, note 6 (2014), at 2; supra, note 5, at 3.

94 I can offer the example, from personal experience, of a data subject who uses her right of access to obtain travel data from the train operator only so that she can be reimbursed for her commute by her employer.

95 Case C-131/12 Google Spain [2014] ECLI:EU:C:2014:317, paras 80–81.

96 Once an infringement has been established, the amount of the fine is influenced by a number of factors, including the damage that has materialised, the degree of responsibility of the controller and the measures it has taken, see GDPR, Art 83(1), Art 83(2)(a) and Art 83(2)(d).

97 Supra, note 17 (2017), at 6 and at 22.

98 The WP29 DPIA guidelines do not explicitly require an analysis of the proportionality stricto sensu of the processing operation as a whole, see supra, note 17 (2017), at 22. The Belgian supervisory authority does ask controllers to seek an appropriate balance after the necessity test has been carried out, see Commissie voor de Bescherming van de Persoonlijke Levenssfeer, “Aanbeveling uit eigen beweging met betrekking tot de gegevensbeschermingseffectbeoordeling en voorafgaande raadpleging” CO-AR-2018-001 at 18.

99 Supra, note 30, at 10.

100 See in this vein D Kloza et al, “Data protection impact assessments in the European Union: complementing the new legal framework towards a more robust protection of individuals” (2017) d.pia.lab Policy Brief No 1, at 2.

101 GDPR, Art 35(7).

102 GDPR, recital 90. See also supra, note 17 (2017), at 22.

103 GDPR, recital 84. See also supra, note 17 (2017), at 22.

104 See also GDPR, Art 36 and Arts 57–58 on the role of the supervisory authority.

105 See also supra, note 17 (2017), at 9.

106 Appropriate measures to minimise the impact on the rights and fundamental freedoms of data subjects are explicitly required by the Council of Europe Consultative Committee of the Convention for the Protection of Individuals With Regard to Automatic Processing of Personal Data, “Guidelines on the protection of individuals with regard to the processing of personal data in a world of Big Data” T-PD (2017) 01, sections 2.4–2.5.

107 Supra, note 6 (2010), at 17. See also supra, note 100, on the accountability of decision-makers within the organisation recommended by the d.pia.lab.

108 GDPR, Art 35(11).

109 GDPR, Art 36(2).

110 Supra, note 6 (2014), at 4.

111 J Black, “The Rise, Fall and Fate of Principles Based Regulation” (2010) LSE Law, Society and Economy Working Papers 17/2010 at 23.

112 Supra, note 57 (2015), at 114; C Parker, The Open Corporation: Effective Self-regulation and Democracy (Cambridge University Press 2002) 245.

113 C Parker, “Meta-regulation – legal accountability for corporate social responsibility” in D McBarnet, A Voiculescu and T Campbell (eds), The New Corporate Accountability: Corporate Social Responsibility and the Law (Cambridge University Press 2007) at 207 and 237.

114 ibid, at 231.

115 Supra, note 51, at 96–97.

116 See eg supra, note 47, at 50; supra, note 75, at 62.

117 H Burkert, “Data-protection legislation and the modernization of public administration” (1996) 62 International Review of Administrative Sciences 557 at 559.

118 Supra, note 100, at 2.

119 C Quelle, “Privacy, Proceduralism and Self-Regulation in Data Protection Law” (2017) 1 Teoria Critica della Regolazione Sociale 89 at 93 and 96–103.

120 A DPIA is required for any high-risk processing operation, but Art 35 makes specific reference to profiling, special categories of data and systematic monitoring, see GDPR, Art 35(3).

121 N van Dijk, R Gellert and K Rommetveit, “A risk to a right? Beyond data protection risk assessments” (2015) 32(2) Computer Law & Security Review 286 at 294 and 299.

122 See on the right to receive information S Eskens, N Helberger and J Moeller, “Challenged by News Personalization: Five Perspectives on the Right to Receive Information” (2017) 9(2) Journal of Media Law 259.

123 A Spina, “A Regulatory Mariage de Figaro: Risk Regulation, Data Protection, and Data Ethics” (2017) 8 EJRR 88 at 89.

124 GDPR, Art 22(2)(c).

125 J Black, “Forms and paradoxes of principles-based regulation” (2008) 3(4) Capital Markets Law Journal 425 at 453.

126 J Black, “Managing Discretion” (ARLC Conference Papers. Penalties: Policy, Principles and Practice in Government Regulation, June 2001) at 24.

127 Supra, note 49, at 303.

128 Supra, note 111, at 5–6 and at 8, 11; J Braithwaite, “Rules and Principles: A Theory of Legal Certainty” (2002) 27 Australian Journal of Legal Philosophy 47 at 71–72.

129 GDPR, Art 58(3)(b) and Art 58(3)(d).

130 Supra, note 60 (2017), at 35.

131 J Black, M Hopper and C Band, “Making a success of Principles-based regulation” (2007) 1(3) Law and Financial Markets Review 191 at 204.

132 cf supra, note 47, at 50. For a critical note on the capacity of the DPIA to bring data protection “to the hearts and minds of controllers”, see supra, note 1, at 254–255. For an optimistic view on “compliance programs” and the accountability principle, see P Balboni et al, “Legitimate interest of the data controller. New data protection paradigm: legitimacy grounded on appropriate protection” (2013) 3(4) International Data Privacy Law 244 at 261.

133 Supra, note 17 (2017), at 14.

134 See in this vein supra, note 12, at 14 and 17. To address this issue, the DPIA guidelines require controllers to take “the perspective of the data subjects”, see supra, note 17 (2017), at 17.

135 Supra, note 100, at 2. It may also be that compliance departments lack the time or the skill to do justice to data protection law, see in this vein supra, note 125, at 453.

136 Supra, note 61, at 104.

137 ibid, at 101.

138 Van Dijk, Gellert and Rommetveit suggest that controllers are “a court for upstream adjudication”, see supra, note 121, at 300.

139 Supra, note 61, at 99.

140 See in this vein supra, note 44 (2016), at 161.