
The Risks of Trustworthy Artificial Intelligence: The Case of the European Travel Information and Authorisation System

Published online by Cambridge University Press:  13 May 2022

Charly Derave*
Affiliation:
Perelman Centre for Legal Philosophy, Faculty of Law and Criminology, Université Libre de Bruxelles (ULB), Brussels, Belgium
Nathan Genicot
Affiliation:
Perelman Centre for Legal Philosophy, Faculty of Law and Criminology, Université Libre de Bruxelles (ULB), Brussels, Belgium
Nina Hetmanska
Affiliation:
Perelman Centre for Legal Philosophy, Faculty of Law and Criminology, Université Libre de Bruxelles (ULB), Brussels, Belgium
*Corresponding author. Email: charly.derave@ulb.be

Abstract

In recent years, the European Union (EU) has strongly promoted a human-centric and trustworthy approach to artificial intelligence (AI). The 2021 proposal for a Regulation on AI that the EU seeks to establish as a global standard is the latest step in the matter. However, little attention has been paid to the EU’s use of AI to pursue its own purposes, despite its wide use of digital technologies, notably in the field of border management. Yet, such attention allows us to confront the highly moral discourse that characterises EU institutions’ communications and legislative acts with a concrete example of how the promoted values are realised “on the ground”. From this perspective, this paper takes the case study of the European Travel Information and Authorisation System (ETIAS), an EU information technology system (planned to become operational in May 2023) that will provide travel authorisation to visa-exempt third-country nationals using a profiling algorithm. The paper shows, on the one hand, that ETIAS constitutes another piece in the massive infrastructure of digital surveillance of third-country nationals that the EU has been building for years. On the other hand, ETIAS’s algorithmic process is shown to be an instrument of differential exclusion that could well have an adverse impact on certain groups of foreign travellers. Ultimately, this paper argues that far from falling outside the scope of the trustworthy approach to AI championed by the EU, ETIAS – and more broadly the systematic risk evaluation predominant in the EU’s use of AI – is a constitutive part of it.

Type
Symposium on Algorithmic Regulation and Artificial Intelligence Risks
Copyright
© The Author(s), 2022. Published by Cambridge University Press


References

1 Commission, “Building Trust in Human-Centric Artificial Intelligence” (Communication) COM (2019) 168 final 1.

2 Laid down in a prior Communication and in the coordinated plan on AI: Commission, “Artificial Intelligence for Europe” (Communication) COM (2018) 237 final; Commission, “Coordinated Plan on Artificial Intelligence” (Communication) COM (2018) 795 final.

3 High-Level Expert Group on Artificial Intelligence, Ethics Guidelines for Trustworthy AI, 9 April 2019.

4 Commission, “White Paper on Artificial Intelligence – A European Approach to Excellence and Trust” COM (2020) 65 final.

5 ibid, 10.

6 Commission, “Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts” COM (2021) 206 final (proposed Regulation on AI).

7 Accordingly, high-risk AI systems are subject to specific requirements.

8 Commission, “Fostering a European Approach to Artificial Intelligence” (Communication) COM (2021) 205 final 4.

9 Area of EU competence (Article 3(2) of the Treaty on European Union and Articles 67 and 77 of the Treaty on the Functioning of the European Union).

10 D Bigo, “The Socio-Genesis of a Guild of ‘Digital Technologies’ Justifying Transnational Interoperable Databases in the Name of Security and Border Purposes: A Reframing of the Field of Security Professionals” (2020) 6 International Journal of Migration and Border Studies 74.

11 Commission, “Smart Borders – Options and the Way Ahead” (Communication) COM (2011) 680 final 3.

12 European Parliament and Council Regulation (EU) 2018/1240 of 12 September 2018 establishing a European Travel Information and Authorisation System (ETIAS) and amending Regulations (EU) 1077/2011, (EU) 515/2014, (EU) 2016/399, (EU) 2016/1624 and (EU) 2017/2226 [2018] OJ L236/1 (ETIAS Regulation).

13 European Parliament and Council Regulation (EU) 2016/399 of 9 March 2016 on a Union Code on the rules governing the movement of persons across borders [2016] OJ L77/1 (Schengen Border Code), Art 6(1)(b).

14 According to the latest information publicly available: <https://ec.europa.eu/home-affairs/policies/schengen-borders-and-visa/smart-borders/european-travel-information-authorisation-system_fr> (last accessed 1 April 2022).

15 The list of visa-exempt countries concerned is set out in Annex II of European Parliament and Council Regulation (EU) 2018/1806 of 14 November 2018 listing the third countries whose nationals must be in possession of visas when crossing the external borders and those whose nationals are exempt from that requirement [2018] OJ L303/39. Family members of EU citizens who are exempt from the visa requirement and do not hold a residence card or permit fall, under certain conditions, within the scope of the ETIAS Regulation (see Art 24 of ETIAS Regulation for more details).

16 However, in July 2021, EU co-legislators adopted a Regulation that sets up screening rules for visa applications, which are also based on a profiling algorithm (Regulation (EU) 2021/1134 of the European Parliament and of the Council of 7 July 2021 amending Regulations (EC) No 767/2008, (EC) No 810/2009, (EU) 2016/399, (EU) 2017/2226, (EU) 2018/1240, (EU) 2018/1860, (EU) 2018/1861, (EU) 2019/817 and (EU) 2019/1896 of the European Parliament and of the Council and repealing Council Decisions 2004/512/EC and 2008/633/JHA, for the purpose of reforming the Visa Information System [2021] OJ L248/11, Art 1(11)). It mirrors ETIAS’s profiling algorithm. Although our contribution focuses on ETIAS, most of the analyses we develop here could be, pending further examination, applicable to visa applications’ screening rules. For more details, see N Vavoula, “Artificial Intelligence (AI) at Schengen Borders: Automated Processing, Algorithmic Profiling and Facial Recognition in the Era of Techno-Solutionism” (2021) European Journal of Migration and Law SSRN: <https://ssrn.com/abstract=3950389>.

17 Commission, “Delegated Decision of 23 November 2021 on further defining security, illegal immigration or high epidemic risks” C(2021) 4981 final (Delegated Decision). The Decision has not been published yet and is available at <https://ec.europa.eu/transparency/documents-register/detail?ref=C(2021)4981&lang=en> (last accessed 15 March 2022).

18 For a recently published book on this topic, see R Gelin, Dernières nouvelles de l’intelligence artificielle (Paris, Flammarion 2022).

19 See, among others, M Ebers, VRS Hoch, F Rosenkranz, H Ruschemeier and B Steinrötter, “The European Commission’s Proposal for an Artificial Intelligence Act – A Critical Assessment by Members of the Robotics and AI Law Society (RAILS)” (2021) 4 J 589; N Smuha, E Ahmed-Rengers, A Harkens, W Li, J MacLaren, R Piselli and K Yeung, “How the EU Can Achieve Legally Trustworthy AI: A Response to the European Commission’s Proposal for an Artificial Intelligence Act” (2021) SSRN: <https://ssrn.com/abstract=3899991>.

20 Art 83(1) of the Commission’s proposal for a Regulation on AI states: “This Regulation shall not apply to the AI systems which are components of the large-scale IT systems established by the legal acts listed in Annex IX that have been placed on the market or put into service before [12 months after the date of application of this Regulation referred to in Art 85(2)], unless the replacement or amendment of those legal acts leads to a significant change in the design or intended purpose of the AI system or AI systems concerned”. ETIAS Regulation is listed in Annex IX.

21 S Barocas and AD Selbst, “Big Data’s Disparate Impact” (2016) 104 California Law Review 671.

22 Especially after the entry into force of the 1985 Schengen Agreement (OJ [2000] L239/13) and the 1990 Convention implementing the Schengen Agreement (OJ [2000] L239/19).

23 N Vavoula, “The ‘Puzzle’ of EU Large-Scale Information Systems for Third-Country Nationals: Surveillance of Movement and Its Challenges for Privacy and Personal Data Protection” (2020) 45 European Law Review 348, 6.

24 See Report of the Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance, ET Achiume, “Racial Discrimination and Emerging Digital Technologies: A Human Rights Analysis”, A/HRC/44/57, Human Rights Council, Forty-fourth session, 15 June–3 July 2020.

25 D Broeders, “The New Digital Borders of Europe: EU Databases and the Surveillance of Irregular Migrants” (2007) 22 International Sociology 71, 89; N Gäckle, “Taming Future Mobilities: Biopolitics and Data Behaviourism in the European Travel Information and Authorisation System (ETIAS)” (2020) 15 Mobilities 257, 262.

26 European Parliament and Council Regulation (EC) 1987/2006 of 20 December 2006 on the establishment, operation and use of the second-generation Schengen Information System (SIS II) [2006] OJ L381/4 (SIS II Regulation).

27 European Parliament and Council Regulation (EU) 604/2013 of 26 June 2013 establishing the criteria and mechanisms for determining the Member State responsible for examining an application for international protection lodged in one of the Member States by a third-country national or a stateless person [2013] OJ L180/31 (Dublin III Regulation), which “lays down the criteria and mechanisms for determining the Member State responsible for examining an application for international protection lodged in one of the Member States by a third-country national or a stateless person” (Art 1).

28 European Parliament and Council Regulation (EU) 603/2013 of 26 June 2013 on the establishment of Eurodac for the comparison of fingerprints for the effective application of Regulation (EU) 604/2013 establishing the criteria and mechanisms for determining the Member State responsible for examining an application for international protection lodged in one of the Member States by a third-country national or a stateless person and on requests for the comparison with Eurodac data by Member States’ law enforcement authorities and Europol for law enforcement purposes, and amending Regulation (EU) 1077/2011 establishing a European Agency for the operational management of large-scale IT systems in the area of freedom, security and justice (recast) [2013] OJ L180/1 (Eurodac Regulation). Eurodac has been operational since 2003. See N Vavoula, “Transforming Eurodac from 2016 to the New Pact: From the Dublin System’s Sidekick to a Database in Support of EU Policies on Asylum, Resettlement and Irregular Migration” (2020) ECRE Working Paper <http://www.ecre.org/wp-content/uploads/2021/01/ECRE-Working-Paper-Transforming-Eurodac-from-2016-to-the-New-Pact-January-2021.pdf> (last accessed 14 July 2021).

29 European Parliament and Council Regulation (EC) 767/2008 of 9 July 2008 concerning the Visa Information System (VIS) and the exchange of data between Member States on short-stay visas [2008] OJ L218/60 (VIS Regulation), last amended by Regulation (EU) 2021/1134 of the European Parliament and of the Council of 7 July 2021, which extends the scope of the VIS to long-stay visas and residence permits (Art 1(2)). However, pursuant to Art 12 of the newly adopted Regulation, the Commission shall take a decision setting its date of application. That decision has not yet been adopted.

30 European Parliament and Council Regulation (EU) 2017/2226 of 30 November 2017 establishing an Entry/Exit System (EES) [2017] OJ L327/20 (EES Regulation). This IT system is expected to become operational by the end of September 2022 (according to the latest information publicly available: <https://ec.europa.eu/home-affairs/policies/schengen-borders-and-visa/smart-borders/entry-exit-system_fr> (last accessed 1 April 2022)).

31 European Parliament and Council Regulation (EU) 2019/816 of 17 April 2019 establishing a centralised system for the identification of Member States holding conviction information on third-country nationals and stateless persons (ECRIS-TCN) [2019] OJ L135/1 (ECRIS-TCN Regulation). This IT system is expected to become operational in 2022 (according to the latest information publicly available: <https://ec.europa.eu/info/law/cross-border-cases/judicial-cooperation/tools-judicial-cooperation/european-criminal-records-information-system-ecris_en> (last accessed 1 April 2022)).

32 S Preuss-Laussinotte, “L’élargissement problématique de l’accès aux bases de données européennes en matière de sécurité” (2009) 74 Cultures & Conflicts 81.

33 Commission, “Proposal for a Regulation of the European Parliament and of the Council establishing a framework for the interoperability of EU information systems (borders and visas) and amending Council Decision 2004/512/EC, Regulation (EC) 767/2008, Council Decision 2008/633/JHA, Regulation (EU) 2016/399 and Regulation (EU) 2017/2226” COM (2017) 793 final.

34 ISO/TC 184/SC 5.

35 European Parliament and Council Regulation (EU) 2019/817 of 20 May 2019 on establishing a framework for interoperability between EU information systems in the field of borders and visa [2019] OJ L135/27 and European Parliament and Council Regulation (EU) 2019/818 of 20 May 2019 on establishing a framework for interoperability between EU information systems in the field of police and judicial cooperation, asylum and migration and amending Regulations [2019] OJ L135/85 (Interoperability Regulations).

36 For a detailed explanation of the functioning of interoperability, including illustrative examples, see the Commission’s impact assessment: COM(2017) 793 final; SWD(2017) 474 final.

37 It has the purpose “of facilitating the fast, seamless, efficient, systematic and controlled access of Member State authorities and Union agencies to the EU information systems, to Europol data and to the Interpol databases for the performance of their tasks and in accordance with their access rights and the objectives and purposes of the EES, VIS, ETIAS, Eurodac, SIS and ECRIS-TCN” (Interoperability Regulations, Art 6(1)).

38 “Storing biometric templates obtained from the biometric data … that are stored in the CIR and SIS and enabling querying with biometric data across several EU information systems”, this system “is established for the purposes of supporting the CIR and the MID and the objectives of the EES, VIS, Eurodac, SIS and ECRIS-TCN” (Interoperability Regulations, Art 12).

39 “Creating and storing identity confirmation files … containing links between data in the EU information systems included in the CIR and SIS and allowing detection of multiple identities, with the dual purpose of facilitating identity checks and combating identity fraud”, this system “is established for the purpose of supporting the functioning of the CIR and the objectives of the EES, VIS, ETIAS, Eurodac, SIS and ECRIS-TCN” (Interoperability Regulations, Art 25).

40 Interoperability Regulations, Art 17.

41 Interoperability Regulations, Art 39. See also ETIAS Regulation, Art 84.

42 See M Leese, “Fixing State Vision: Interoperability, Biometrics, and Identity Management in the EU” (2020) 27(1) Geopolitics 113. The author argues that the EU interoperability legal framework will lead to a shift from identity production to identity management “that aspires to simultaneously verify and cross-validate identity records across multiple domains [to] then form[s] a new, allegedly truthful basis for knowledge production and government” (at 127).

43 Vavoula, supra, note 23, 24.

44 M Leese, “Exploring the Security/Facilitation Nexus: Foucault at the ‘Smart’ Border” (2016) 30 Global Society 412.

45 Commission, “Preparing the Next Steps in Border Management in the European Union” (Communication) COM (2008) 69 final.

46 The European Data Protection Supervisor delivered an opinion on its own initiative (European Data Protection Supervisor, “Preliminary Observations on the Commission’s Communications COM (2008) 67, 68 and 69 Final” (2008) 2).

47 European Parliament Resolution 2008/2181(INI) of 10 March 2009 on the next steps in border management in the EU and similar experiences in third countries, para 19.

48 Justice and Home Affairs Council, “Internal Security Strategy for the European Union: Towards a European Security Model” (Publications Office of the European Union 2010) 22, 27–28.

49 ibid.

50 PricewaterhouseCoopers, “Policy Study on an EU Electronic System for Travel Authorization (EU-ESTA)” (2011) 292.

51 COM (2011) 680 final 8.

52 Commission, “Stronger and Smarter Information Systems for Borders and Security” (Communication) COM (2016) 205 final.

53 ibid, 3. See also Commission, “The European Agenda on Security” (Communication) COM (2015) 185 final; Commission, “A European Agenda on Migration” (Communication) COM (2015) 240 final 13–14.

54 Travellers crossing external land borders do not generate upstream information, unlike those travelling by air and sea, for whom personal data are collected as part of the measures taken by Member States to implement, on the one hand, Council Directive 2004/82/EC of 29 April 2004 on the obligation of carriers to communicate passenger data [2004] OJ L261/24 and, on the other hand, European Parliament and Council Directive (EU) 2016/681 of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime [2016] OJ L119/132 (PNR Directive).

55 COM (2016) 205 final 13.

56 European Parliament Resolution 2016/2773(RSP) of 6 July 2016 on the strategic priorities for the Commission Work Programme 2017, para 29.

57 Commission, “Proposal for a Regulation of the European Parliament and of the Council Establishing a European Travel Information and Authorisation System (ETIAS) and Amending Regulations (EU) 515/2014, (EU) 2016/399, (EU) 2016/794 and (EU) 2016/1624” COM (2016) 731 final.

58 PricewaterhouseCoopers, “Feasibility Study for a European Travel Information and Authorisation System (ETIAS)” (2016).

59 Interinstitutional Agreement between the European Parliament, the Council of the European Union and the European Commission on Better Law-Making of 13 April 2016 [2016] OJ L123/1, para 13. See S Alegre, J Jeandesboz and N Vavoula, “European Travel Information and Authorisation System (ETIAS): Border Management, Fundamental Rights and Data Protection” (European Parliament 2017) 28. See also European Data Protection Supervisor, “Opinion 3/2017 on the Proposal for a Regulation Establishing a European Travel Information and Authorisation System (ETIAS)” (2017); European Union Agency for Fundamental Rights, “Opinion 2/2017 on the Impact on Fundamental Rights of the Proposal for a Regulation Establishing a European Travel Information and Authorisation System (ETIAS)” (2017).

60 In addition to the feasibility studies on EU-ESTA of February 2011 and ETIAS of November 2016, PwC was also requested by the Commission to carry out a technical study on smart borders (PricewaterhouseCoopers, “Technical Study on Smart Borders” (2014)), which explored the different technical options available to the Commission in the context of the proposals it tabled under the cover of the 2013 “Smart Borders Package” and that aimed in particular at creating an entry/exit system (COM (2013) 96 final; proposal aborted) and a passenger registration programme (COM (2013) 97 final; proposal withdrawn).

61 M Akkerman, “Financing Border Wars. The Border Industry, Its Financiers and Human Rights” (Transnational Institute and Stop Wapenhandel 2021); Claire Rodier, Xénophobie business. À quoi servent les contrôles migratoires? (Paris, La Découverte 2012).

62 The LIBE Committee commissioned a study to assess the necessity of the ETIAS proposal, its implications for interoperability and its impact on fundamental rights: see Alegre et al, supra, note 59.

63 The European Parliament adopted a common position on 5 July 2018 (European Parliament Resolution P8_TA(2018)0307 of 5 July 2018 on the proposal for a Regulation of the European Parliament and of the Council establishing a European Travel Information and Authorisation System (ETIAS) and amending Regulations (EU) 515/2014, (EU) 2016/399 and (EU) 2016/1624), which the Council unanimously approved on 5 September 2018 (Council Vote No 11890/18 of 6 September 2018).

64 ETIAS Regulation, Art 1(1). Definitions of these concepts are specified in Art 3 and the concrete objectives are detailed in Art 4.

65 ETIAS Regulation, Arts 1 and 4.

66 ETIAS Regulation, Art 5.

67 ETIAS Regulation, Art 6.

68 ETIAS Regulation, Arts 7 and 8.

69 ETIAS Regulation, Art 17.

70 Primary, secondary or higher level, or none.

71 Under “job groups”, the applicant chooses from a predetermined (long) list laid down by the Commission in the form of delegated acts. See Commission Delegated Regulation (EU) 2021/916 of 12 March 2021 supplementing Regulation (EU) 2018/1240 of the European Parliament and of the Council establishing a European Travel Information and Authorisation System (ETIAS) as regards the predetermined list of job groups used in the application form [2021] OJ L201/1.

72 Or of third countries whose nationals are exempt from the visa requirement (short stay).

73 ETIAS Regulation, Arts 17(8) and 19(3)(d).

74 ETIAS Regulation, Art 54(1).

75 ETIAS Regulation, Art 20(1).

76 ETIAS Regulation, Arts 21(1) and 36.

77 ETIAS Regulation, Arts 21(2) and 22.

78 ETIAS Regulation, Art 21(3). Art 25 states the criteria to determine the Member State responsible.

79 ETIAS Regulation, Art 26(1–2).

80 ETIAS Regulation, Art 37(3). Moreover, contact details of the relevant European and national data protection authorities shall be given to applicants (ETIAS Regulation, Art 38(2)(e)). See also Art 64.

81 The National Unit specifies the grounds for refusal. However, on a reading of the ETIAS Regulation, it appears that it is not obliged to explain those grounds; a mere mention seems sufficient. In our view, this is likely to adversely affect the applicant’s right to an effective judicial remedy.

82 ETIAS Regulation, Art 45.

83 ETIAS Regulation, Art 47.

84 ETIAS Regulation, Art 80. Verification by the border guards seems in principle to be formal (there is a travel authorisation or not). However, following Recital 37 of ETIAS Regulation, “if there is a valid travel authorisation, the decision to authorise or refuse entry should be taken by the border guard”. Indeed, border guards should still determine whether entry conditions laid down in Art 6 of the Schengen Border Code are satisfied. Moreover, flags can be attached to a travel authorisation allowing border guards to proceed to second-line checks (further checks carried out in a special location at the external borders) or to access the ETIAS Central Unit to obtain additional information (ETIAS Regulation, Art 47(4)). Finally, when the period of validity of a travel authorisation expires, access to the EU territory is denied (ETIAS Regulation, Art 47(2)(a)).

85 ETIAS Regulation, Art 20(2).

86 If the applicant’s travel document is reported as lost, stolen, misappropriated or invalidated in SIS, or where an alert on refusal of entry and stay concerning the applicant is entered in SIS, the National Unit shall refuse the travel authorisation without any discretion (ETIAS Regulation, Art 26(3)(a)).

87 ETIAS Regulation, Art 20(4).

88 ETIAS Regulation, Art 34(1) (emphasis added). Following Art 34(2), “the ETIAS watchlist shall be established on the basis of information related to terrorist offences or other serious criminal offences”. Art 34(3) specifies that the information shall be provided by Europol and/or Member States, but there is no further indication of the criteria upon which Europol and the Member States should base their decision to put a person on the watchlist. Art 34(4) enumerates specific personal data fields of which the ETIAS watchlist is composed. It does not contain data about the nature of the offence (terrorist or other serious crimes), which seems prima facie to remain solely in the hands of Europol and/or the Member States.

89 ETIAS Regulation, Art 35(7). It states that the Commission “shall, by means of implementing acts, establish the technical specifications of the ETIAS watchlist”, which, to date, has not even been “planned”.

90 ETIAS Regulation, Art 26(5).

91 ETIAS Regulation, Art 20(5).

92 ETIAS Regulation, Art 26(6).

93 “Automation bias” refers to the tendency of human beings to align themselves with the outcomes produced by algorithms, notably because of their authority. Humans are therefore less inclined to decide differently from algorithms. See J Gerards and R Xenidis, “Algorithmic Discrimination in Europe” (European Commission 2020) 42.

94 Gäckle, supra, note 25, 267–68.

95 Alegre et al, supra, note 59, 23.

96 FF Schauer, Profiles, Probabilities and Stereotypes (Cambridge, MA, Harvard University Press 2006). On profiling, see also M Hildebrandt and S Gutwirth (eds.), Profiling the European Citizen: Cross-Disciplinary Perspectives (Berlin, Springer 2008).

97 See also M Leese, “The New Profiling: Algorithms, Black Boxes, and the Failure of Anti-Discriminatory Safeguards in the European Union” (2014) 45 Security Dialogue 494.

98 Robinson + Yu, “Knowing the Score: New Data, Underwriting, and Marketing in the Consumer Credit Marketplace. A Guide for Financial Inclusion Stakeholders” (2014) <https://www.upturn.org/static/files/Knowing_the_Score_Oct_2014_v1_1.pdf>.

99 This is well described in L Amoore, The Politics of Possibility: Risk and Security beyond Probability (Durham, NC, Duke University Press 2013).

100 Leese, supra, note 97, 497.

101 VIS Regulation, Art 37 and Recital 13. See European Data Protection Supervisor, “Opinion of the European Data Protection Supervisor on the Proposal for a Council Decision on the Establishment, Operation and Use of the Second Generation Schengen Information System” (2006); Vavoula, supra, note 23.

102 PNR Directive, Arts 6(3)(b) and (4): when carrying out an assessment of passengers, Passenger Information Units may “process PNR data against pre-determined criteria”, which should be carried out in a non-discriminatory manner. See N Vavoula, “Prevention, Surveillance, and the Transformation of Citizenship in the ‘Security Union’: The Case of Foreign Terrorist Fighters” (2018) Queen Mary School of Law Legal Studies Research Paper 19, SSRN: <https://ssrn.com/abstract=3288444>; Leese, supra, note 97.

103 Art 4(4) of GDPR provides that “profiling” means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work or their economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. Note that while ETIAS Regulation mentions GDPR, it is also subject to the European Parliament and Council Regulation 2018/1725 of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data, and repealing Regulation (EC) 45/2001 and Decision 1247/2002/EC [2018] OJ L295/29 (EU institutions data protection Regulation). However, it is not the purpose of this paper to address specific issues of data protection.

104 C(2021) 4981 final.

105 ETIAS Regulation, Arts 73 and 74.

106 Email from eu-LISA, 12 May 2021, request No 5/2021 (emphasis added).

107 Email from the European Direct Contact Centre, 26 May 2021, request No 186751. In its reply, the Centre says it consulted the Directorate-General for Migration and Home Affairs of the EU Commission.

108 Email from Frontex, 9 September 2021, PAD-2021-00228.

109 Eu-LISA, “Artificial Intelligence in the Operational Management of Large-Scale IT Systems” (2020) 30.

110 Deloitte, “Opportunities and Challenges for the Use of Artificial Intelligence in Border Control, Migration and Security” (2020) 90.

111 In its proposal of the ETIAS Regulation, the Commission states the following: “The number of visa-exempt third country nationals to the Schengen countries will continue to grow, with an expected increase of over 30% in the number of visa-exempt third country nationals crossing the Schengen borders by 2020, from 30 million in 2014 to 39 million in 2020” (COM (2016) 731 final 2).

112 These assumptions are notably based on interviews we conducted with two AI experts.

113 For a general account on algorithmic decision-making systems, see DR Amariles, “Algorithmic Decision Systems: Automation and Machine Learning in the Public Administration” in W Barfield (ed.), The Cambridge Handbook of the Law of Algorithms (Cambridge, Cambridge University Press 2020) p 273.

114 Email from Frontex, 9 September 2021. See also Art 6 of the Delegated Decision.

115 Regarding the high epidemic risks, Art 6(1)(a) of the Delegated Decision states that Member States shall provide information “through the epidemiological surveillance and control of communicable diseases network and the Early Warning and Response System in accordance with Articles 6, 8 and 9 of Decision No 1082/2013/EU”.

116 Email from Frontex, 9 September 2021.

118 See R Paul, “Risk Analysis as a Governance Tool in European Border Control” in A Weinar, S Bonjour and L Zhyznomirska (eds.), The Routledge Handbook of the Politics of Migration in Europe (London, Routledge 2020); S Horii, “The Effect of Frontex’s Risk Analysis on the European Border Controls” (2016) 17 European Politics and Society 242.

119 Art 84(2) of ETIAS Regulation provides that eu-LISA shall store the following data in the CRRS: application status information; nationalities, sex and year of birth of the applicant; the country of residence; education (primary, secondary, higher or none); current occupation (job group); the type of the travel document and three-letter code of the issuing country; the type of travel authorisation; the validity period of the travel authorisation; and the grounds for refusing, revoking or annulling a travel authorisation. It further states that “cross-system statistical data and analytical reporting shall allow [Member States, the Commission, eu-LISA and the ETIAS Central Unit] … to support the implementation of the ETIAS” (emphasis added). Pursuant to Art 39 of the Interoperability Regulations, which gives eu-LISA the mandate to establish the CRRS, the data are anonymised.
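To make the structure of these anonymised statistical records more tangible, the following minimal sketch models the data fields enumerated in Art 84(2). The field names and types are our own assumptions made for illustration only; the actual CRRS schema is not publicly documented.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CRRSRecord:
    """Anonymised statistical record, loosely following the data fields
    listed in Art 84(2) of the ETIAS Regulation. Field names are ours and
    purely illustrative; the actual CRRS schema is not public."""
    application_status: str               # eg issued, refused, revoked, annulled
    nationalities: List[str]
    sex: str
    year_of_birth: int
    country_of_residence: str
    education: str                        # primary, secondary, higher or none
    job_group: str                        # current occupation
    travel_document_type: str
    issuing_country_code: str             # three-letter code of the issuing country
    authorisation_type: str
    validity_period_end: str              # end of the validity period
    refusal_ground: Optional[str] = None  # ground for refusal, revocation or annulment
```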

120 All of this shows how ETIAS and the EES are interlinked. The EES will be a valuable information resource for ETIAS’s algorithm.

121 This is confirmed by Art 3 of the Delegated Decision.

122 Delegated Decision, Recital 4 (emphasis added).

123 Commission Delegated Regulation (EU) 2021/2223 of 30 September 2021 supplementing Regulation (EU) 2019/817 of the European Parliament and of the Council with detailed rules on the operation of the central repository for reporting and statistics [2021] OJ L448/7. Art 84(4) subpara 2 of ETIAS Regulation is also of relevance and states that “the daily statistics shall be stored in the central repository for reporting and statistics referred to in Article 39 of Regulation (EU) 2019/817” (emphasis added).

124 Alegre et al, supra, note 59.

125 Eg see COM (2020) 65 final 12.

126 The ETIAS Screening Board is established within Frontex and mainly has an advisory role. It is composed of one representative of each ETIAS National Unit, one representative of Frontex and one representative of Europol (ETIAS Regulation, Art 9(1)). Following an amendment proposed by the EU Parliament (see the Draft Report on the proposal for a regulation of the European Parliament and of the Council establishing a European Travel Information and Authorisation System (ETIAS) and amending Regulations (EU) 515/2014, (EU) 2016/399 and (EU) 2016/1624, 4 October 2017, PE605.985v02-00, 11-12), the Regulation also creates an ETIAS Fundamental Rights Guidance Board, with an advisory and appraisal function. According to Art 10(2), it “shall perform regular appraisals and issue recommendations to the ETIAS Screening Board on the impact on fundamental rights of the processing of applications and of the implementation of Article 33”. Art 9(3) states that “when issuing recommendations, the ETIAS Screening Board shall take into consideration the recommendations issued by the ETIAS Fundamental Rights Guidance Board”.

127 However, following Binns and Veale, the question of whether Art 22 GDPR and Art 24 of the EU institutions data protection Regulation are applicable to these types of algorithmic decision-making systems may not be as straightforward as it seems. Indeed, the action performed by the ETIAS screening rules is typically what they describe as triaging, ie “determining which cases get to a human decision-maker or are passed to another automated process”: if a hit is not triggered, the travel authorisation is issued automatically. Therefore, the triggering of a hit could be considered as a decision producing a significant effect since without the intervention of the ETIAS algorithm the refusal decision would not have been issued (for more details, see R Binns and M Veale, “Is That Your Final Decision? Multi-Stage Profiling, Selective Effects, and Article 22 of the GDPR” (2021) International Data Privacy Law 1). That being said, a further discussion on the applicability of Art 22 GDPR and Art 24 of the EU institutions data protection Regulation is beyond the scope of this paper.
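As an illustration of the triaging logic Binns and Veale describe (and only that: the actual ETIAS implementation is not public), the following sketch shows how the absence of a hit could lead to automatic issuance while any hit routes the file to manual processing. The data fields, the check and its values are placeholders invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Application:
    """A hypothetical, heavily simplified ETIAS application."""
    applicant_id: str
    nationality: str
    country_of_residence: str

def triggers_hit(app: Application) -> bool:
    """Placeholder for the automated checks (database queries, watchlist,
    screening rules). Returns True if any check produces a hit."""
    hypothetical_risk_profiles = {("XX", "YY")}  # invented for illustration
    return (app.nationality, app.country_of_residence) in hypothetical_risk_profiles

def triage(app: Application) -> str:
    """No hit: the authorisation is issued automatically.
    Hit: the file is passed on to manual processing (the hit itself
    does not refuse the application)."""
    if triggers_hit(app):
        return "manual processing by the responsible ETIAS National Unit"
    return "travel authorisation issued automatically"

if __name__ == "__main__":
    print(triage(Application("A-001", "XX", "YY")))
    print(triage(Application("A-002", "AA", "BB")))
```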

128 Opinion 1/15 ECLI:EU:C:2017:592; Joined Cases C-511/18, C-512/18 and C-520/18 La Quadrature du Net and Others v Premier Ministre and Others [2020] ECLI:EU:C:2020:791. The opinion and the ruling of the CJEU involve automatic data processing in the fight against terrorism.

129 For more details, see Vavoula, supra, note 16.

130 Emphasis added. Equivalent provisions are contained in the Interoperability Regulations (Art 5, which is an open-ended non-discrimination clause), the EES Regulation (Art 10, which states broad fundamental rights guarantees, and Recital 19, which is a non-discrimination clause with an exhaustive list of protected grounds) and the VIS Regulation (Art 7, which is a non-discrimination clause limited to the protected grounds of secondary EU equality law).

131 Eg see Recitals 10, 13, 15, 17, 33, 35 to 39, 44, 45 and 47 of the proposal.

132 European Union Agency for Fundamental Rights, supra, note 59, 26.

133 ibid, 28.

134 ibid, 29. See also European Data Protection Supervisor, supra, note 59, 11–14.

135 Which we take as our main analysis angle in the following pages to tackle issues raised by ETIAS’s profiling algorithm.

136 T Spijkerboer, “The Global Mobility Infrastructure: Reconceptualising the Externalisation of Migration Control” (2018) 20 European Journal of Migration and Law 452, 467; M-B Dembour, When Humans Become Migrants: Study of the European Court of Human Rights with an Inter-American Counterpoint (1st edition, Oxford, Oxford University Press 2015).

137 K Lippert-Rasmussen, “The Badness of Discrimination” (2006) 9 Ethical Theory and Moral Practice 167.

138 Barocas and Selbst, supra, note 21 (emphasis added). See also R Xenidis and L Senden, “EU Non-Discrimination Law in the Era of Artificial Intelligence: Mapping the Challenges of Algorithmic Discrimination” in U Bernitz et al (eds.), General Principles of EU Law and the EU Digital Order (Alphen aan den Rijn, Kluwer Law International B V 2020) p 156.

139 Broeders, supra, note 25, 88.

140 Indeed, nationality, sex and age range can lead to direct discrimination. Country or city of residence, level of education and current occupation can serve as proxies for protected grounds such as race and ethnic origin, thereby leading to indirect discrimination. We will return to this later.

141 K Leurs and T Shepherd, “Datafication & Discrimination” in MT Schäfer and K van Es (eds), The Datafied Society (Amsterdam, Amsterdam University Press 2017) p 220.

142 Barocas and Selbst, supra, note 21, 677. Systematisation of sources of algorithmic biases and the terminology used vary in the legal scholarship, but the substance is quite similar. See FJZ Borgesius, “Discrimination, Intelligence Artificielle et Décisions Algorithmiques” (Conseil de l’Europe 2018); P Hacker, “Teaching Fairness to Artificial Intelligence: Existing and Novel Strategies Against Algorithmic Discrimination Under EU Law” (2018) 55 Common Market Law Review 1143; J Kleinberg et al, “Discrimination in the Age of Algorithms” (Social Science Research Network 2019) SSRN Scholarly Paper ID 3329669 <https://papers.ssrn.com/abstract=3329669> (last accessed 30 October 2021); Xenidis and Senden, supra, note 138; FJZ Borgesius, “Strengthening Legal Protection against Discrimination by Algorithms and Artificial Intelligence” (2020) 24 International Journal of Human Rights 1572.

143 In the same vein, see Barocas and Selbst, supra, note 21, 673. As the authors specify, their research subject is broader than discrimination in its strict legal sense and concerns “disproportionately adverse outcomes concentrated within historically disadvantaged groups in ways that look a lot like discrimination”.

144 J Ringelheim, “The Burden of Proof in Anti-Discrimination Proceedings. A Focus on Belgium, France and Ireland” (European Network of Legal Experts in Gender Equality and Non-Discrimination 2019) 2, 51. See also F Palmer, “Re-Dressing the Balance of Power in Discrimination Cases: The Shift in the Burden of Proof” (European Network of Legal Experts in the Non-Discrimination Field (EU Commission) 2006); I Rorive, “Proving Discrimination Cases – The Role of Situation Testing” (Migration Policy Group – Swedish Centre for Equal Rights 2009); L Farkas and O O’Farrell, “Reversing the Burden of Proof: Practical Dilemmas at the European and National Level” (European Network of Legal Experts in the Non-Discrimination Field (EU Commission) 2014); A Baele, “Proving Discrimination: The Shifting Burden of Proof and Access to Evidence” (Cloisters 2016).

145 The CJEU first laid down the principle of the shared burden of proof in the Danfoss case (Case 109/88, Handels- og Kontorfunktionærernes Forbund I Danmark v Dansk Arbejdsgiverforening, acting on behalf of Danfoss [1989] ECLI:EU:C:1989:383), thereby guaranteeing the effectiveness of the equality provisions. Its case law has remained consistent since (eg Case 83/14 CHEZ Razpredelenie Bulgaria AD v Komisia za zashtita ot diskriminatsia [2015] ECLI:EU:C:2015:480 77-85) and EU secondary law embodies this principle (eg Council Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment between persons irrespective of racial or ethnic origin OJ [2000] L180/22). The CJEU has thus relaxed the standard rule on the burden of proof, according to which it is for the claimant to establish the facts they allege. Indeed, discrimination can leave no material trace, and the standard rule reinforces the unequal distribution of power that exists between the parties. It is now well established that when an alleged victim of discrimination brings evidence of facts from which it may be presumed that there is, on the face of it, discrimination (a prima facie case), the onus shifts to the defendant, who escapes liability only by showing that there is no discrimination.

146 We contend that the principle of the shared burden of proof applies here. Frontex is responsible for ETIAS’s screening rules (ETIAS Regulation, Art 75(1)(c)) and the processing of applicants’ personal data (ETIAS Regulation, Art 57(1)). The Agency is bound, at least in theory, to apply the principle of equality and non-discrimination by virtue of Art 14 of the Regulation. This provision partly mirrors Arts 20 (equality before the law) and 21 (non-discrimination clause) of the EU Charter of Fundamental Rights, to which Frontex is subject according to Art 51(1) of the Charter. Although to our knowledge the CJEU has not explicitly recognised that Arts 20 and 21 of the Charter standing alone entail the principle of the shared burden of proof – it has only done so by combining these provisions with those of EU secondary law (see Case 83/14, supra, note 145) – there are no reasons for the Court to decide otherwise given the significance it devotes to the effectiveness of equality and non-discrimination as core values of EU law.

147 R Xenidis, “Two Round Holes and a Square Peg: An Alternative Test for Algorithmic Discrimination in EU Equality Law” (on file with the authors).

148 In this context and according to our understanding, the data miner is Frontex (more specifically, the Central Unit that will be created within Frontex’s existing structure) acting as the authority responsible for the screening rules (ETIAS Regulation, Art 75(1)(c)).

149 See also Kleinberg et al, supra, note 142, 139–40 (according to whom “what outcome to predict, or how to weight together different outcomes, is one of the critical choices that must be made” in the building process of a machine learning algorithm); Xenidis and Senden, supra, note 138 (who observe that “stereotyping can influence the framing of the problem posed, and of the output looked for”).

150 Barocas and Selbst, supra, note 21, 679.

151 NP De Genova, “Migrant ‘Illegality’ and Deportability in Everyday Life” (2002) 31 Annual Review of Anthropology 419; Y Jansen, R Celikates and J de Bloois, The Irregularization of Migration in Contemporary Europe: Detention, Deportation, Drowning (Lanham, MD, Rowman & Littlefield International 2015).

152 According to Art 33(3) subpara 2 of ETIAS Regulation, specific risks shall be reviewed at least every six months.

153 Barocas and Selbst, supra, note 21, 680.

154 Hacker follows the same path and identifies, among the main causes of algorithmic bias, biased training data, which covers two subcases. The first is “incorrect handling of data”, ie incorrect labelling resulting from implicit bias or sampling bias (misrepresentation of the population in the data set). The second is “historical bias” in the training data. See Hacker, supra, note 142.

155 On discriminatory border checks, see R Bright, “Beware the Border Patrol: The Nasty History of Airport Discrimination” (The Conversation, 14 August 2017) <http://theconversation.com/beware-the-border-patrol-the-nasty-history-of-airport-discrimination-82392> (last accessed 29 June 2021); Y Vázquez, “Race and Border Control: Is There a Relationship?” (Oxford Law Faculty, 6 April 2015) <https://www.law.ox.ac.uk/research-subject-groups/centre-criminology/centreborder-criminologies/blog/2015/04/race-and-border> (last accessed 29 June 2021).

156 European Union Agency for Fundamental Rights, “Fundamental Rights at Airports: Border Checks at Five International Airports in the European Union” (2014) 45.

157 Oberverwaltungsgericht Rheinland-Pfalz (2012). The Court overruled a decision of the Administrative Court of Koblenz that had justified the triggering of further checks based merely on foreign looks, ruling instead that any form of ethnic profiling is inconsistent with Art 3 of the German Basic Law.

158 See eu-LISA, “2019 Annual Report on Eurodac” (2020) 20. This Report gives a good account of the proportion of erroneous entries in the database. In 2019, 79,595 transactions were rejected due to errors, representing 6.3% of all transactions. Transaction error rates stood at 8% in 2018 and 5.7% in 2017.

159 See EES Regulation, Art 2(3)(c): “This Regulation does not apply to: … holders of residence permits referred to in point 16 of Article 2 of Regulation (EU) 2016/399 other than those covered by points (a) and (b) of this paragraph”. Art 2 of Regulation (EU) 2016/399 (Schengen Border Code) states that: “For the purposes of this Regulation the following definitions apply: … (16) ‘residence permit’ means: … (b) all other documents issued by a Member State to third-country nationals authorising a stay on its territory that have been the subject of a notification and subsequent publication in accordance with Article 39, with the exception of: (i) temporary permits issued pending examination of a first application for a residence permit as referred to in point (a) or an application for asylum” (emphasis added).

160 European Commission and Joint Research Centre, Migration Profile Venezuela (2019) <http://publications.europa.eu/publication/manifestation_identifier/PUB_KJ0618436ENN> (last accessed 15 July 2021).

161 Therefore producing “feedback loops”. See Gerards and Xenidis, supra, note 93, 43.

162 Barocas and Selbst, supra, note 21, 688. In the same vein, Kleinberg et al argue that unfavourable treatment “can also be introduced through decisions about what candidate predictors to collect, construct and give to the … algorithm to consider for possible inclusion in the final statistical model”: see Kleinberg et al, supra, note 142, 140–41.

163 Schauer, supra, note 96, 3–7 (cited in Barocas and Selbst, supra, note 21, 688).

164 See Xenidis and Senden, supra, note 138. According to them, “in the absence of perfect information or more granular data and in front of the cost of obtaining such data, stereotypes and generalizations regarding certain groups of population might be relied on as a way to approximate reality”.

165 Art 33(4) of ETIAS Regulation does not provide clear guidance as to the way the Central Unit will establish the specific risk indicators and the criteria it will resort to in order to balance the six attributes. There are some procedural guarantees, such as the consultation of the ETIAS Screening Board, but the Central Unit seems to have (from our reading of the Regulation) a wide margin of appreciation, therefore leading to opacity in the decision-making process.
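To illustrate what combining the six attributes into a specific risk indicator might amount to in practice (the Regulation itself does not say, which is precisely the opacity noted above), here is a purely hypothetical sketch in which an indicator is expressed as a set of attribute/value pairs. Every attribute value, indicator and criterion below is invented; nothing here reflects the Central Unit’s actual methodology.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class ApplicantAttributes:
    """The six attributes on which specific risk indicators are based
    under Art 33(4) of the ETIAS Regulation (values here are examples)."""
    age_range: str   # eg "18-30"
    sex: str
    nationality: str
    residence: str   # country or city of residence
    education: str   # primary, secondary, higher or none
    job_group: str   # current occupation

# A hypothetical risk indicator: a combination of attribute values that the
# Central Unit might (by criteria unknown to us) associate with a risk.
HYPOTHETICAL_INDICATORS: List[Dict[str, str]] = [
    {"age_range": "18-30", "nationality": "XX", "education": "none"},
]

def matching_indicators(applicant: ApplicantAttributes) -> List[Dict[str, str]]:
    """Return every indicator whose attribute/value pairs all match the applicant."""
    return [
        ind for ind in HYPOTHETICAL_INDICATORS
        if all(getattr(applicant, attr) == value for attr, value in ind.items())
    ]
```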

166 Gerards and Xenidis, supra, note 93, 44.

167 Barocas and Selbst, supra, note 21, 691. See also AER Prince and D Schwarcz, “Proxy Discrimination in the Age of Artificial Intelligence and Big Data” 105 Iowa Law Review 1260–61.

168 For a similar approach, see R Xenidis, “Tuning EU Equality Law to Algorithmic Discrimination: Three Pathways to Resilience” (2020) 27 Maastricht Journal of European and Comparative Law 736.

169 OHCHR, “Thematic Report on Racial Discrimination in the Context of Citizenship, Nationality and Immigration” <https://www.ohchr.org/EN/Issues/Racism/SRRacism/Pages/CitizenshipExclusion.aspx> (last accessed 2 July 2021).

170 S Fredman, “Intersectional Discrimination in EU Gender Equality and Non-Discrimination Law” (European Commission, 2016) 27. For an account of intersectional disadvantage in the field of algorithmic discrimination, see, among others, Xenidis, supra, note 168.

171 K Crenshaw, “Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics” (1989) University of Chicago Legal Forum 1.

172 Note that, in Parris, the CJEU rejected a claim based on intersectional disadvantage. It ruled that “while discrimination may indeed be based on several of the grounds … there is, however, no new category of discrimination resulting from the combination of more than one of those grounds, such as sexual orientation and age, that may be found to exist where discrimination on the basis of those grounds taken in isolation has not been established” (Case 443/15 David L Parris v Trinity College Dublin [2016] ECLI:EU:C:2016:897 80). Therefore, for the time being, it is impossible from the outset for alleged victims of discrimination to contend that they underwent unfavourable treatment based on a combination of protected characteristics.

173 A Agrawal, “Removing Bias in AI Isn’t Enough, It Must Take Intersectionality into Account” (Medium, 23 April 2019) <https://atibhiagrawal.medium.com/removing-bias-in-ai-isnt-enough-it-must-take-intersectionality-into-account-e5e92e76233c> (last accessed 9 July 2021).

174 Spijkerboer, supra, note 136.

175 ibid.

176 S Costanza-Chock, “Design Justice, AI, and Escape from the Matrix of Domination” (2018) Journal of Design and Science <https://jods.mitpress.mit.edu/pub/costanza-chock/release/4> (last accessed 22 March 2021).

177 Spijkerboer, supra, note 136, 461.

178 COM (2021) 205 final 3.

179 Annex III, para 7(b) of the Proposal for a Regulation on AI.

180 Interestingly, in its 2021 Opinion, the European Economic and Social Committee (EESC) “strongly recommends widening the scope of the AIA [Artificial Intelligence Act] so as to include ‘legacy AI systems’, i.e. systems that are already in use or are deployed prior to the coming into force of the AIA, in order to avoid deployers fast tracking any prohibited, high- and medium-risk AI to avoid compliance requirements. Moreover, the EESC strongly recommends not to exclude AI that is a component of large-scale IT systems in the area of freedom, security and justice as listed in Annex IX from the scope of the AIA” ([2021] OJ C517/61).