
Law-jobs in the algorithmic society

Published online by Cambridge University Press:  29 November 2022

Pedro Rubim Borges Fortes
Affiliation:
Federal University of Rio de Janeiro (UFRJ), Brazil
David Restrepo Amariles*
Affiliation:
HEC Paris, France
*Corresponding author. E-mail: restrepo-amariles@hec.fr

Abstract

It is now well established that algorithms are transforming our economy, institutions, social relations and, ultimately, our society. This paper explores the question: what is the role of law in the algorithmic society? We draw on the law-jobs theory of Karl Llewellyn and on William Twining's refinement of Llewellyn's work through the perspective of a thin functionalism to better understand what law does in this new context. We highlight the emergence of an algorithmic law, as law performs jobs such as the disposition of trouble-cases, the preventive channelling and reorientation of conduct and expectations, and the allocation of authority in the face of algorithmic systems. We conclude that the law-jobs theory remains relevant to understanding the role of law in the algorithmic society, but it is also challenged by how algorithms redefine who does or should do what law-jobs, and how they are done.

Special Issue Introduction

Copyright © The Author(s), 2022. Published by Cambridge University Press

1 Introduction: algorithmic law-jobs in context

What is the role of law in our contemporary algorithmic societies? The spread of algorithms to an increasing number of areas in our economic, social and political life has led observers to inquire about the characteristics and implications of an algorithmic society (Restrepo Amariles, 2021; Balkin, 2017). Moreover, algorithmic decision systems are also making human decisions more dependent on algorithms by providing predictive inferences based on the analysis of large amounts of data or by automating, in whole or in part, the execution of decisions such as the granting of loans or the allocation of social benefits (Ranchordás and Scarcella, 2021; Restrepo Amariles, 2021). We consider it relevant to adopt the concept of law-jobs developed by the legal realist Karl Llewellyn to explore the role of law in this new context. The law-jobs theory received an important statement in the study of the legal experience of the Cheyenne people (Llewellyn and Hoebel, 1941). As a foundational concept for his general sociology of law, the law-jobs are applicable to human groups complex or simple, large or small, and Karl Llewellyn continued to develop them for the rest of his life (Twining, 2012). In his original formulation, the law-jobs belong to the fundamental 'bare bones' of arranging and adjusting behaviour that maintain, co-ordinate and keep a society functioning (Llewellyn and Hoebel, 1941). From a functional perspective, Karl Llewellyn argues that the law-jobs serve as basic functions for every human group, preventing it from breaking apart and allowing it to retain its 'groupness', from the disposition of trouble-cases to the job of the juristic method (Llewellyn and Hoebel, 1941). How may the 'law-jobs' theory be applied to our understanding of the role of law in contemporary societies in which algorithms and algorithmic decision systems are becoming omnipresent? Can an algorithmic law perform the function of maintaining our 'groupness' in the digital sphere? How may we consider the tasks, the performance and the development of the law-jobs of algorithmic law? This paper attempts to shed some light on these questions.

William Twining explains that an important exercise for testing the 'law-jobs' theory includes understanding how the terms 'human group' and 'dispute settlement' are used in this context, in order to check whether a human group may continue to exist without the job of dispute settlement being done (Twining, 2012). In the case of the algorithmic society, the theory of law-jobs may be applied to the maintenance of the 'groupness' of the human group of users of digital services and technological devices programmed through algorithms. According to Karl Llewellyn, the basic law-job consists of the production and maintenance of an existing order, which he analogises to the job typically performed in 'garage-repair work' (Llewellyn and Hoebel, 1941). In this foundational sense, the basic law-job in the algorithmic society consists of the establishment and maintenance of cyberspace, social networks and other orders that are supported by code and algorithms. A fundamental set of questions then becomes: What are the specific tasks that need to be done in the garages to repair, fix and maintain the infrastructure of the algorithmic society? Who does or should do these law-jobs? And how are they done? As Karl Llewellyn at a later stage also referred to this conceptual framework as the theory of the institution of law-government, we should consider the institutional perspective of co-ordinating the constitution of these algorithmic orders as the starting point for our reflection (Twining, 2012; Llewellyn, 1950, unpublished manuscript; Llewellyn, 1934).

Additionally, there is the law-job of preventing conflict that is inimical to group survival through the effective channelling, preventively and in advance, of people's conduct towards one another (Llewellyn and Hoebel, 1941). One of the important characteristics of algorithmic decision systems is their capacity to exercise power and control human behaviour through the normative architecture embedded in the code (Gordon et al., 2022; Restrepo Amariles and Lewkowicz, 2017), which becomes functionally equivalent to the law in Lawrence Lessig's classical statement that 'code is law' (Lessig, 2012). Therefore, the architects of digital devices and algorithmic systems in our society should be considered key actors in the performance of this law-job, as they codify the institutional rules of the game in a way that redefines the effective content of the law and as they implement compliance by design for humans and algorithms. Finally, the law-jobs include the adjustment of the trouble-cases – offences, grievances and disputes – which could potentially threaten the existence and continuation of group life, if sufficiently multiplied and cumulative (Llewellyn and Hoebel, 1941). The need to adjudicate new forms of high-volume disputes involving algorithms – such as in algorithmic collusion (Ezrachi, 2016), content moderation (Cavaliere and Romeo, 2022) and market abuse in algorithmic trading (Schmidt-Kessen et al., 2022) – and the development of algorithm-supported dispute resolution (Lodder and Zeleznikow, 2012) illustrate the challenges of this law-job. The rise of new dispute resolution methods, such as predictive analytics, and the role of new actors in the adjudication of disputes, from LegalTechs to marketplaces and social networks (Cavaliere and Romeo, 2022; Fortes et al., 2022), reveal a new agenda for this law-job that involves the adoption of algorithms as a new craft in law's toolbox.
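To make the idea of compliance by design more concrete, the minimal sketch below (in Python, with rule names and thresholds invented by us for illustration) shows how a normative command can be written directly into the architecture of a system, so that the prohibited action simply cannot be executed rather than being sanctioned after the fact. It is a sketch under stated assumptions, not a depiction of any actual platform's implementation.

```python
# A minimal, hypothetical sketch of 'compliance by design':
# the rule is not enforced after the fact but embedded in the code path itself.

from dataclasses import dataclass

@dataclass
class DataRequest:
    user_consented: bool      # did the data subject consent to processing?
    purpose: str              # declared purpose of the processing
    retention_days: int       # how long the data would be kept

ALLOWED_PURPOSES = {"billing", "service_improvement"}   # hypothetical whitelist
MAX_RETENTION_DAYS = 365                                # hypothetical retention limit

def process_personal_data(request: DataRequest) -> str:
    """Process data only if the embedded normative conditions are met."""
    if not request.user_consented:
        return "refused: no consent"                    # the rule forecloses the action
    if request.purpose not in ALLOWED_PURPOSES:
        return "refused: purpose not allowed"
    if request.retention_days > MAX_RETENTION_DAYS:
        return "refused: retention period too long"
    return "processed"

# Usage example: the non-compliant request is architecturally impossible to fulfil.
print(process_personal_data(DataRequest(True, "billing", 90)))        # processed
print(process_personal_data(DataRequest(False, "advertising", 900)))  # refused: no consent
```

In such a design, the normative command does not merely threaten a sanction; it determines which actions can occur at all, which is precisely what makes the architecture functionally equivalent to law.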

This Special Issue on algorithmic law in context casts new light on the law-jobs in the algorithmic society. The architecture of smart city projects reveals the possibilities and limits of incorporating rules of technological governance and of human rights protection in technological orders that govern human experience, such as mobility through transportation systems and access to essential services through energy supply (De Jonge, in this issue). The rise of algorithmic systems for rating corporations based on their degree of human rights protection may also influence compliance with laws against forced labour, human trafficking and modern slavery (Catá Backer and McQuilla, in this issue). The design of blockchain platforms may also facilitate inherent compliance with the legal principles of the General Data Protection Regulation (GDPR), but alternative legal and policy decisions are embedded in the normative design of privacy compliance technologies and influence considerations on the implementation of human participation, engagement and contestability (Baquero, in this issue). Functional capacities of artificial intelligence (AI), machine-learning algorithms and data mining may lead to reliable projections of an anticipated future through predictive analytics that influence law, governance and algorithmic regulation (Lazaro and Rizzi, in this issue). This special collection grew out of the Algorithmic Law and Society Symposium at HEC Paris in December 2021, which we convened; the papers were selected for their fit with the objective of this Special Issue, namely to explore the law-jobs in the context of the algorithmic society.

The first attempt to outline a 'law in context' perspective came with William Twining's paper on 'The camel in the zoo', examining how the Sudanese dealt with wrongful harms through custom influenced by religious fatalism and the acceptance that the loss lies where it falls with respect to God's will (Twining, 1985; 1997; 2019; Fortes, 2019). Subsequently, he challenged the prevailing orthodoxy in English legal education with the Law in Context book series, founded in 1966, which he supported as one of its co-editors for over fifty years (Twining, 2019). The focus on context eventually led him to assimilate, use and even refine Llewellyn's ideas, such as the law-jobs theory (Twining, 2019). In his original statement of the law-jobs theory, Karl Llewellyn also highlighted the importance of context, as behaviour may be related to legal, social, economic and technological factors, according to the chosen context (Llewellyn and Hoebel, 1941). Our paper echoes the legal realist insights of both Karl Llewellyn and William Twining, as we hope to address questions on how the law-jobs theory may teach us about the foundation and maintenance of 'groupness' in our contemporary societies. This paper is divided into four parts. In addition to this introduction, section 2 explores the insights from legal realism to discuss the transformation of law-jobs in the algorithmic society. Section 3 introduces the reader to the papers published in this Special Issue. Section 4 offers some final remarks with a focus on the ongoing discussion and potential research agenda on algorithmic law in context.

2 Law-jobs in the algorithmic society

In the original statement of the law-jobs theory, the primal function of the law emerged as 'an aspect of pure survival' and as the 'brute struggle for continued existence' (Llewellyn and Hoebel, 1941, p. 292). The establishment and maintenance of private orders also became an essential theme, and the law-jobs theory explained the different functions and tasks that could be performed by the 'legal' among the Cheyenne people, but could be applied to every human group, from a couple of people as in a marriage to a complex great group as in a society (Llewellyn and Hoebel, 1941). Karl Llewellyn also explored the idea of the 'legal' outside the frame of the state or any other political unit, reserving the uncapitalised 'legal' and 'law-job' for general applicability to the functional aspects common to groups of all kinds (Llewellyn and Hoebel, 1941). This legal analysis depends on the concrete group unit that is being observed as a space of social control and on the reality of the 'control of whom, by whom, for what, and within the order-configuration of what entirety?' (Llewellyn and Hoebel, 1941, p. 292). In addition to the primary function of preserving the 'groupness' of the group, there is also a questing aspect, which examines the more adequate mode of doing the job with economy, efficiency, smoothness and grace (Llewellyn and Hoebel, 1941). In Karl Llewellyn's own words, 'the questing aspect looks to the ideal values: justice, finer justice, such organization and such ideals of justice as tend toward fuller, richer life' (Llewellyn and Hoebel, 1941, p. 292). Interestingly, he highlighted the conflict between the bare bones of the primary function of group survival and the drive towards fuller life within the great cultural configuration of the Cheyenne way (Llewellyn and Hoebel, 1941). This quest for ideal values of justice should also be a part of our reflection on the law-jobs in the algorithmic society.

Initially, we should reflect on the relevance of analysing contemporary social problems through the lenses of an unfinished project of sociology of law identified with 'horse-sense' – an interpretative social science that understands a social practice by grasping the perspective of the actors, with an understanding of the context in which they operate, their role expectations, economic realities, constraining factors and likely consequences (Twining, 2002). While discussing Karl Llewellyn's unfinished agenda, William Twining considers the potential of constructing a sophisticated sociology of law and sees unique value in the law-jobs theory and the idea of the juristic method (Twining, 2002). He provides a long list of reasons for the value of the law-jobs theory as a general theory of law-government in an established and rich tradition of social theory, generally applicable to all human groups, flexibly interpreted, providing holistic or contextual lenses for looking at institutions, devices, traditions, events and other related phenomena (Twining, 2002). By avoiding strict definitions of law and focusing on actual events, disputes and practices, the law-jobs theory provides a basis for concrete accounts of legal experience aligned with the axioms of interpretive sociology (Twining, 2002). It is applicable to a variety of contexts, including explanations of problem-solving, behavioural patterns, legal culture and dynamic processes (with participants, choices, contingencies, institutions and traditions viewed over time), and it provides a place for the study of rules and other norms as an important legal phenomenon (Twining, 2002). In his posthumously published book The Theory of Rules, Karl Llewellyn did not position law as the central subject matter of jurisprudence, but rather focused on the law-jobs that law helps to get done, on crafts as a major tool for doing these law-jobs, and on rules of law not as jurisprudence's sole subject matter but as a tool for the use of the crafts and for the control of the craftsmen (Llewellyn, 2011). As William Twining puts it, these works on his theory of a general sociology of law reveal a scholar who deserves neither the criticism of being a 'rule sceptic' nor the claim that his legal realism was concerned almost entirely with issues of adjudication (Twining, 2002). On the other hand, for William Twining, Karl Llewellyn's work underplays the importance of power, structure and discourse, and would have to be supplemented for a given enquiry by additional questions (Twining, 2002).

2.1 The who, how and what of the law-jobs in the algorithmic society?

Karl Llewellyn stated the law-jobs theory in terms of five relevant matters: (1) the disposition of trouble-cases; (2) the preventive channelling and reorientation of conduct and expectations; (3) the allocation of authority and the arrangement of procedures that legitimise action as being authoritative; (4) the net organisation of the group or society as a whole so as to provide cohesion, direction and incentive; and (5) the job of the juristic method (Llewellyn and Hoebel, 1941). Authority is a kind of inequality that is not natural but created by society itself, which allocates to one member the power to give orders that another member obeys (Collingwood, 1992). Our objective with this paper is to ask some additional questions about the law-jobs in the context of the algorithmic society. Who composes the algorithmic society? Who are the craftsmen running the machinery of the law-jobs – who does or should do the law-jobs? And, finally, what are the new tools of law in an algorithmic context?

Let us consider initially the concrete group of individuals who compose the algorithmic society as its members. If members of a society share social consciousness and decide to behave as partners to one another, they may also anticipate the possibility of breaking down into a non-social community and provide against it (Collingwood, 1992). In concrete terms, everyone is a member of the algorithmic society, as the global scope of information technology reaches even individuals aware of their informational self-determination and careful about the information they share with corporations and governments (Margetts and Dorobantu, 2019). However, globalisation processes operate in connection with regional, national and local processes, and the globe as a big political unit may be divided into various subunits for specialised analysis (Twining, 2009). One criterion for the consideration of political subunits remains the modern national state, and the members of these human groups are national citizens, who experience a shared bundle of norms formed by the national Constitution, formal laws and informal social norms and practices. However, this traditional organisation of the political space based on the Treaty of Westphalia does not suffice for examining the establishment and maintenance of normative orders, especially because the reality of algorithmic law is forged by transnational social relations across cyberspace, shaped by the fragmented constitutions, laws, standards and social norms of lex internetica (Frydman, 2012; 2014; Frydman and Rorive, 2002; Teubner, 2012). These normative orders look more like heterarchical networks than the hierarchical normative pyramids of positive state law (Ost and van de Kerchove, 2010; Kelsen, 1967).

In this context, certain law-jobs may be performed by those who design and operate digital platforms, develop code and algorithms, and implement algorithmic decision-making. Social networks establish their own institutional rules of the game and develop sets of problem-solving mechanisms for addressing conflicts among group members. These rules are not part of state law supported by a basic norm and are not recognised as part of municipal law by a formal rule of recognition (Kelsen, 1967; Hart, 2012), but they are part of the law in action used to deal with actual disputes between group members in relationships mediated by algorithms. Consider, for instance, the devices for informing the platform about a violation of its terms of use, a tool that resembles the right to petition or access to justice within the social network. These complaints are distributed to a team of decision-makers collaborating with the social platforms, who act as the functional equivalent of arbitrators or judges and who must examine arguments and factual evidence and decide. While examining potential violations of social media rules in tens of videos and photographs per hour, decision-makers have very limited time to reflect on the justification provided by the complainant and to assess the existence of hate speech, explicit imagery or unfair discriminatory content, and their verdicts are reached through heuristics and rules of thumb.

In some of these settings, decisions may draw on code and algorithms as tools to perform the job. For instance, decision-makers may code pre-programmed responses for a given complaint, such as the cancellation of a penalty attributed to a user when the sanctioned behaviour resulted from a fault committed by a driver collaborating with a transportation app. In other cases, automated decision-making systems may decide the complaint based on the processing of data and a pre-established rule embedded in the algorithms, which already contain the structure of a command, such as a ban on certain images, symbols or expressions internalised in the code of a social network (Restrepo Amariles, 2021). According to Karl Llewellyn's typology of law-jobs, social networks and digital platforms provide norms and a machinery for the adjustment of the trouble-case (Llewellyn and Hoebel, 1941).
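As a simplified illustration of how such a command can be internalised in code, the sketch below uses an invented list of banned expressions to reject a post automatically; real moderation pipelines are far more complex and typically combine machine learning with human review, so this is only a minimal sketch of the rule-embedding idea, not a description of any actual platform.

```python
# Hypothetical sketch: a pre-established rule embedded in code that 'adjusts'
# a trouble-case automatically by rejecting posts containing banned expressions.

BANNED_EXPRESSIONS = {"banned_symbol_1", "banned_slur_2"}   # invented placeholders

def moderate_post(text: str) -> dict:
    """Apply the embedded rule and return a decision with a short justification."""
    hits = [term for term in BANNED_EXPRESSIONS if term in text.lower()]
    if hits:
        return {"decision": "removed", "reason": f"matched banned expressions: {hits}"}
    return {"decision": "published", "reason": "no banned expression detected"}

# Usage example
print(moderate_post("An ordinary message"))
print(moderate_post("This contains banned_symbol_1"))
```

The structure of the command – condition, prohibition, automatic consequence – is what makes the algorithm a machinery for adjusting trouble-cases before any human decision-maker intervenes.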

In the context of the algorithmic society, we may also analyse how private orders allocate authority to decision-makers and whether there are functional equivalents to the due process of law and fundamental rights guarantees found in constitutional orders. Legitimation by procedure implies the meaningful participation of the parties, with the capacity to influence the outcome and to receive a written justification from the decision-maker on the reasoning for a sentence (Luhmann, 1983). In contrast to these fundamental rights and due process guarantees, complaints processed by social networks are not shaped by a dialogic exchange of arguments between the parties, knowledge of the evidence presented by the adversary and the right to a justified decision made by a human judge. Recently, however, social media platforms have created a quasi-judicial bureaucracy that reproduces the structure of an appellate court, with a board of experts to review decisions previously made by automated processes or through the speedy application of rules of thumb by human decision-makers – according to The New Yorker, 'a sort of private Supreme Court … to help govern speech on its platforms' (Klonick, 2021). From a constitutional perspective, such an allocation of authority could even overrule a decision made by the chief executive officer (CEO) and constitutes a separation of powers between the presidency and the Oversight Board, which operates similarly to the judicial branch (De Montesquieu, 1989; Ackerman, 2010; Carolan, 2009).

The authority of algorithms emerges from the power of normative commands that are embedded in their formulas and eventually become constitutive of decision-making processes within the algorithmic society. When dealing with e-government and depending on information technology for the provision of public services and the allocation of social goods, citizens depend on knowledge, notice and participation in governmental procedures to secure the granting of a social benefit, but may experience information asymmetry, algorithmic discrimination and alienation through automated decision-making processes (Eubanks, 2018; Fortes, 2020a; 2020b; 2021; Fortes et al., 2021). In their capacity as 'netizens', the concrete group of individuals who compose the algorithmic society as digital citizens should reflect on how to constitute mechanisms to dispose of trouble-cases, legitimate action within these private orders, and provide cohesion, direction and incentive to the group or society. Importantly, algorithmic norms are also constitutive of digital markets and define the rules of the game for e-commerce, fintech and competition law (Ezrachi, 2016; Mehra, 2015). Among the interesting initiatives in our contemporary algorithmic society, 'Code for America' provides support for government services by enlisting technology and design professionals to build open-source applications and promote openness, participation and efficiency. Information technology may also facilitate the task of controlling taxation through algorithmic checks of taxpayers' records, as revealed by the experiences of France, the UK and the Netherlands (Ranchordás and Scarcella, 2021; Restrepo Amariles and Lewkowicz, 2017). Financial markets are also embedded in a particular algorithmic cycle with the triangular structure of 'fintech' regulated by 'regtech' and supervised by 'suptech', so that information technology enables financial services, supports regulatory regimes and carries out financial supervision functions (Restrepo Amariles and Lewkowicz, 2020; Arner et al., 2016).
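To give a sense of what an 'algorithmic check' of taxpayers' records can amount to in practice, the sketch below is a deliberately simplified, hypothetical example (invented field names and an arbitrary tolerance threshold of our own) of a rule that cross-checks declared income against third-party reported data and flags discrepancies for human review; actual tax administrations rely on far richer data and models, so this is only an illustration of the mechanism.

```python
# Hypothetical sketch of an algorithmic check of taxpayers' records:
# flag returns whose declared income diverges too much from third-party data.

TOLERANCE = 0.05  # arbitrary 5% tolerance for rounding and timing differences

def check_return(declared_income: float, third_party_income: float) -> dict:
    """Compare declared income with income reported by employers and banks."""
    if third_party_income == 0:
        gap = 0.0 if declared_income == 0 else 1.0
    else:
        gap = abs(declared_income - third_party_income) / third_party_income
    flagged = gap > TOLERANCE
    return {"gap": round(gap, 3), "flag_for_human_review": flagged}

# Usage example: the second record is flagged for a human caseworker, not auto-sanctioned.
print(check_return(49_800, 50_000))   # {'gap': 0.004, 'flag_for_human_review': False}
print(check_return(30_000, 50_000))   # {'gap': 0.4, 'flag_for_human_review': True}
```

Even such a trivial rule makes visible the policy choices embedded in the code: the tolerance threshold and the decision to route flagged cases to a human rather than to an automatic sanction are normative decisions, not merely technical ones.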

2.2 Algorithmic law as a craft?

Importantly, Karl Llewellyn's perspective focused on law as a craft or the work that needed to be done. Instead of examining the concept of law or developing a theory of what law is, he was much more interested in addressing the question of 'what is useful, or the most useful, material and way of organising study about things legal, in order to get significant light on what needs knowing and doing?' (Llewellyn, 1950, unpublished manuscript). Instead of considering law as a scientific or philosophical activity, most realists saw law as a craft engaged in by lawyers and judges, and believed that learning the actual operations of legal institutions was a prerequisite for improving the functioning of law as an instrument of social engineering (Tamanaha, 2006). In his refinement of the law-jobs theory, William Twining warns us that this is an analytical theory devoid of empirical content and not falsifiable (Twining, 2009; 1994). As an alternative to a grand theory and to general definitions of law, Karl Llewellyn developed the law-jobs theory as the most general part of his 'working whole view' of law (Twining, 2009). Regarding the criticism that the law-jobs theory suffers from the inconsistencies of functionalism, William Twining considers abandoning ideas related to the purpose of law in favour of the analysis of the 'point', which is preferable as it allows for mapping the development of social practices in response to complex processes of interaction that are not the result of deliberate choice (Twining, 2009; 1994). Refined with Twining's thin functionalism, the law-jobs theory would still allow for the analysis of a full spectrum of possibilities of problem-solving and action through different forms of collective decision-making and various kinds of unconscious and semi-conscious shifts in patterns of behaviour (Twining, 2009). However, we would still need to take seriously the criticisms that the idea of law-jobs as functional equivalents to state law may lead to over-inclusiveness and that the reduction of law to one single function may lead to under-inclusiveness, because state legal systems are multifunctional (Tamanaha, 2017). According to contemporary realistic theories, our understanding of the law depends on the intersubjective perception of the social group, and law should be identified as 'whatever social groups conventionally attach the label "law" to' (Tamanaha, 2017, p. 194).

From this perspective, when the relevant social actors identified as founders, entrepreneurs and architects of the algorithmic society promote the development of magna cartas, Supreme Courts, algorithmic systems and regulation, we may acknowledge the emergence of new legal phenomena in cyberspace, social networks and virtual realities (Fortes et al., 2022). We also consider that there is a mathematical turn in law, as mathematical formulas, quantitative methods and statistical analysis are currently a decisive part of what contemporary law is and of the tasks performed by law-jobs in contemporary societies (Restrepo Amariles, 2015; 2017a; 2017b; Fortes, 2015; Restrepo Amariles and McLachlan, 2017).

This path of contemporary law also originated with the pioneers of legal realism, especially the call made by Oliver Wendell Holmes that the life of the law is experience and that 'the man of the future is the man of statistics and the master of economics' (Holmes, 1897, p. 469). Karl Llewellyn shared the perspective that lawyers needed to develop this statistical knowledge and criticised the ignorance in the legal field, in which legal scholars know neither the rawest of raw facts nor quantitative relations, percentages of settlements and factors of incidence across regions and according to the size and kind of case (Llewellyn, 1950, unpublished manuscript). Without statistics, our knowledge of the legal field remains uncoordinated and partial – different people know law quite differently according to what they have observed, with whom they have talked, what they have read somewhere, which ideas they grew up with or which cases they once had (Llewellyn, 1950, unpublished manuscript). The alternative prescribed by Karl Llewellyn was the method of 'horse-sense' as an effort to get the whole view and to maintain balance in this view and in dealing with any particular aspect of things in law – 'horse-sense' is not the sense of the horse, but rather a 'kind of highly informed, distinctly uncommon, better-than-common, expert-but-not-scientifically-demonstrable know-what and know-how' (Llewellyn, 1950, unpublished manuscript). Karl Llewellyn analogised the job of horse-sense jurisprudence to the job of lyric poetry in making law real and vibrant with meaning and making people thrilled with what is to be done and how it may be done (Llewellyn, 1950, unpublished manuscript). There is a knowledge of the 'forgotten obvious' that lies in the corner unnoticed and matters for our understanding of legal experience (Llewellyn, 1950, unpublished manuscript). As law is a participant-oriented discipline, legal experience depends on the different participatory roles in specified contexts (Twining, 1994).

In the contemporary setting of algorithmic societies, individual users may experience algorithmic law as citizens, consumers and avatars of virtual realities, among other potential participatory roles. On the other hand, there are also roles for individuals in positions of authority: software developers who program the commands embedded in algorithmic code, support personnel from social networks who mediate conflicts and provide dispute resolution, and executives who take fundamental decisions on how to develop the services provided by digital platforms. These tasks resemble law-making, decision-making and executive action. The human quest for values of the group or society as a whole, so as to provide cohesion, direction and incentive within the organisation, reproduces itself as a law-job in algorithmic societies too, as we need to reflect on critical accounts of digital politics, surveillance capitalism, social discrimination and the search for fairness in order to preserve the integrity of our lex internetica and the groupness of our algorithmic society (Susskind, 2018; Zuboff, 2019; Perez, 2019; O'Neil, 2016; Zittrain, 2008).

A decisive element of this quest for values depends on the law-job of the 'juristic method': the job of the individuals in key positions, equipped with the crafts, of rendering and keeping the machinery and the personnel responsible for the other law-jobs abreast and moving forward together, with balance and 'strain-to-further-the-quest' (Llewellyn, 1950, unpublished manuscript). Our goal with this Special Issue was to put together a collection of papers that contribute to our understanding of algorithmic law as a constitutive, resolutive, preventive, allocative and normative element of our contemporary societies, and to foster further contributions in the academic literature that may reflect and promote our knowledge of the law-jobs in the algorithmic society.

3 Contributions to algorithmic law: law-jobs and the quest for values

Legal phenomena may have many points, such as to control, to prevent, to punish, to co-ordinate, to constitute, to symbolise, to regulate, to facilitate, to educate and to allocate, to name just a few of the functions identified by jurists (Twining, 2009). These points are supported by social practices established or maintained by coercive power that may lead to obedience, consent or normative acceptance by those individuals subject to them (Twining, 2009). Once a pattern of behaviour emerges and becomes broadly recognised through shared intersubjective meaning for the group participants, this social practice could be identified as legal, depending on the specific contexts and degrees of institutionalisation, normativity and effectiveness (Twining, 2009). Instead of searching for a grand theory and for the rules of recognition of lex internetica, our conception of algorithmic law focuses primarily on the empirical performance of real legal phenomena ('law in action') and on how the law-jobs in the algorithmic society are addressed by institutions that are recognised as constitutive, resolutive, preventive, allocative or normative.

In this sense, Alice De Jonge examines how best to protect the basic human rights of vulnerable minorities in the context of 'smart cities' projects in Southeast Asia, by analysing twenty-six smart city projects within the ASEAN Smart Cities Network (ASCN) launched in April 2018. By focusing on the human rights impacts of ASCN pilot city transport systems and energy systems, she examines data collection and information use through the perspective of the Knowledge Commons Framework (KCF) and how institutions govern the production and management of knowledge around three pillars – knowledge resources, community attributes, and governance rules and institutions (De Jonge, in this issue). Data-driven technologies have transformed cities and negatively affected the lives of vulnerable minority communities through injustices resulting from algorithmically determined social security decisions, predictive policing and surveillance technologies (De Jonge, in this issue). As revealed by the failure of earlier tabula rasa cities such as Brasilia, cities are not technological problems, but rather complex ecosystems of human connections shaped by culture, history and politics (De Jonge, in this issue). One key lesson of smart city experience comes from the analysis of the interconnection and interaction of urban systems as part of the management of knowledge resources, especially the warning that the main goal of planning may not be to achieve idealised goals of competitiveness and efficiency, due to risks of unintended social consequences and the perpetuation of institutional bias (De Jonge, in this issue). Each ASCN pilot city project remains located within a unique historical, cultural and political context, being facilitated by high social capital and shared trust that foster the effectiveness of public policies and the resilience of urban operations (De Jonge, in this issue). Even if the academic literature emphasises the importance of citizens' participation and the active involvement of all stakeholders in the smartification of the city, Alice De Jonge found no evidence that these smart city projects benefit the most disadvantaged and criticised the lack of mention of questions of citizen voice, mobilisation or social change in these ASCN pilot projects (De Jonge, in this issue). According to her, ASEAN governance dynamics and institutions have not always been consistent with the protection of human rights, and accountability and transparency are central challenges of smart city governance (De Jonge, in this issue). In terms of urban mobility and accessibility, the challenges of providing solutions for making streets navigable for children, the elderly and the disabled are based on political, social and cultural considerations, rather than algorithmic ones (De Jonge, in this issue). Likewise, the ethical dimensions of smart traffic control technology design are often neglected, as true winners and losers are not accurately identified (De Jonge, in this issue). Furthermore, she considers irritation over traffic congestion easier to sell politically than moral indignation at automobile-related technology (De Jonge, in this issue). In terms of access to reliable energy, recognised as a basic human right, diversity and flexibility, together with transparency and accountability, should become important considerations in areas of regulatory concern for energy systems (De Jonge, in this issue).
Once again, decisions are made by human actors in relational contexts and should take into consideration the fact that vulnerable consumers may require greater reliability of energy supply while being less able to pay (De Jonge, in this issue). As quality of life implies the full enjoyment of human rights, smart city planning should treat as a central question the improvement of quality of life and of people's experience within the smart city (De Jonge, in this issue).

In their contribution to this collection, Larry Catá Backer and Matthew B. McQuilla analyse the rise of algorithmic systems of data-driven governance in the form of systems rating businesses' respect for their human rights responsibilities. Their objective is to advance the discourse of algorithmic law between the line of scholarship that pursues pragmatic issues related to the regulation of potential costs and benefits and the line that investigates the consequences related to the rise of platforms used to support the structures and operations of algorithmic law (Catá Backer and McQuilla, in this issue). Ratings-based regulatory structures function as a gateway for developing predictive analytics with regulatory potential and advance a discussion on algorithmic law's role in international human rights law, especially because a set of normative ideals may be reduced to a set of measurable inputs (Catá Backer and McQuilla, in this issue). The fusion of traditional law and algorithmic analytics takes place in a context of norm identification, data accessibility and integrity, and the consequences of these ratings and the derived judgments (Catá Backer and McQuilla, in this issue). In terms of functions, traditional law provides a constitutive and quality-control structure, and the regulatory-administrative operation situated within the rating systems provides orientation for the conformity demanded in the pursuit of higher ratings (Catá Backer and McQuilla, in this issue). Their research focuses on the ratings-based regulation of human trafficking, because of the normative consensus on core normative principles reached by states and public international bodies (Catá Backer and McQuilla, in this issue). The sources for forced labour rating systems typically come from a pool of data and records semi-voluntarily disclosed by corporations in the mode of self-regulation, and eventually the rating systems share indicators closely aligned with International Labour Organization (ILO) standards. Some governments also provide support with coercive laws for data collection, transparency and preventive measures, but state participation remains marginal and supplementary because these national regimes are not sufficiently consistent or co-ordinated (Catá Backer and McQuilla, in this issue). In the case of the Financial Times Stock Exchange (FTSE), the methodology requires external sources and referenced indicators, and comprises six rating sections, which include business and supply-chain structure, policies, due diligence, risk assessment, effectiveness and training. Importantly, the scoring methodology produces a cumulative score based on an equation that processes structured and detailed quantitative information derived from qualitative data (Catá Backer and McQuilla, in this issue). In the case of Know The Chain (KTC), the methodology results from a non-disclosed formula that processes an amalgamation of twenty-three indicators based on the management of a diffused system of data generation, analytics and ratings construction, tracking information from thousands of companies across the globe on their performance in handling human rights (Catá Backer and McQuilla, in this issue). In the case of Green America, the methodology consists of a grading system scorecard with a non-disclosed formula that rates chocolate companies for their commitment to using certified cocoa in their production and their supply chain (Catá Backer and McQuilla, in this issue).
Their study reveals a regulatory space composed of a complex combination of state actors, non-governmental organisations and private corporations that are affected by their reciprocal interaction as stakeholders of the rating systems: rules are simultaneously derived from the collection, analysis and production of data, leading to rule-making and accountability together, in a style of governance that provides power for standard setting and normative sanctioning through reflexive feedback loops (Catá Backer and McQuilla, in this issue).
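To illustrate the kind of scoring equation at stake, the sketch below is a minimal, hypothetical example of how qualitative indicator judgments might be translated into a cumulative numeric rating; the section names, weights and qualitative-to-numeric mapping are our own inventions and do not reproduce any actual FTSE, KTC or Green America methodology.

```python
# Hypothetical sketch: turning qualitative assessments into a cumulative rating score.
# Sections, weights and the qualitative-to-numeric mapping are invented for illustration.

QUALITATIVE_SCALE = {"none": 0.0, "partial": 0.5, "full": 1.0}

SECTION_WEIGHTS = {
    "supply_chain_structure": 0.20,
    "policies": 0.15,
    "due_diligence": 0.25,
    "risk_assessment": 0.15,
    "effectiveness": 0.15,
    "training": 0.10,
}

def cumulative_score(assessments: dict) -> float:
    """Weighted sum of numeric values mapped from qualitative judgments (0-100 scale)."""
    total = 0.0
    for section, weight in SECTION_WEIGHTS.items():
        total += weight * QUALITATIVE_SCALE[assessments.get(section, "none")]
    return round(100 * total, 1)

# Usage example with invented disclosures for a fictional company
example = {
    "supply_chain_structure": "full",
    "policies": "partial",
    "due_diligence": "partial",
    "risk_assessment": "full",
    "effectiveness": "none",
    "training": "full",
}
print(cumulative_score(example))  # 65.0
```

Even in this toy version, the choice of weights and of the mapping from qualitative judgments to numbers is itself a normative decision, which is why the authors insist on the standard-setting power embedded in such formulas.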

A further contribution to our Special Issue on the law-jobs in the algorithmic society comes from Pablo Marcello Baquero, who explores the rising concerns about data protection that have led to the identification of challenges and potential solutions to guarantee privacy in blockchain platforms, especially through the incorporation of GDPR compliance technologies into their design. Baquero reminds us that these regulatory choices involve legal and policy decisions related to the implementation of human participation and contestability and the protection of the rule of law within these platforms (Baquero, in this issue). From his perspective, design should not aim for complete automation without human participation, engagement and oversight, so that these systems preserve human agency, checks and balances, and social participation (Baquero, in this issue). Examining the vulnerabilities of data in blockchain platforms, he considers that information will often be considered personal data and that privacy is far from being guaranteed (Baquero, in this issue). As the main obstacles to the application of data protection rules to blockchains, the lack of a clearly identifiable controller liable for privacy violations in permissionless blockchains and their allegedly 'immutable' character emerge as sources of burdensome processes for the protection of the rights to rectify inaccurate data and to be forgotten, and for the application of established GDPR principles (Baquero, in this issue). In the context of the EU, therefore, the data protection literature has either considered the GDPR inadequate or sought alternatives to these different challenges, which fosters a discussion on the potential contributions of privacy-by-design within blockchain applications that may embed different legal and policy choices for data protection (Baquero, in this issue). By distinguishing between trust in the technical system and trust as reliance on human individuals, Baquero problematises full anonymisation and the secure off-chain storage of data in the blockchain as potentially detrimental for different actors, and calls for a reflection on the important policy decisions behind technological architecture (Baquero, in this issue). Strategies for regulation could include the certification of blockchain platforms and the determination of who will be responsible for potential privacy violations in the blockchain (Baquero, in this issue). The discussion on liability should define which parties are responsible for privacy violations in different contexts and eventually require the indication of the responsible entity or party (Baquero, in this issue). In his opinion, the current disregard of the social and human element by the prevailing techno-regulatory approach risks undermining the role of privacy compliance technologies in the blockchain (Baquero, in this issue).

Finally, Christophe Lazaro and Marco Rizzi contributed their work on predictive analytics and governance as a new sociotechnical imaginary for uncertain futures. Lazaro and Rizzi examine the functional capacities of algorithmic devices as methods for optimising decision-making processes and anticipating risks, which reveal the obsession of our algorithmic society with prediction and anticipation of the future. Adopting the terminology 'predictive analytics' to refer to technologies like predictive modelling, machine learning and data mining that facilitate the analysis of past and present data to make predictions about the future, Lazaro and Rizzi map the landscape of the constitutive threads of predictive analytics as an instrument for governance. Inspired by the insight from François Ost that law may also be understood as a mode of anticipation, Lazaro and Rizzi invite us to reflect on legal operations as a cognitive and pragmatic resource that serves as a guide, a constraint and a vector of anticipation too, by supporting human co-ordination as a key instrument of governance (Lazaro and Rizzi, in this issue). Algorithmic anticipatory logics may allow governance to move beyond a risk-based approach and towards the point of 'what is not and may never happen' through predictive analytics, in a new context of 'ontopolitics' and the selection of forms of life to be valorised and eventually preserved to cope with future uncertainty (Lazaro and Rizzi, in this issue). Their mapped landscape is composed of the emerging sociotechnical imaginary and the context of official documents and discourses on AI and predictive analytics within European and state institutions, which provide fertile ground to grasp the collective imagination on the emergence, tensions and reconfiguration of temporalities related to 'predictive analytics' as it affects links between past, present and future with normative consequences (Lazaro and Rizzi, in this issue). A world shaped by contingency requires real-time prediction as a tool for control and autonomy, as revealed in discourses and documents by the European Commission and the Council of Europe (Lazaro and Rizzi, in this issue). Paradoxically, predicting in real time sounds contradictory, because the actual future always triumphs over countless possible alternatives by becoming actualised; yet the ambition to 'predict the present' has resulted in the neologism 'nowcasting' to supplement the conventional 'forecasting' (Lazaro and Rizzi, in this issue). In their description of this sociotechnical imaginary, 'presentism' and 'de-futurisation' appear as new ways of articulating categories of time, our temporal horizons and our normative assessment of a life lived in permanent and vigilant adaptation to demands of productivity, flexibility and mobility (Lazaro and Rizzi, in this issue). The politics of temporality may foster the development of mathematical formulas to calculate all phenomena and establish a predictive score for any entity, as part of organisational processes typical of a score society (Lazaro and Rizzi, in this issue). Lazaro and Rizzi refer to the plurality of forms of knowledge inferred from AI systems as 'epistemological heterogeneity'; this knowledge enables performative operations that establish the presence of the future in different ways and calls into question the type of rationality behind 'predictive analytics' (Lazaro and Rizzi, in this issue).
Likewise, predictive practices are also normative, because the authority of these machine-learning algorithms both requires and justifies the use of these logics of action (precaution, pre-emption, preparedness) in the here and now (Lazaro and Rizzi, in this issue). In this new sociotechnical imaginary, anticipatory techniques may allow humans to regain a measure of control and autonomy in a contingent world, but there are critical points related to (1) temporality and materiality; (2) knowledge and action; (3) subject and object; (4) the virtual and the possible; (5) the past, the present and the future (Lazaro and Rizzi, in this issue). Their recommendation points towards pluralising the future and preserving an ‘ecology of futures’ for policy-makers, lawyers, experts, stakeholders and citizens to make decisions in the context of the contingent life of our societies (Lazaro and Rizzi, in this issue).
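As a purely illustrative aside on the mechanics behind such tools, the sketch below (hypothetical numbers, ordinary least squares via NumPy) shows the simplest possible form of 'nowcasting': fitting a model on past observations and using it to estimate the not-yet-observed current value. Real predictive analytics systems are, of course, far more elaborate, and nothing here reproduces any method discussed by Lazaro and Rizzi.

```python
# Minimal, hypothetical 'nowcasting' sketch: fit a linear trend on past periods
# and estimate the current, not-yet-observed period.

import numpy as np

# Invented quarterly indicator values for periods 0..7 (the past)
periods = np.arange(8)
values = np.array([2.0, 2.3, 2.1, 2.6, 2.8, 2.7, 3.1, 3.3])

# Ordinary least squares fit of a linear trend: value ~ a * period + b
a, b = np.polyfit(periods, values, deg=1)

# 'Nowcast' the current period (period 8), for which no observation exists yet
current_period = 8
nowcast = a * current_period + b
print(f"estimated value for period {current_period}: {nowcast:.2f}")
```

The normative questions the authors raise arise precisely because such estimates, however crude or sophisticated, are then treated as grounds for action in the present.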

4 Final remarks: an agenda for law-jobs in the algorithmic society

Karl Llewellyn considered that jurisprudence would be more fruitful if it focused 'on the jobs which law is there to help get done', 'viewed the crafts as a major tool for doing the jobs' and treated the rules of law 'as a major tool for the use of the crafts, and for partial control of the craftsmen' (Llewellyn, 2011, p. 64). In the algorithmic society, the theory of the law-jobs may help lawyers, legal scholars, software developers and policy-makers to understand that embedding normative commands within mathematical formulas may get the job done in terms of providing access to transportation and energy in smart cities, protecting human rights and privacy rights, or examining the optimisation of predictive analytics in decision-making processes. Our Special Issue brings together a collection of papers that powerfully analyse the 'point' behind innovative technologies, revealing the challenges of constituting a normative space, orienting expectations, providing solutions for trouble-cases, arranging procedures and allocating authority that may eventually legitimate power. With the exponential growth of digital technologies, the agenda for law-jobs in the algorithmic society will also evolve and become richer and more diverse. The 'smartification' of cities, labour, contracts and decision-making challenges us to reflect on the law-jobs as what needs to get done, how to get it done and who should be involved in using the crafts and in the control of the craftsmen. Inspired by these insights, our own prediction is that the algorithmic society holds a large agenda for the law-jobs theory, and we use this paper and the Special Issue as a first step to advance our understanding of the who, how and what of the law-jobs in the context of our contemporary algorithmic society. We hope this perspective will attract more scholarship in the years to come.

Acknowledgements

We are grateful to William Twining for his comments on the final draft of this paper. We would also like to thank all the authors and speakers who joined the Algorithmic Law and Society Symposium at HEC Paris in December 2021. Their presentations and remarks were useful to refine the ideas presented in this paper. It goes without saying that all remaining errors, mistakes and controversial points of view remain our own.

Conflicts of Interest

None

References

Ackerman, B (2010) Good-bye Montesquieu. In Rose-Ackerman, S et al. (eds), Comparative Administrative Law. Cheltenham: Edward Elgar Publishing, pp. 128–133.
Arner, D et al. (2016) Fintech, regtech, and the reconceptualization of financial regulation. Northwestern Journal of International Law and Business 37, 371–413.
Balkin, J (2017) The three laws of robotics in the age of big data. Ohio State Law Journal 78, 1217–1241.
Carolan, E (2009) The New Separation of Powers: A Theory for the Modern State. Oxford: Oxford University Press.
Cavaliere, P and Romeo, G (2022) From poisons to antidotes: algorithms as democracy boosters. European Journal of Risk Regulation, 1–25.
Collingwood, R (1992) The New Leviathan: Or Man, Society, Civilization, and Barbarism; Goodness, Rightness, Utility; and What Civilization Means. Oxford: Oxford University Press.
De Montesquieu, C (1989) Montesquieu: The Spirit of the Laws. Cambridge: Cambridge University Press.
Eubanks, V (2018) Automating Inequality: How High-tech Tools Profile, Police, and Punish the Poor. New York: St. Martin's Press.
Ezrachi, A (2016) Virtual Competition: The Promise and Perils of the Algorithm-driven Economy. Cambridge: Harvard University Press.
Fortes, P (2015) How legal indicators influence a justice system and judicial behavior: the Brazilian National Council of Justice and 'justice in numbers'. The Journal of Legal Pluralism and Unofficial Law 47, 39–55.
Fortes, P (2019) An explorer of legal borderlands: a review of William Twining's Jurist in Context: A Memoir. REI-Revista Estudos Institucionais 5, 777–790.
Fortes, P (2020a) O consumidor contemporâneo no Show de Truman: a geodiscriminação digital como prática ilícita no direito brasileiro [The contemporary consumer in the Truman Show: digital geo-discrimination as an illegal practice in Brazilian law]. Revista de Direito do Consumidor 124, 235–260.
Fortes, P (2020b) Paths to digital justice: judicial robots, algorithmic decision-making, and due process. Asian Journal of Law and Society 7, 453–469.
Fortes, P (2021) Hasta la vista, baby: reflections on the risks of algocracy, killer robots, and artificial superintelligence. Revista de la Facultad de Derecho de México 71, 45–72.
Fortes, P, Baquero, PB and Restrepo Amariles, D (2022) Artificial intelligence risks and algorithmic regulation. European Journal of Risk Regulation 13, 357–372.
Fortes, P, Martins, GM and Oliveira, PF (2021) Digital geodiscrimination: how algorithms may discriminate based on consumers' geographical location. Droit et société 107, 145–166.
Frydman, B (2012) Comment penser le droit global? [How to think about global law?]. In Cherot, J-Y and Frydman, B (eds), La science du droit dans la globalisation, pp. 17–48.
Frydman, B (2014) A pragmatic approach to global law. In Muir Watt, H and Arroyo, D (eds), Private International Law and Global Governance. Oxford: Oxford University Press, pp. 182–201.
Frydman, B and Rorive, I (2002) Regulating Internet content through intermediaries in Europe and the USA. Zeitschrift für Rechtssoziologie 23, 41–60.
Gordon, G et al. (2022) On mapping values in AI governance. Computer Law and Security Review 46, published online 22 September 2022, https://doi.org/10.1016/j.clsr.2022.105712.
Hart, HLA (2012) The Concept of Law, 3rd edn. Oxford: Oxford University Press.
Holmes, OW (1897) The path of the law. Harvard Law Review 10, 457–478.
Kelsen, H (1967) Pure Theory of Law. Berkeley: University of California Press.
Klonick, K (2021) Inside the making of Facebook's Supreme Court. The New Yorker, 12 February.
Lessig, L (2012) Republic, Lost: How Money Corrupts Congress – and a Plan to Stop It. New York: Twelve.
Llewellyn, K (1934) The Constitution as an institution. Columbia Law Review 34, 1–40.
Llewellyn, K (2011) The Theory of Rules. Chicago: University of Chicago Press.
Llewellyn, K and Hoebel, A (1941) The Cheyenne Way: Conflict and Case Law in Primitive Jurisprudence. Norman: University of Oklahoma Press.
Lodder, A and Zeleznikow, J (2012) Artificial intelligence and online dispute resolution. In Rainey, D et al. (eds), Online Dispute Resolution: Theory and Practice: A Treatise on Technology and Dispute Resolution. The Hague: Eleven International Publishing, pp. 73–94.
Luhmann, N (1983) Legitimation durch Verfahren [Legitimation through Procedure]. Frankfurt am Main: Suhrkamp.
Margetts, H and Dorobantu, C (2019) Rethink government with AI. Nature 568, 163–165.
Mehra, S (2015) Antitrust and the robo-seller: competition in the time of algorithms. Minnesota Law Review 100, 1323–1375.
O'Neil, C (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Broadway Books.
Ost, F and Van de Kerchove, M (2010) De la pyramide au réseau? Pour une théorie dialectique du droit [From Pyramid to Network? For a Dialectical Theory of Law]. Brussels: Presses de l'Université Saint-Louis.
Perez, C (2019) Invisible Women: Data Bias in a World Designed for Men. New York: Abrams.
Ranchordás, S and Scarcella, L (2021) Automated government for vulnerable citizens: intermediating rights. William & Mary Bill of Rights Journal 30, 373–428.
Restrepo Amariles, D (2015) Legal indicators, global law and legal pluralism: an introduction. The Journal of Legal Pluralism and Unofficial Law 47, 9–21.
Restrepo Amariles, D (2017a) Supping with the Devil? Indicators and the rise of managerial rationality in law. International Journal of Law in Context 13, 465–484.
Restrepo Amariles, D (2017b) Transnational legal indicators: the missing link in a new era of law and development. In Fortes, P et al. (eds), Law and Policy in Latin America. London: Palgrave Macmillan, pp. 95–111.
Restrepo Amariles, D (2021) Algorithmic decision systems: automation and machine learning in the public administration. In Barfield, W (ed.), The Cambridge Handbook of the Law of Algorithms. Cambridge: Cambridge University Press, pp. 273–300.
Restrepo Amariles, D and Lewkowicz, G (2017) De la donnée à la décision: comment réguler par des données et des algorithmes? [From data to decision: how to regulate through data and algorithms?]. In Bouzeghoub, M and Mosseri, R (eds), Les Big Data à découvert. Paris: CNRS Editions, pp. 92–93.
Restrepo Amariles, D and Lewkowicz, G (2020) Unpacking smart law: how mathematics and algorithms are reshaping the legal code in the financial sector. Lex Electronica 25, 171–185.
Restrepo Amariles, D and McLachlan, J (2017) Legal indicators in transnational law practice: a methodological assessment. Jurimetrics 58, 163–209.
Schmidt-Kessen, MJ et al. (2022) Machines that make and keep promises: lessons for contract automation from algorithmic trading on financial markets. Computer Law & Security Review 46, published online September 2022, https://doi.org/10.1016/j.clsr.2022.105717.
Susskind, J (2018) Future Politics: Living Together in a World Transformed by Tech. Oxford: Oxford University Press.
Tamanaha, B (2006) Law as a Means to an End: Threat to the Rule of Law. Cambridge: Cambridge University Press.
Tamanaha, B (2017) A Realistic Theory of Law. Cambridge: Cambridge University Press.
Teubner, G (2012) Constitutional Fragments: Societal Constitutionalism and Globalization. Oxford: Oxford University Press.
Twining, W (1985) Talk about realism. New York University Law Review 60, 329.
Twining, W (1994) Blackstone's Tower: The English Law School. London: Stevens & Sons/Sweet & Maxwell.
Twining, W (1997) Law in Context: Enlarging a Discipline. Oxford: Oxford University Press.
Twining, W (2002) The Great Juristic Bazaar: Jurists' Texts and Lawyers' Stories. London: Routledge.
Twining, W (2009) General Jurisprudence: Understanding Law from a Global Perspective. Cambridge: Cambridge University Press.
Twining, W (2012) Karl Llewellyn and the Realist Movement, 2nd edn. Cambridge: Cambridge University Press.
Twining, W (2019) Jurist in Context: A Memoir. Cambridge: Cambridge University Press.
Zittrain, J (2008) The Future of the Internet – and How to Stop It. New Haven: Yale University Press.
Zuboff, S (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: Public Affairs.