
Smooth Operators, Predictable Glitches: The Interface Governance of Benefits and Borders

Published online by Cambridge University Press: 11 August 2023

Jennifer Raso*
Affiliation: McGill University Faculty of Law, Canada. Email: jennifer.raso@mcgill.ca

Abstract

This article examines the phenomenon of interface governance. It uses two interface technologies—Universal Credit’s digital account (United Kingdom) and ArriveCAN (Canada)—to explore how interfaces and their predictable glitches govern relations between state officials and members of the public. Drawing on tools of government literature, it argues that interfaces do not evenly achieve their stated goals (improved efficiency, digital literacy). Instead, they generate several unintended effects, including heightened bureaucratic intensity, diffused responsibility, and even eroded public trust in state agencies. It urges socio-legal and administrative justice scholars to take interfaces seriously and to adopt socio-legal-technical methods to better conceptualize the effects of infrastructure governance and to imagine other possibilities for public administration.

Résumé

Cet article analyse le phénomène de la gouvernance des interfaces. Il utilise deux technologies d’interface – le compte numérique du Crédit Universel (Royaume-Uni) et ArriveCAN (Canada) – pour examiner comment les interfaces et leurs problèmes prévisibles régissent les relations entre les représentants de l’État et les membres du public. S’appuyant sur la littérature relative aux instruments de l’action publique, l’article soutient que les interfaces n’atteignent leurs objectifs déclarés (amélioration de l’efficacité, littératie numérique) que de manière inégale. Elles génèrent plutôt plusieurs effets inattendus, notamment une lourdeur bureaucratique accrue, une diffusion de la responsabilité et même une érosion de la confiance du public dans les organismes étatiques. L’article invite ainsi les spécialistes de la recherche sociojuridique et de la justice administrative à prendre les interfaces au sérieux, tout comme il invite ces spécialistes à adopter des méthodes socio-juridico-techniques afin de mieux conceptualiser les effets de la gouvernance des infrastructures et d’imaginer d’autres possibilités pour l’administration publique.

Type
Articles
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of the Canadian Law and Society Association

Introduction

Across government agencies, interfaces and their glitches have become features of public administration. We encounter them whenever we apply for benefits or student loans, renew identity documents, file tax forms, cross a border, or discover that we are the subject of a public debt (Eubanks 2017; Yeung and Lodge 2019; Van Den Meerssche 2022; Carney 2019). This phenomenon is not entirely new (Margetts and Partington 2010; Clarke 2020). Yet it has recently accelerated and become entrenched. Scholars studying how states use technology have sought to describe and conceptualize these trends in different ways. For example, Amoore’s studies of risk calculation and cloud computing illuminate how computing and risk management techniques elevate the data attributes of people (2013, 2020). Tomlinson’s work illustrates how technologists’ agile design approaches challenge the incremental and hierarchical institutional processes of administrative agencies (2019). This article extends this work. Drawing on socio-legal and tools of government studies, it develops the concept of interface governance to analytically unify these phenomena and to theorize the upshots of digital government and predictable glitches for public administration.

As a concept, interface governance describes the situation whereby government agencies administer benefits, services, and sanctions through digital interfaces (websites, smartphone apps). These interfaces mediate interactions between administrative officials and members of the public. Like all technologies, interfaces also malfunction. These “glitches,” too, have governance consequences. While digital government existed before the COVID-19 pandemic, the pandemic accelerated both government use of digital interfaces and individuals’ engagement with those interfaces. By focusing on digital interfaces and their predictable glitches, this article explores how interfaces govern relations between state officials and members of the public and how they affect public administration.

This article argues that interface governance and its predictable glitches have become paradigmatic within many front-line agencies that make decisions about crucial entitlements and penalties. Governments often introduce interfaces into such settings to improve efficiency and increase digital literacy, but interfaces regularly fall short of these goals. Instead, they encourage officials to shift responsibility away from themselves while imposing a unique type of responsibilization on members of the public. Individuals must be regularly available to the interface (and to the officials on the other side of it) despite a lack of reciprocal responsiveness on the part of state officials. Glitches provide officials with an opportunity to further diffuse responsibility. Because they affect large numbers of people, and because public officials are often unprepared or unwilling to deal with their aftermath, glitches also risk fundamentally eroding public trust in state institutions.

Understanding glitches as a central component of interface governance is crucial, and this article uses sociologically minded tools of government literature to assist in this endeavour (Lascoumes and Le Galès 2007; Dunleavy et al. 2006). This literature helpfully attends to both the intended and unintended consequences of policy “instruments.” It is also useful because it understands tools in relation to the networks in which they are embedded, while highlighting how a specific tool influences institutional functioning, rather than losing such details amidst a flattened topography of networked actors. Despite its clear relevance, however, tools of government literature has remained surprisingly underutilized in contemporary studies of digital government and algorithmic decision-making.

To ground its arguments, this article relies upon social benefit and international border governance examples from the United Kingdom and Canada, two states undertaking parallel yet distinct digital government initiatives. It examines Universal Credit, a pathbreaking digital-by-default UK social assistance program, and ArriveCAN, a controversial Canadian border governance tool. These strategically chosen examples allow us to draw lessons across jurisdictions, to highlight similarities, and to signal related concerns. Benefits and border security agencies are both frequent targets of digital reform (Achiume 2020; Alston 2019). These reforms are often undertaken to enhance decision-making efficiency, a laudable goal because, in both settings, administrators face high volumes of applications that determine crucial rights and interests (access to last-resort benefits, the ability to cross the border). These administrative decisions are also more or less final because they are incredibly difficult to challenge. As a result, getting these decisions right the first time takes on added importance (Adler 2022). At the same time, both programs are politically controversial and vitally important to the people they impact. This feature seems to have heightened technologists’ and senior public officials’ interest in digital reforms to enhance efficiency and standardize decision-making practices. Consequently, today both social assistance and borders feature a range of interfaces governing interactions between administrative officials and members of the public. These interfaces operate from the moment that benefits and services are introduced to the instant that penalties are imposed.

The fact that both programs have a relatively long history of governing through interfaces is another reason they are analyzed together. If glitches were to occur anywhere, we might reasonably expect them to be better anticipated and addressed in settings such as these. We might even assume that responses to glitches in benefits and border settings would be more coordinated or sophisticated. But these assumptions are misplaced. It is thus useful to consider how glitches form part of the bureaucratic organization of both benefits and border administration.

This article draws on evidence collected from an ongoing study of how digitalization impacts front-line decision-making. It uses a multi-site approach that draws observations from an expansive networked field (Sullivan 2022). Because digital government tools, such as interfaces, link to and are constituted by broader algorithmic and administrative systems, they cannot be studied in isolation or located at one research site. Instead, they must be examined in the many places where they evolve and where their effects are observable. For this research, these sites include confidential meetings with public officials and technologists and observation of these tools in action from 2020 forward. They also include findings communicated in government documents, media accounts, and the fieldwork reports of others. While these methods may seem idiosyncratic, they have long been used to examine an elusive object or community (Gusterson 1996; Hannerz 2003; Coleman 2015). Today, they are increasingly common in qualitative studies of decision-making instruments (Hull 2012; Seaver 2017; Burrell 2023).

The argument proceeds in three main parts. The first part introduces, defines, and explains the importance of both interface governance and front-line settings. The second part illustrates how interfaces govern in the context of Universal Credit and ArriveCAN. It details each program before exploring the intended and unintended effects of interface governance and predictable glitches. The last part concludes with recommendations for future research.

Interface Governance and Expansive Front Lines

Today, digital interfaces—websites, smartphone apps—are the front counter of many government agencies. Individuals must engage with these interfaces as they seek to access, or to challenge a refusal of, benefits or services. Interfaces modulate interactions between members of the public and public officials, acting as a screen between them and even “making” decisions themselves. This phenomenon has led socio-legal scholars to describe the contemporary state as distributed. Its institutions exist both in conventional physical locations (administrative offices, for instance) and in seemingly omnipresent digital places, such as laptops and smartphones. As Fourcade and Gordon suggest, the state is increasingly both omnipresent and remote. It exists on personal computers in our homes and in rural data servers (2020, 83). In theory, the digitalization of government agencies suggests that state actions should be more easily knowable or accessible, although this impression may be generated by an aesthetics of transparency rather than by truly transparent procedures (Valverde et al. 2018). The digitalization, or digital interfacing, of government is an international phenomenon. It affects states across the global north/global south spectrum and extends into international governance regimes (Adelmant et al., forthcoming; Johns 2023). Although it preceded the COVID-19 pandemic, the transition from in-person to remote service delivery mechanisms has accelerated since early 2020. The effects of governments’ digital interfaces have thus taken on (re)new(ed) significance for socio-legal and administrative justice scholars.

This distribution of front-line agencies through websites, apps, and call centres intensifies the phenomenon of interface governance. Interfaces are a distinct type of “tool” or “instrument” of government (Lascoumes and Le Galès 2007). As such, they are socio-legal-technical things. Their functionality is determined by their technical features (computer code, databases, hardware) and their social characteristics, such as how members of the public, state officials, and others engage with the interface. They mediate decisions about individuals’ legal entitlements (to benefits, services, status, etc.) and decisions that impose penalties or restrictions. Anyone who engages with a state agency today is affected by human officials plus one or more interfaces. Officials together with the other elements that make up a digital interface determine the benefits or services to which someone will be entitled. The interface, along with the software, data, hardware, cables, and people who animate it, may identify tasks that an individual must complete to remain eligible for benefits or to comply with border control rules, calculate their entitlements, debts, or penalties, or direct them to engage further with a particular government agency. While officials may help produce these decisions on the other side of the interface, their work becomes less perceptible to those on the public-facing side (Eubanks 2017). This obfuscation is one method by which the interface mediates and shapes interactions between officials and so-called “program users” (Adler 2022). Interfaces are thus distinct governance tools that regulate front-line officials, individuals who interact with them, and the wider public.

Digital interfaces are often first introduced into last-resort front-line settings, which are the focus of this article. Front-line agencies often make complex determinations about crucial programs and services, from income assistance or disability benefits to customs and immigration enforcement. In many instances, the agencies responsible for such decisions handle large volumes of cases. Each decision may significantly impact the rights, privileges, and interests of the persons involved.

Front-line decisions perform many functions. They may evaluate an individual’s eligibility for a benefit or privilege, assess fraud or security risks, or calculate and levy debts and penalties. Front-line decisions may be made when someone requests something from the state, such as emergency financial assistance. For example, during the early days of the COVID-19 pandemic in Canada, a recently unemployed person might have applied for the Canada Emergency Response Benefit (CERB) to receive temporary financial assistance while searching for a new job. Although they would submit their application through an online portal on a government website, their supporting documents were eventually reviewed months later by a public official to confirm eligibility (Robson 2020). Front-line decisions also occur when someone is targeted by state officials. In the CERB program, auditing software may identify a CERB recipient whose supporting documents misalign with income tax returns filed with the Canada Revenue Agency (CRA). A CRA representative may then contact this individual by email to request further documentary proof of their previous income. Once these documents are submitted, a civil servant determines whether the available evidence sufficiently establishes eligibility or whether a debt plus interest is owed to the Canadian government.
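
To make the data-matching step concrete, the sketch below shows, in purely illustrative terms, how auditing software of this kind might compare an applicant’s declared prior income against tax-return data and queue mismatches for human review. The field names, data structures, and matching rule are assumptions introduced for illustration, not a description of the CRA’s actual systems; only the widely reported $5,000 prior-income eligibility threshold is drawn from the public record.

```python
from dataclasses import dataclass

# Hypothetical illustration only: the $5,000 prior-income threshold mirrors the
# publicly reported CERB eligibility rule, but the data fields and matching
# logic here are assumptions, not the CRA's actual audit software.
CERB_MIN_PRIOR_INCOME = 5000.00

@dataclass
class CerbClaim:
    claimant_id: str
    declared_prior_income: float   # income the applicant reported when applying

@dataclass
class TaxRecord:
    claimant_id: str
    assessed_income: float         # income from the filed tax return

def flag_for_review(claims: list[CerbClaim], tax_records: list[TaxRecord]) -> list[str]:
    """Return claimant IDs whose declared income misaligns with tax data.

    Flagged files are queued for a human official, who requests further
    documentation and decides whether a debt (plus interest) is owed.
    """
    assessed = {r.claimant_id: r.assessed_income for r in tax_records}
    flagged = []
    for claim in claims:
        tax_income = assessed.get(claim.claimant_id)
        if tax_income is None or tax_income < CERB_MIN_PRIOR_INCOME:
            # Mismatch or missing return: route to an official, do not decide here.
            flagged.append(claim.claimant_id)
    return flagged
```

The point of the sketch is the division of labour it encodes: software flags, while a human official requests documents and ultimately decides whether money is owed.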

Front-line decisions, and the bureaucracies that produce them, are regularly critiqued as slow, opaque, biased, and unaccountable. In response, agencies regularly introduce “innovative” interface technologies to minimize backlogs and to increase transparency, impartiality, and public accountability. Twenty years ago, these tools might have accompanied new public management or e-government reforms promising a customer- or client-centred approach to public service delivery (Hood 2008, 473). More recently, they have even been accompanied by promises that digital government will “end bureaucracy” (Lascoumes and Simard 2011).

Relying on “hyper-modernist” tools of government to tackle these problems tends to be counterproductive, however, and introduces its own challenges (Margetts et al. 2010). This phenomenon, which some call “dynamic conservatism” (Schön 1971), involves changes that ensure social relationships remain the same, for example by embedding their characteristics within an institution’s technical features. For almost a century, computers have been central to the work of many state agencies. Consequently, these technologies have themselves shaped the bureaucratic characteristics that are often critiqued (Augarten 1984, 183). When government agencies delegate decision-making power to interfaces to escape bureaucracy, these interfaces may take on a mystical aura that makes it difficult to recognize and challenge how they generate or exacerbate some of the problems that they have been introduced to solve (Citron 2008). What is more, when governments use interfaces to tackle deep-set bureaucratic dilemmas, these interfaces can actually intensify administration. This phenomenon may be more central to modern states than the size of their bureaucracies or the number of their public officials (Weber 1978, 348; Valverde 2021). While administrative intensity may share some of the features of administrative burden, such as increased (digital) paperwork (Herd and Moynihan 2018), it is distinct because it expands administrative labour inwards to public officials from other agencies and outwards to others (employers, librarians, and charity workers, for instance).

Interface governance thus seeks to capture these features of contemporary administration and to conceptualize both the anticipated and unanticipated regulatory effects of interface technologies. In doing so, it applies the methods and insights of sociologically oriented tools of government scholars, bringing them into conversation with the growing multidisciplinary literature on digital government to better understand how interfaces mediate relations between public officials and members of the public. Ultimately, this approach aims to distill lessons for governance scholarship more widely and for socio-legal and administrative justice scholars specifically.

As the Universal Credit and ArriveCAN examples will illustrate below, interfaces are a distinct regulatory tool. Typically, they are introduced to manage risk or alter behaviour to achieve some intentional, pre-specified policy goal (Black 2001). Like the new public management and e-government initiatives that preceded them, digital interfaces may be introduced to increase efficiency, reduce fraud, and improve overall user experience (Margetts and Partington 2010; Tomlinson 2019; Henman 2022). They do so by guiding the behaviour of both officials and members of the public. Yet, like other tools of government, interfaces produce their own effects distinct from their intended policy goals. These effects are often unanticipated by their designers and by policymakers who introduce them. In some cases, such effects result from functioning errors or “glitches,” even though glitches themselves are a predictable feature of the technologies that constitute interfaces. Interfaces thus regulate in a conventional legal sense, as they manage risk or modify behaviours to achieve a pre-defined goal. They also regulate in a governance sense, as they change power dynamics, modify ways of conceptualizing problems, and inspire and entrench particular institutional forms (Lascoumes and Le Galès 2007; see footnote 1).

Benefits and Borders: Two Examples of Interface Governance

Two examples—Universal Credit and ArriveCAN—illustrate how interfaces govern the front lines of benefits and border decisions. The specifics of each example are detailed immediately below, followed by an analysis of how interface governance functions when things operate smoothly and when glitches arise.

Universal Credit

Universal Credit is the United Kingdom’s flagship digital-first social assistance program. Though best known for its digitalized approach to welfare administration, it is also a vital last-resort program for many UK residents. Like social assistance schemes elsewhere, Universal Credit requires benefits recipients (called “claimants”) to fulfil many conditions to remain eligible for assistance. Those who fail to meet these conditions, such as updating personal information and conducting regular job searches, may have their benefits reduced or be subject to harsh financial penalties (called “sanctions”) (Mantouvalou 2020). First rolled out in 2013, Universal Credit has become prominent for many UK residents since 2020. From mid-March to early April 2020 alone, for example, 1.1 million new Universal Credit claims were filed by people seeking aid due to the COVID-19 pandemic. Between March 2020 and January 2021, the number of people receiving Universal Credit payments doubled from three million to six million (UK DWP 2021).

Although it is a social assistance program, Universal Credit is synonymous with its digital interface. Almost all benefits applications are filed online, and claimants must communicate with agency representatives through the interface (Adelmant et al., forthcoming). This interface consists of a digital account that includes a home page, a to-do list, and a journal (Pope 2020). It can be accessed via a smartphone app or a webpage. Officials still operate over the phone and in Jobcentre offices, but assistance seekers’ primary means of interacting with officials is through the digital account. Formally, Universal Credit entitlement decisions are made by the UK Department for Work and Pensions (DWP), after an individual makes an initial application and whenever evidence suggests that person’s circumstances have changed. Benefit calculations are automated and based partly on regular data sharing between tax and DWP authorities (Adelmant et al., forthcoming). Officials and the software powering Universal Credit’s interface continuously receive information, and program administrators track each claimant’s job search activity through the digital account.

Although digital-by-default might suggest that humans are still available to claimants, public officials remain elusive. As Adler notes, “office policy has been to keep ‘face-to-face help’ to a minimum, and ‘digital by default’ has in practice meant ‘digital only’” (2022, 635). This reality creates ongoing uncertainty for claimants. It is often unclear what is happening on the other side of the interface, what specifically is required of claimants, and whom to contact for further information. For instance, claimants must use their digital account’s home page to notify the DWP of any changed circumstances that might affect their entitlements. The DWP does not publish a list of events that could impact an individual’s benefits, however. Without easy access to an official who might provide guidance, claimants are often unsure of which changes to report (UK DWP n.d.). Similarly, claimants may use the interface’s journal to raise their questions with the DWP. It is unclear, though, who (or what) answers these questions. The same official may respond each time, different officials may reply to separate questions posed by the same claimant, and some messages may be written by generative artificial intelligence (e.g., chatbots). At the end of each month, claimants receive a statement in their digital account that details how their monthly benefits were calculated. This statement simplifies very complex, individualized payments calculated by software that draws on a range of data sources. When DWP officials are asked to explain how a claimant’s benefits were calculated, they often must rely on this interface-generated statement as they cannot do so on their own (Howes and Jones 2019).
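
The degree of simplification involved can be glimpsed even in a stripped-down version of the award arithmetic. The sketch below is a minimal illustration under stated assumptions rather than the DWP’s actual formula: it uses a flat standard allowance, a work allowance, and an earnings taper, with placeholder figures, and it omits the housing, child, capital, and deduction elements that the real software assembles from multiple data sources.

```python
# Minimal illustration of a Universal Credit-style monthly award calculation.
# The structure (standard allowance, work allowance, earnings taper) follows the
# publicly documented scheme, but the figures below are placeholders chosen for
# readability, not current DWP rates, and many real elements are omitted.

STANDARD_ALLOWANCE = 400.00   # placeholder monthly standard allowance
WORK_ALLOWANCE = 300.00       # placeholder earnings disregard
TAPER_RATE = 0.55             # amount withdrawn per pound earned above the allowance

def monthly_award(reported_earnings: float) -> float:
    """Return an illustrative monthly payment from employer-reported earnings.

    Earnings arrive via HMRC's Real Time Information feed; the interface's
    statement shows the claimant only the simplified end result.
    """
    earnings_above_allowance = max(0.0, reported_earnings - WORK_ALLOWANCE)
    deduction = TAPER_RATE * earnings_above_allowance
    return round(max(0.0, STANDARD_ALLOWANCE - deduction), 2)

# Example: earnings of 800 in an assessment period reduce the award to 125.
print(monthly_award(800.00))  # 400 - 0.55 * (800 - 300) = 125.0
```

Even this toy version shows why a one-page statement cannot convey how a payment was produced once real allowances, deductions, and fluctuating earnings data enter the calculation.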

ArriveCAN

ArriveCAN is a recent example of a digital interface that regulates interactions between front-line officials and individuals seeking to cross the Canadian border. This tool includes a website and a smartphone app, both of which govern relations between the traveling public and Canadian federal officials. Through a series of text fields and drop-down menus that travelers must complete, ArriveCAN collects information about anyone arriving in Canada and transmits it to a range of federal authorities.

ArriveCAN’s purpose and its influence have evolved over time. Like Universal Credit’s digital account, ArriveCAN was initially a mandatory tool for reporting health and travel information. Since its introduction in 2020, it has evolved into an optional digital customs declaration that travelers are encouraged (but not obliged) to complete while en route to Canada (see footnote 2). ArriveCAN was first rolled out by the Public Health Agency of Canada to help enforce quarantine orders and COVID-19 vaccination requirements for cross-border travelers (Wylie 2022). Originally, everyone travelling to Canada by air, land, or sea was required to submit personal information (vaccination records, travel itineraries) via ArriveCAN to Canadian border authorities seventy-two hours before physically arriving in Canada (CBSA 2022b). Failure to do so could lead to thousands of dollars in fines. Border officials used the information received from ArriveCAN to streamline border crossing traffic. They also used this data to randomly select people for COVID-19 testing and to impose quarantine orders on those arriving from countries with outbreaks of new variants of concern. In mid-2022, the Canada Border Services Agency (CBSA) adopted ArriveCAN for wider customs and border security purposes (Wylie 2022). Following increasingly vocal opposition by the CBSA union and far-right protesters, however, federal authorities eventually made ArriveCAN an optional mechanism for completing a customs declaration form in October 2022 (Malone, forthcoming).

As a mandatory interface, ArriveCAN powerfully affected cross-border travelers. Not only were travelers required to share health records and other personal information through it, but ArriveCAN also communicated quarantine orders to travelers. This functionality was not safeguarded by sufficient technical maintenance to prevent wide-scale errors or by a support system to provide travelers with crucial information when such errors occurred. In practice, ArriveCAN suffered from several software “glitches,” one of which generated a massive quarantine order error (discussed further below).

Although ArriveCAN’s public health functionalities have been deactivated, the tool remains one of many instruments mediating interactions between border officials and the traveling public. Presently, ArriveCAN is one interface through which travelers may make a customs declaration, with the now-ubiquitous airport kiosk being the other option for those entering Canada by air (Daly, Raso, and Tomlinson 2022). While few travelers presently use the now-optional ArriveCAN, it lingers (Major 2022). Given ongoing efforts to digitize more of Canada’s border administration, ArriveCAN may eventually be reinvigorated and linked to other border technologies.

Smooth Operators: Interface Governance and its Intended Effects

Interfaces are often introduced to achieve explicit goals. Yet even when they work relatively smoothly, they may only partly achieve those goals and may instead hasten other outcomes. The governance implications of interfaces are only fully intelligible if we examine their intended and unintended effects together. Doing so implicitly adopts a discrete tools-of-government approach to studying interfaces, one that assumes interfaces are never neutral (Winner 1980). By examining interfaces as socio-legal-technical phenomena, then, the method adopted here departs from the (supposedly) politics-free analysis of tools-of-government scholars hailing from public policy circles (e.g., Hood 1983). The present approach, rather, fits squarely within the tradition of sociologically driven tools-of-government scholarship. Though they rarely use the term “socio-legal,” these scholars explicitly assess how different policy tools shape socio-legal institutions (Lascoumes and Simard 2011, 12). The focus is not on whether interfaces or administrative agencies alone create the dilemmas explored below, but rather on how predictably glitchy interfaces operate within a given institutional context, and with what governance effects. Rather than identify “non-technical” parts of government as weak links, which politics-neutral tools-of-government scholars might do, a policy instrumentation approach suggests that interfaces cannot be understood separate and apart from the institutions that they constitute.

Governments regularly use interfaces to improve efficiency. For example, interfaces may be introduced to improve the speed and accuracy of routine decisions about benefits claimants or travelers (Veale and Brass 2019). When Universal Credit was first introduced, the DWP projected that it would save £99 million per year in benefits administration costs and reduce fraud and error by £1.3 billion annually (UK AG 2018, 9). ArriveCAN was similarly advertised as saving travelers about five minutes in line each time they crossed the border (CBSA 2022b). Presumably, these reduced wait times would also make CBSA operations more efficient. To further streamline decision making, interfaces also commonly purport to make simple bureaucratic decisions. In theory, this functionality ought to redirect front-line workers’ energy to more legally or socially complex tasks (Hood 2008). The algorithmic infrastructure supporting an interface, such as Universal Credit’s digital account or the ArriveCAN app, performs calculations, compares data, and completes computational tasks using data input by officials, members of the public, and others (Malone, forthcoming).

Additionally, interfaces may be adopted to increase the digital literacy of both the public and civil servants because they require regular user engagement to function. This feature is particularly common in social assistance programs like Universal Credit, which aim to prepare claimants for digital work. In a promotional video, for example, a DWP official describes how Universal Credit’s digital account performs this function, stating, “[It’s] a really good introduction to the digital world … We send them text messages so they can pick that up when they’re on the move. They know there’s an update on their account and they can check it immediately and take action” (UK DWP 2016). Improved digital literacy cuts both ways, though. It also targets public officials who themselves must interact with a mandatory interface. For instance, ArriveCAN is now heralded as part of Canada’s ongoing efforts “to modernize our border,” efforts that presumably centre on upgrading border officials’ digital skills as much as upgrading border technologies (CBSA 2022a).

Digital interfaces may only partially achieve these goals, however. In terms of efficiency, the decisions that interfaces co-produce may be speedy yet inaccurate, creating fast injustice and placing truly transformative efficiency gains out of reach. These shortcomings may occur because digital interfaces rely on incompatible legacy systems or bad data (Margetts 1999; Fry 2018). Universal Credit’s Real Time Information system, analyzed in the glitch discussion below, is one such example. In other cases, interfaces malfunction because their software is incompatible with the operating system on a user’s personal computer or smartphone. These glitches can generate errors at a scale and speed that would have been unfathomable just a few decades ago (Yeung 2023). The ArriveCAN errors analyzed in the next section are one such example, where thousands of travelers were ordered to quarantine shortly after a software update went “live.” Importantly, the efficiency gains advertised by high-level government officials are often calculated using methods that fail to account for labour performed outside of brick-and-mortar front-line spaces. The minutes shaved from caseworker meetings or from airport wait times, for example, invisibilize the hours of work that “users” and many others (e.g., family and friends, public library staff, healthcare providers) spend interacting with an interface, entering data, and troubleshooting when things go awry. This administrative burden arises before or after an individual appears in a Jobcentre office or at a CBSA desk (Herd and Moynihan 2018). It remains unaccounted for when front-line efficiencies are evaluated.

As for improved digital literacy, interfaces may instead increase administrative intensity and lead to frustration with, and backlash against, them as government instruments. Digitalization initiatives often coincide with other institutional changes that lead people to spend substantial amounts of time trying to do basic things like file documents or receive updates on their files. This phenomenon is particularly evident in social assistance programs such as Universal Credit, where governments have digitalized while also defunding and contracting out some service elements and offloading others onto local governments and charities. Collectively, these changes make it more difficult for Universal Credit claimants to use their digital account and to comply with the program’s complex eligibility rules (Summers and Young 2020). Because the DWP rarely helps individuals navigate the program’s interface, claimants rely on outside agencies and family members for assistance. Claimants must therefore become experts in navigating the wide web of actors beyond the DWP who might help them interact with the Universal Credit system (Edmiston et al. 2022). Likewise, front-line officials often describe interfaces as intensifying the technical aspects of their work. When asked about ArriveCAN, for example, a spokesperson for the CBSA officers’ union noted, “the few officers we have working at the front line are spending all of their time acting as IT consultants” (MacMahon 2022).

Interfaces also have significant unintentional governance effects on officials and members of the public. They transfer responsibility away from officials onto public “users,” while also flipping and clouding individual–state accountability relationships. Each is examined in turn.

Digital interfaces encourage officials to shift responsibility away from themselves. Occasionally, officials may hide behind the interface, blaming “the computer” or “the system” for a controversial result and deflecting from their own role in producing that outcome (Howes and Jones 2019; Fourcade and Gordon 2020). In this way, interfaces enhance the long-standing bureaucratic practice of blame shifting. Yet interfaces also operate together with policies that disempower workers, such as heavy workloads, reduced staff (due to cutbacks or staff shortages), and strict managerial targets. These features spread opportunities for effective agency thinly across an institution’s front lines. In some cases, such as when the interface’s functions are unintelligible to officials, it becomes nearly impossible for them to change interface-generated outcomes (Burrell 2016). An interface’s rigid analytical categories and limited range of options, meanwhile, may contrast starkly with the formal laws governing a particular program. Universal Credit, like other benefits, often requires officials to exercise discretion (Mears and Howes 2023). Border control decisions similarly require officials to assess risks and reasonableness, issues that rarely have a single right answer (Pratt and Thompson 2008). And public officials often insist upon using professional judgment, despite technologies introduced to guide that same judgment (Lipsky 1980; MacMahon 2022). Officials may thus develop new(er) conceptions of their identities as public servants and of their own agency and responsibility vis-à-vis interfaces.

As for members of the public, digital interfaces also intensify a peculiar type of responsibilization. Beyond merely offloading responsibility from the state to the individual, interfaces require people to be routinely accessible, to enter new data about themselves, and to reply to interface prompts. On the other side of the interface, however, it is unclear whether state officials and technologists are similarly responsive. Although early literature describes interfaces as “empowering” their users (Chun 2006), in both benefits and border security contexts, interfaces seem more likely to confound than to empower. Interfaces intensify the scrutiny that individuals experience when they interact with administrative agencies, while frustrating their attempts to be “heard” by their scrutinizers. In the Universal Credit program, for instance, claimants are required to regularly update their circumstances on their digital account’s home page and to pose questions via the journal. DWP officials, meanwhile, do not use this same interface to prompt claimants to amend their details, even when they have reason to believe that such an update is needed (Pope 2020, 39; Mears and Howes 2023). A failure to update data can bar claimants from receiving benefits to which they are entitled and may even lead to sanctions. Additionally, claimants receive generic texts or emails indicating that they should log into their digital account, regardless of whether the actions they must take are benign or significant. These messages create anxiety, as claimants have no way of knowing whether they relate to innocuous or serious matters. Messages also flow one way. Claimants’ only option to communicate with DWP officials is through their digital account. They cannot simply reply to the messages received via the Universal Credit platform (Pope 2020, 41–42). Similarly, ArriveCAN never prompted travelers to update their data, although updating remained mandatory. Travelers were expected to navigate the interface alone, which was stressful for anyone with unstable internet access, poorly legible vaccine documents, or an app version that failed to load on their smartphone (see footnote 3). For those crossing the border by air, this obligation to update data was communicated regularly by airlines and on signs posted in airport terminals. At land borders, however, travelers were often less aware of it (La Grassa and Breen 2022). In either setting, if a traveler’s ArriveCAN declaration was incomplete when they reached a CBSA officer, the officer would require the traveler to complete these data entry fields before advancing the border crossing process.

Interfaces, by their design, reverse and even occlude accountability relations between the state and the public. Instead of state agencies being obligated to “hear” people fully and fairly, or to reach intelligible and justified decisions, interfaces require individuals to continuously demonstrate their eligibility and responsibility as perpetual “applicants” rather than as “rights-holders” (Alston 2019). Digital interfaces not only obstruct individuals from “asserting their rights,” in Alston’s words. They impede individuals from even knowing which obligations they must fulfil to receive a particular benefit or service, or what they must do to avoid penalties, even as the burden of error weighs heavily on them (Yeung 2023, 23–24). Both Universal Credit’s digital account and ArriveCAN’s app and website obscure who or what is hearing the data on the other side of the interface. While these tools are marketed as saving time across an institution, the data supporting such claims obscure how exhausting it is for members of the public to interact with these tools. These effects are particularly acute for anyone who cares about fulfilling the interface’s requirements. Attempts to ensure one’s data is complete, for example, become frustrated when questions about where or how to enter data arise and no authoritative actors are available to provide further information (Summers and Young 2020). The stress of this experience influences whether the people involved perceive the administrative state as legitimate or just and can influence the perceptions of those in their wider social circle. Regardless of the intent behind such tools, their design and the lack of accessible technical support both deter people from accessing social assistance and may make them more hesitant to cross the border. With the extended reach of social media, widespread tales of these bad experiences may erode public trust in administrative justice. This risk becomes amplified when we evaluate interface glitches.

Predictable Glitches and Their Effects

Though digital government proponents might suggest otherwise, interfaces malfunction and these glitches have their own governance consequences. The term “glitch” suggests a rare mistake, but such hiccups are a common feature of digitalization. They provide officials with opportunities to further diffuse responsibility when things go wrong. Because they affect large numbers of people, and because they so often find public officials unprepared or unwilling to deal with their aftermath, glitches may also accelerate the erosion of public trust in state institutions.

Glitches tend to surprise government officials and members of the public, yet they can occur whenever interface software is updated or whenever the software’s algorithmic functions are mismatched with the data on which the software relies. Interface software is updated regularly. Universal Credit’s software, for instance, is updated multiple times per week (field notes, 15 March 2021). When these updates occur, the algorithmic system may malfunction due to errors in the code or incompatibility between the new software and the legacy systems with which it interacts. Glitches may also arise when interface software is incompatible with software on the devices members of the public use to access the interface. The ArriveCAN glitch mentioned above is illustrative. In late June 2022, an update to the ArriveCAN smartphone app led at least 10,200 travelers, primarily Apple users, to be wrongly ordered to quarantine. Discrepancies between ArriveCAN’s software and Apple’s internal operating system (or “iOS”) led ArriveCAN to generate false quarantine orders. Soon after the update was released, the ArriveCAN system began sending emails and automated telephone calls that ordered some travelers to quarantine for fourteen days or risk $5,000 in fines and prison time (Malone, forthcoming). When they tried to confirm whether these orders were indeed correct, travelers were met with automated messages or federal officials who could not answer their questions. After reporters began investigating, Canada’s Minister of Public Safety confirmed that these messages were the result of a “technical glitch.” He insisted that messages were only sent to about three per cent of ArriveCAN users at the time (Harris 2022). Perhaps because such malfunctions were unanticipated (and perhaps because of the sparse glitch reporting systems in place), federal authorities did not discover the glitch until two weeks after it occurred, and took another ten days to notify affected individuals (Hill 2022). By this time, thousands of people had already completed their two-week quarantine.
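
The precise technical cause of the ArriveCAN error has not been made public. Purely as a hypothetical reconstruction of how an app update can cascade into erroneous automated orders at this scale, the sketch below imagines an updated client that submits vaccination proof under a renamed field, which an unchanged rules engine then fails to find, defaulting every affected traveler to "quarantine required." All field names and logic here are invented for illustration and do not describe ArriveCAN’s actual code.

```python
# Hypothetical reconstruction only: the actual cause of the 2022 ArriveCAN glitch
# has not been published. This sketches one generic way an app update can cascade
# into erroneous automated orders: the updated client sends a field under a new
# key, the unchanged rules engine misses it, and travelers are treated as
# non-compliant by default.

def quarantine_required(submission: dict) -> bool:
    """Rules engine written against the pre-update payload schema."""
    # Pre-update apps sent {"proof_of_vaccination": True}.
    return not submission.get("proof_of_vaccination", False)

def notify_if_required(submissions: list[dict]) -> int:
    """Count (and, in a real system, send) automated quarantine notices."""
    return sum(quarantine_required(s) for s in submissions)

# Post-update apps (hypothetically) nest the same data under a renamed key.
post_update_submissions = [
    {"traveller_id": i, "vaccination": {"proof": True}} for i in range(10_200)
]

# Every fully vaccinated traveller is now flagged, and notices go out
# automatically, at a scale no manual process would reach before complaints.
print(notify_if_required(post_update_submissions))  # 10200
```

Whatever the real defect was, the general point holds: once notification is automated, a small schema or compatibility mismatch propagates to everyone who submits data before anyone notices.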

Glitches may also arise when the data the software interacts with includes errors, or when software otherwise misinterprets that data (e.g., Fry 2018). For instance, Universal Credit’s interface automatically calculates monthly benefit payments using the Real Time Information system. Real Time Information was designed and is operated by the United Kingdom’s tax authority, HM Revenue and Customs (HMRC). It includes employer-reported earnings data. When an employer’s payroll date for a claimant conflicts with the date on which the Universal Credit program calculates their monthly earnings to generate a benefits payment, the system often double counts the claimant’s income for that month and records their income as zero in the following month. This design feature was intended to mirror the monthly pay characteristic of many UK employment opportunities, but it clashes with the working conditions of most Universal Credit claimants, who are paid weekly or fortnightly (Adelmant et al., forthcoming). As a result, Universal Credit payments are often incorrect because the Real Time Information system on which Universal Credit’s interface relies conflicts with claimants’ fluctuating work schedules (Pope 2020). Claimants are thus over- or underpaid benefits due to this predictable, ongoing glitch. Sometimes, automated payments cease entirely and an individual must make a new claim and wait six weeks without benefits while their claim is processed (Jitendra, Thorogood, and Hadfield-Spoor 2018, 18, 20). Despite the clear mismatch between the software’s calculation formula and the data used to make this calculation, this problem remains largely unresolved (Mantouvalou 2020; Mears and Howes 2023, 66–70). Similar problems have arisen elsewhere, including in Australia’s now-abandoned Online Compliance Initiative (or “Robo-debt”; Carney 2019) and Michigan’s MiDAS program.
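
The mechanics of this mismatch are easier to see in simplified form. The sketch below is an illustration under stated assumptions, not the DWP’s implementation: it assumes that earnings count toward whichever fixed monthly assessment period the employer’s reporting date falls into, so a payday reported slightly early can place two paycheques in one period and none in the next.

```python
from datetime import date

# Illustration of the assessment-period mismatch, not DWP code. Assumption:
# earnings count toward whichever fixed monthly assessment period the employer's
# RTI reporting date falls into, regardless of the period the work covered.

ASSESSMENT_PERIODS = [
    (date(2023, 1, 10), date(2023, 2, 9)),   # period 1
    (date(2023, 2, 10), date(2023, 3, 9)),   # period 2
]

# A claimant paid every four weeks; one payday is reported a day early, so two
# paycheques land in period 1 and none in period 2.
reported_paydays = [
    (date(2023, 1, 13), 950.00),
    (date(2023, 2, 9), 950.00),   # reported just inside period 1
]

def earnings_per_period(paydays, periods):
    """Sum employer-reported earnings falling inside each assessment period."""
    totals = []
    for start, end in periods:
        totals.append(sum((amount for d, amount in paydays if start <= d <= end), 0.0))
    return totals

print(earnings_per_period(reported_paydays, ASSESSMENT_PERIODS))
# [1900.0, 0.0]: income is double counted in period 1 and recorded as zero in
# period 2, so the automated award drops, then spikes, for the same steady wage.
```

The claimant’s actual wage never changes; what changes is which reporting window the data lands in, and the automated award follows the data rather than the claimant’s circumstances.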

While the existence of glitches is unsurprising, the sheer scale of their effects is consistently shocking. Glitches may not affect everyone, but even those that impact a small percentage of individuals will affect tens or even hundreds of thousands of people in high-volume administrative settings (Yeung 2023). Data problems continue to affect Universal Credit claimants, such that claimants anticipate that their monthly benefits statements will contain errors that might subject them to benefits reductions and sanctions (Howes and Jones 2019). An estimated one in every five Universal Credit recipients is sanctioned (i.e., penalized) due in part to such data errors. Other sanctions may arise because claimants cannot afford the internet or cell phone services required to regularly interact with Universal Credit’s interface; for example, to complete the mandatory thirty-five hours of weekly job searching activities on Universal Jobmatch (Jitendra, Thorogood, and Hadfield-Spoor 2018, 19). As for ArriveCAN, the thousands of people who received messages directing them to quarantine became the subjects of legal orders (however automated and erroneous) made under the federal Quarantine Act (see footnote 4). When interviewed by reporters, many indicated that, while they suspected the orders were incorrect, they feared the threat of fines and potential jail time. Most indicated that they dutifully quarantined for almost two weeks (Harris 2022; Hill 2022). Glitches can quickly disrupt the lives of thousands of people, a now-common feature of contemporary administration that remains underappreciated by socio-legal scholars and by the officials who manage digital government initiatives (cf. Thomas 2021).

When glitches occur, they may turn attention towards the interface and away from the officials who do, or should, bear some responsibility for its malfunctioning. Officials often gesture towards the interface, or to the software and data on which the interface relies, as the source of the problem rather than to their own technological repair or data maintenance practices. In doing so, they engage in a longstanding bureaucratic tradition of blaming tools or numbers to shield their actions from scrutiny (Porter 1996). Through this move, officials deflect their own agency onto different elements of the socio-legal-technical network, even though they are themselves an integral part of that network (Johns 2023). This deflection becomes apparent to anyone searching for an accountable state official when things go wrong: someone who can tell them what happened, why, and what might be done in response. When Universal Credit claimants believe their benefits calculations are incorrect, for instance, officials may dissuade them from filing a mandatory reconsideration request and propose that the problem is caused by the Real Time Information system (Howes and Jones 2019, 5). Likewise, when people suspected their ArriveCAN-generated quarantine orders were incorrect and tried to reach government officials, they faced dead-end telephone trees or officials who refused to discuss this “technical” issue with them (Hill 2022). These barriers to knowledgeable officials behind the interface are regular, frustrating characteristics of interface governance (Raso, forthcoming). As Fourcade and Gordon observe, because administrative settings now regularly use algorithmic tools to generate decisions, public officials “find themselves in a stronger position to claim their subordination to computers and their inability to account for their decisions” (2020, 84). This problem is more than technological; it is an institutional design issue that exacerbates deep-set bureaucratic challenges (Thomas 2021).

More broadly, glitches reveal just how wide responsibility for an interface stretches, especially when things go wrong. When a Universal Credit claimant seeks to correct a benefits payment error, their inquiry often triggers an intricate sequence of blame shifting. DWP officials may point claimants towards HMRC, the agency managing the Real Time Information system. HMRC, then, may direct claimants to the employers who entered payroll data incorrectly. Employers may then lead claimants back to HMRC, and the dance continues. The United Kingdom’s Government Digital Service, which designs and updates much of Universal Credit’s algorithmic infrastructure, may or may not be identified despite its clear influence over the interface (Jitendra, Thorogood, and Hadfield-Spoor 2018; Adelmant et al., forthcoming). Likewise, for much of its existence, ArriveCAN has straddled two agencies—the Public Health Agency of Canada and the Canada Border Services Agency—with the former initiating ArriveCAN’s creation and the latter adopting the tool for longer-term use (Malone, forthcoming). When it comes to maintaining and updating ArriveCAN’s technical infrastructure, however, ArriveCAN falls under the federal Treasury Board’s authority (Scassa 2021). Its shifting functionality and purposes may also make this interface attractive to other federal agencies involved in digital border governance, such as Immigration, Refugees and Citizenship Canada (Canada SCPA 2020; field notes, 20 September 2022). With responsibility this widespread, so many entities appear to have agency that few are accountable. Given the regularity and impact of interface glitches, this situation may fundamentally threaten trust in state agencies. It is thus essential to consider a way forward for researchers.

Conclusion: Towards Socio-Legal-Technical Studies of Interface Governance and Predictable Glitches

Interface governance matters for socio-legal studies and for administrative justice as a normative pursuit. But how should scholars respond? At least four paths forward warrant consideration.

First, researchers must better understand interfaces in relation to the administrative contexts in which they exist. This means avoiding the binary generalizations recycled in some legal and public policy–oriented scholarship on digital government. For example, some analyses of digital state initiatives tend to contrast “automation” or artificial intelligence with an idealized notion of officials exercising unbounded discretion (Casey and Niblett 2016). Likewise, the digitalization of public life may be distorted when it is described simply as a move from public to private governance (Balkin 2018). Dichotomies also remain popular in some tools-of-government typologies (Lascoumes and Simard 2011, 9–10). They are, however, insufficiently variegated to analyze “the problematics of government in the present” (Rose and Miller 1992, 201). Even conceiving of the entrenchment of digital government as positioned somewhere between two ends of a spectrum—the privatization of the state, a shift from discretion to automation, a move from statistics to data—ensures that the growing importance of a phenomenon like interfaces evades analysis, as its range of salient characteristics extends beyond these conceptual poles. Scholars must therefore continue to refine robust, multi-dimensional frameworks for evaluating exercises of legal and technical power. To this end, both older studies of administrative tools by information and media scholars and the nascent multi-disciplinary literature on digital government are excellent starting points because they reflexively scrutinize categorization practices and link present-day dilemmas with their antecedents (e.g., Blair et al. 2021; Burrell, Singh, and Davison, forthcoming).

Second, to resist the simplicity of dichotomization, scholars would benefit from extending their methods and theoretical approaches from socio-legal to socio-legal-technical ones. Literature in this growing field now more explicitly adopts network analyses to study how administrative agencies operate and how they are reshaped by data-driven tools (Levi and Valverde 2008; Sullivan 2022). But more could be done to explicitly link the methods and theories of actor network theory and science and technology studies with those developed by infrastructure studies (Bowker and Star 2000) and sociologically driven tools of government literature (Dunleavy et al. 2006). Admittedly, this project is transsystemic and multilingual, as many policy instrumentation scholars study civil law jurisdictions and publish in languages other than English (Lascoumes and Le Galès 2007; Perret 2010). But this approach would support a critical evaluation of the state of interface governance and similar developments. For instance, it might equip scholars to recognize and trace how digital technologies, including interfaces, become integrated into the state’s legal and social infrastructure like many tools of government before them (Margetts 1999; Amoore 2013; Van Den Meerssche 2022). It would also enable socio-legal researchers to explore how such tools regulate officials and members of the public on both sides of the interface, as well as their wider population-level effects.

Third, original empirical studies are needed to reveal how interfaces (re)form administrative governance. Studies of the technical design and functioning of interfaces would also better explain when and why glitches occur and how these tools and their glitches distribute responsibility across many agencies and actors. This work would be invaluable for socio-legal theory and for administrative justice advocates. At present, fragmented evidence about such tools (i.e., promotional materials, government websites, parliamentary reports, third-party policy papers) is more common than detailed, critical studies (cf. Eubanks 2017; Raso 2017; Amoore 2020). Empirical studies must treat glitches and malfunctions as a normal, rather than deviant, component of interface governance. Doing so would allow researchers to fully grasp how glitches contribute to (rather than distract from) governance within administrative settings. For normatively minded scholars, this knowledge would support informed strategizing about how to anticipate glitches and how to minimize their impact, perhaps through basic reforms such as enhanced maintenance (Pope 2020, 92).

Finally, researchers must remain attuned to the practical effects of digital government and to how their own work contributes to those effects. A better understanding of interface governance and its glitches may yield crucial theoretical insights, but interface governance also has tangible macro- and micro-scale consequences. At the macro level, as digital technologies enable state agencies to become more impenetrable and less accountable, a legitimacy crisis may result (Calo and Citron 2021). The contemporary reality of interface governance, with its combination of increased administrative intensity, reversed accountability, and potential for catastrophic glitches, risks eroding faith in government institutions.

While there is good reason to critique benefits and border administration, these agencies still provide some important services and should be improved rather than abandoned outright. Interface governance, however, including lacklustre responses to predictable glitches, may fundamentally destabilize these institutions through micro-level or incremental effects. Specifically, these problems may lead individuals who previously had (at least some) faith in state agencies to lose that confidence. The examples detailed above have impacted many people, some of whom previously trusted government. The ArriveCAN glitch, for instance, affected senior citizens returning from vacation, cross-border shoppers, and travellers who diligently followed erroneous quarantine orders. Errors in Universal Credit benefits, meanwhile, remain notoriously widespread and unresolved (Mears and Howes 2023). One advocacy organization described the situation as follows:

[C]laimants’ hardship can be prolonged, with no easy way to fix the situation. Children and families can be left without any income for months on end. Ill and disabled people can be required to look for work they cannot find or maintain. Debts can accrue and housing can be put at risk. (Howes and Jones 2019, 5)

In 2021, an estimated 3.6 million children in the United Kingdom lived in a household receiving Universal Credit benefits (CPAG 2022). What impression do these children have of administrative agencies if they are subject to such routine failures?

While a socio-legal-technical approach may seem an esoteric response to such pressing problems, understanding how interface governance operates accomplishes two important things. First, it demonstrates the contingency and fragility of the relations that create interfaces and their glitches. This fragility suggests that things might be otherwise. Second, and relatedly, this approach can reveal where small changes in the relations and circumstances supporting interface governance may substantially impact the wider system (Hohmann 2021). Both insights are necessary to tackle the challenges of interface governance today and in the future.

Footnotes

* This article has been enriched by the generous feedback of participants in a series of workshops, including Making Waste, Talking Trash (UNSW Sydney School of Law and Justice 2018), Global Governance by Data (University of Edinburgh Law School 2022), and Keywords on the Datafied State (Data & Society 2023), as well as by its anonymous peer reviewers. It has also benefited from ongoing conversations with Victoria Adelmant, Nehal Bhuta, Fleur Johns, Nofar Sheffi, Thomas Streinz, Gavin Sullivan, Dimitri Van Den Meerssche, and Mariana Valverde. Thanks to Emilie Vaillancourt and Gajanan Velupillai for research assistance, and to the Social Sciences and Humanities Research Council for funding.

1 I use “governance” loosely to refer to regulatory practices and their effects, rather than to directly evoke Foucauldian concepts of sovereignty, discipline, and governmentality. Foucauldian notions of governance, while fruitful, sit outside of the scope of the argument advanced here.

2 The latest version of ArriveCAN replicates a previously abandoned digital customs declaration project known as “eDeclaration” (Malone forthcoming).

3 The author experienced this situation every time they tested the app on three different Android devices.

4 SC 2005, c 50.

References

Achiume, E. Tendayi. 2020. Report of the Special Rapporteur on Contemporary Forms of Racism, Racial Discrimination, Xenophobia and Related Intolerance. Report no. A/75/590. https://digitallibrary.un.org/record/3893019?ln=en
Adelmant, Victoria, Alston, Philip, and Van Veen, Christiaan. Forthcoming. Just digitalization. Oxford: Oxford University Press.
Adler, Michael. 2022. The future of administrative justice. In The Oxford Handbook of Administrative Justice, ed. Hertogh, Marc, Kirkham, Richard, Thomas, Robert, and Tomlinson, Joe, 623–46. New York: Oxford University Press.
Alston, Philip. 2019. Report of the Special Rapporteur on Extreme Poverty and Human Rights. Report no. A/74/493. https://digitallibrary.un.org/record/3834146?ln=en
Amoore, Louise. 2013. The politics of possibility: Risk and security beyond probability. Durham: Duke University Press.
Amoore, Louise. 2020. Cloud ethics: Algorithms and the attributes of ourselves and others. Durham: Duke University Press.
Augarten, Stan. 1984. Bit by bit: An illustrated history of computers. Boston: Houghton Mifflin Co.
Balkin, Jack M. 2018. Free speech in the algorithmic society: Big data, private governance, and new school speech regulation. UC Davis Law Review 51, no. 3 (February): 1149–1210.
Black, Julia. 2001. Decentring regulation: Understanding the role of regulation and self-regulation in a “post-regulatory” world. Current Legal Problems 54, no. 1 (December): 103–46.
Blair, Ann, Duguid, Paul, Goeing, Anja-Silvia, and Grafton, Anthony, eds. 2021. Information: A historical companion. Princeton: Princeton University Press.
Bowker, Geoffrey, and Star, Susan Leigh. 2000. Sorting things out: Classification and its consequences. Cambridge, MA: MIT Press.
Burrell, Jenna. 2016. How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society 3 (1): 1–12.
Burrell, Jenna. 2023. A guide to investigating the datafied state through documents. New York: Data & Society.
Burrell, Jenna, Singh, Ranjit, and Davison, Patrick, eds. Forthcoming. Keywords on the datafied state. New York: Data & Society.
Calo, Ryan, and Citron, Danielle Keats. 2021. The automated administrative state: A crisis of legitimacy. Emory Law Journal 70 (4): 797–846.
Canada Border Services Agency (CBSA). 2022a. ArriveCAN Costs. Last modified October 24. https://www.cbsa-asfc.gc.ca/agency-agence/reports-rapports/fs-ef/2022/acc-cac-eng.html.
Canada Border Services Agency (CBSA). 2022b. Use ArriveCAN to Enter Canada. Last modified December 22. https://www.canada.ca/en/border-services-agency/services/arrivecan.html#a1
Carney, Terry. 2019. Robo-debt illegality: The seven veils of failed guarantees of the rule of law? Alternative Law Journal 44 (1): 4–10.
Casey, Anthony J., and Niblett, Anthony. 2016. Self-driving laws. University of Toronto Law Journal 66, no. 4 (Fall): 429–42.
Child Poverty Action Group (CPAG). 2022. Nothing left to cut back: Rising living costs and Universal Credit. February. https://cpag.org.uk/sites/default/files/files/policypost/Briefing_UC_Feb_2022_final.pdf
Chun, Wendy Hui Kyong. 2006. Programmed visions: Software and memory. Cambridge, MA: MIT Press.
Citron, Danielle Keats. 2008. Technological due process. Washington University Law Review 85 (6): 1249–1313.
Clarke, Amanda. 2020. Digital government units: What are they, and what do they mean for digital era public management renewal? International Public Management Journal 23 (3): 358–79.
Coleman, Gabriella. 2015. The anthropological trickster. HAU: Journal of Ethnographic Theory 5, no. 2 (September): 399–407.
Daly, Paul, Raso, Jennifer, and Tomlinson, Joe. 2022. Researching administrative law in the digital world. In Research Agenda for Administrative Law, ed. Harlow, Carol, 255–79. London: Edward Elgar.
Dunleavy, Patrick, Margetts, Helen, Bastow, Simon, and Tinkler, Jane. 2006. New public management is dead—Long live digital-era governance. Journal of Public Administration Research and Theory 16, no. 3 (July): 467–94.
Edmiston, Daniel, Robertshaw, David, Young, David, Ingold, Jo, Gibbons, Andrea, Summers, Kate, Scullion, Lisa, Geiger, Ben Baumberg, and de Vries, Robert. 2022. Mediating the claim? How “local ecosystems of support” shape the operation and experience of UK social security. Social Policy & Administration 56: 775–90.
Eubanks, Virginia. 2017. Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.
Fourcade, Marion, and Gordon, Jeffrey. 2020. Learning like a state: Statecraft in the digital age. Journal of Law and Political Economy 1 (1): 78–108.
Fry, Hannah. 2018. Hello world: Being human in the age of algorithms. New York: WW Norton & Co.
Gusterson, Hugh. 1996. Nuclear rites. Berkeley, CA: University of California Press.
Hannerz, Ulf. 2003. Being there…and there…and there! Reflections on multi-site ethnography. Ethnography 4 (2): 201–16.
Harris, Sophia. 2022. Ottawa admits some travellers were incorrectly told to quarantine due to ArriveCAN app glitch. CBC News, July 22. https://www.cbc.ca/news/business/arrivecan-app-quarantine-glitch-1.6528312
Henman, Paul W. Fay. 2022. Administrative justice in a digital world: Challenges and solutions. In The Oxford Handbook of Administrative Justice, ed. Hertogh, Marc, Kirkham, Richard, Thomas, Robert, and Tomlinson, Joe, 459–80. New York: Oxford University Press.
Herd, Pamela, and Moynihan, Donald. 2018. Administrative burden: Policymaking by other means. New York: Russell Sage.
Hill, Brian. 2022. Experts warn ArriveCAN app could be violating constitutionally protected rights. Global News, August 10. https://globalnews.ca/news/9047177/experts-warn-arrivecan-app-could-be-violating-constitutionally-protected-rights/
Hohmann, Jessie. 2021. Diffuse subjects and dispersed power: New materialist insights and cautionary lessons for international law. Leiden Journal of International Law 34 (3): 585–606.
Hood, Christopher. 1983. The tools of government. London: Macmillan.
Hood, Christopher. 2008. The tools of government in the information age. In The Oxford Handbook of Public Policy, ed. Goodin, Robert E., Moran, Michael, and Rein, Martin, 469–81. Oxford: Oxford University Press.
Howes, Sophie, and Jones, Kelly-Marie. 2019. Computer says no! Stage two: Challenging decisions. Child Poverty Action Group. July. https://cpag.org.uk/sites/default/files/files/policypost/Computer%20says%20no%21%202%20-%20for%20web.pdf
Hull, Matthew S. 2012. Documents and bureaucracy. Annual Review of Anthropology 41: 251–67.
Jitendra, Abhaya, Thorogood, Emma, and Hadfield-Spoor, Mia. 2018. Left behind: Is Universal Credit truly universal? The Trussell Trust. April 18. https://s3-eu-west-1.amazonaws.com/trusselltrust-documents/Trussell-Trust-Left-Behind-2018.pdf
Johns, Fleur. 2023. #Help: Digital humanitarianism and the remaking of international order. Oxford: Oxford University Press.
La Grassa, Jennifer, and Breen, Kerri. 2022. Up to 40% of people arriving at Windsor border don’t have ArriveCAN filled out: CBSA union. CBC News, July 22. https://www.cbc.ca/news/canada/windsor/windsor-arrive-can-app-border-1.6527437
Lascoumes, Pierre, and Le Galès, Patrick. 2007. Introduction: Understanding public policy through its instruments—From the nature of instruments to the sociology of public policy instrumentation. Governance 20, no. 1 (January): 1–21.
Lascoumes, Pierre, and Simard, Louis. 2011. Public policy seen through the prism of its instruments. Transl. Jill McCoy. Revue Française de Science Politique 61 (1): 5–22.
Levi, Ron, and Valverde, Mariana. 2008. Studying law by association: Bruno Latour goes to the Conseil d’État. Law & Social Inquiry 33 (3): 805–25.
Lipsky, Michael. 1980. Street-level bureaucracy: Dilemmas of the individual in public services. New York: Russell Sage.
MacMahon, Martin. 2022. Customs officer union laments ‘IT consultant’ duties in ArriveCAN era. CityNews Vancouver, June 27. https://vancouver.citynews.ca/2022/06/27/arrivecan-customs-officers-canada/
Major, Darren. 2022. A fraction of air passengers used ArriveCAN in first month app was optional. CBC News, December 8. https://www.cbc.ca/news/politics/few-using-arrivecan-optional-advance-declaration-1.6678081
Malone, Matt. Forthcoming. Lessons from ArriveCAN: Access to information and justice during a glitch. Intellectual Property Journal.
Mantouvalou, Virginia. 2020. Welfare-to-work, structural injustice and human rights. Modern Law Review 83, no. 5 (September): 929–54.
Margetts, Helen. 1999. Information technology in government: Britain and America. London: Routledge.
Margetts, Helen, and Partington, Martin. 2010. Developments in e-government. In Administrative Justice in Context, ed. Adler, Michael, 47–72. Oxford: Hart.
Margetts, Helen, Perri 6, and Hood, Christopher, eds. 2010. Paradoxes of modernization: Unintended consequences of public policy reform. Oxford: Oxford University Press.
Mears, Rosie, and Howes, Sophie. 2023. You reap what you code: Universal Credit, digitalisation, and the rule of law. Child Poverty Action Group. June. https://cpag.org.uk/sites/default/files/files/policypost/You_Reap_What_You_Code.pdf
Perret, Sylvain. 2010. Vers une nouvelle approche instrumentale des politiques publiques de protection de l’environnement. PhD diss., University of Geneva.
Pope, Richard. 2020. Universal Credit: Digital welfare. London: Richard Pope Consulting.
Porter, Theodore. 1996. Trust in numbers: The pursuit of objectivity in science and public life. Princeton: Princeton University Press.
Pratt, Anna, and Thompson, Sarah K. 2008. Chivalry, “race” and discretion at the Canadian border. The British Journal of Criminology 48 (5): 620–40.
Raso, Jennifer. 2017. Displacement as regulation: New regulatory technologies and front-line decision-making in Ontario Works. Canadian Journal of Law & Society 32 (1): 75–95.
Raso, Jennifer. Forthcoming. Digital border infrastructure and the search for agencies of the state. In Global governance by data: Infrastructures of algorithmic rule, ed. Sullivan, Gavin, Johns, Fleur, and van den Meerssche, Dimitri. Cambridge: Cambridge University Press.
Robson, Jennifer. 2020. Radical incrementalism and trust in the citizen: Income security in Canada in the time of Covid-19. Canadian Public Policy 46, no. 5 (July): S1–S18.
Rose, Nikolas, and Miller, Peter. 1992. Political power beyond the state: Problematics of government. The British Journal of Sociology 43, no. 2 (June): 173–205.
Scassa, Teresa. 2021. Administrative law and the governance of automated decision-making: A critical look at Canada’s Directive on Automated Decision-making. University of British Columbia Law Review 54 (1): 1–29.
Schön, Donald A. 1971. Beyond the stable state. New York: Random House.
Seaver, Nick. 2017. Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society 4 (2): 1–12.
Sullivan, Gavin. 2022. Law, technology, and data-driven security: Infra-legalities as method assemblage. Journal of Law & Society 49 (1): S31–S50.
Summers, Kate, and Young, David. 2020. Universal simplicity? The alleged simplicity of Universal Credit from administrative and claimant perspectives. The Journal of Poverty and Social Justice 28 (2): 169–86.
Thomas, Robert. 2021. Analysing systemic administrative justice failures: Explanatory factors and prospects for future research. Journal of Social Welfare and Family Law 43 (3): 339–63.
Tomlinson, Joe. 2019. Justice in the digital state: Assessing the next revolution in administrative justice. Bristol: Bristol University Press.
United Kingdom, Comptroller and Auditor General (UK AG). 2018. Rolling out Universal Credit.
United Kingdom, Department for Work and Pensions (UK DWP). n.d. New to Universal Credit—Your responsibilities. Understanding Universal Credit. Accessed August 12, 2022. https://www.understandinguniversalcredit.gov.uk/new-to-universal-credit/your-responsibilities/
United Kingdom, Department for Work and Pensions (UK DWP). 2016. Universal Credit digital service (2016). YouTube video, 4:07. February 17, 2016. https://www.youtube.com/watch?v=_PwTeW0yqbQ
United Kingdom, Department for Work and Pensions (UK DWP). 2021. Universal Credit statistics, 29 April 2013 to 14 January 2021. https://www.gov.uk/government/statistics/universal-credit-statistics-29-april-2013-to-14-january-2021/universal-credit-statistics-29-april-2013-to-14-january-2021
Valverde, Mariana. 2021. Smart cities as a civic leaders’ survivors game: The lure of innovation in a competitive world. In Smart cities in Canada: Digital dreams, corporate designs, ed. Valverde, Mariana and Flynn, Alexandra, 21–35. Toronto: Lorimer.
Valverde, Mariana, Johns, Fleur, and Raso, Jennifer. 2018. Governing infrastructure in the age of the “art of the deal”: Logics of governance and scales of visibility. PoLAR: Political and Legal Anthropology Review 41, no. S1 (September): 118–32.
Van Den Meerssche, Dimitri. 2022. Virtual borders: International law and the elusive inequalities of algorithmic association. European Journal of International Law 33, no. 1 (February): 171–204.
Veale, Michael, and Brass, Irina. 2019. Administration by algorithm? Public management meets public sector machine learning. In Algorithmic Regulation, ed. Yeung, Karen and Lodge, Martin, 121–49. Oxford: Oxford University Press.
Weber, Max. 1978. Selections in translation, ed. Runciman, W. G. Cambridge: Cambridge University Press.
Winner, Langdon. 1980. Do artifacts have politics? Daedalus 109, no. 1 (Winter): 121–36.
Wylie, Bianca. 2022. ArriveCAN and the administrative state: Mundanity and social licence. Medium (blog). July 22. https://biancawylie.medium.com/arrivecan-and-the-administrative-state-mundanity-and-social-license-6f4e6e5d3cc8
Yeung, Karen, and Lodge, Martin, eds. 2019. Algorithmic regulation. Oxford: Oxford University Press.
Yeung, Karen. 2023. The new public analytics as an emerging paradigm in public sector administration. Tilburg Law Review 27, no. 2 (February): 1–32.