
Big Tech Companies’ Obligations under International Human Rights Law

Published online by Cambridge University Press:  06 March 2025

Yuval Shany*
Affiliation:
Hersch Lauterpacht Chair in Public International Law, Hebrew University of Jerusalem (Israel) Visiting Fellow, Ethics in AI Institute, Oxford University, Oxford (United Kingdom)

Abstract

This article critically evaluates three attempts to overcome the problem of fit between international human rights law (IHRL) and the digital ecosystem, through an expansion of the existing IHRL framework to big tech companies. The attempted expansions considered here include standard-setting initiatives involving the imposition on states and companies – large technology companies and other business enterprises – of certain duties to apply IHRL in connection with potentially rights-infringing business practices. As I discuss below, most of the duties identified and/or developed in this regard within the context of the United Nations Human Rights Council’s Business and Human Rights (BHR) agenda constitute soft law for the time being. Negotiations for a Legally Binding Instrument (LBI) designed to strengthen the applicable legal framework are ongoing, but their prospects of success remain unclear. Another attempted expansion involves self-regulation by big tech companies through corporate policies aimed at incorporating certain IHRL norms into their business practices. The efforts of Meta to incorporate IHRL into its corporate policies and to offer an IHRL grievance mechanism through the operation of the Meta Oversight Body (focusing mainly on protecting freedom of expression, as articulated in the International Covenant on Civil and Political Rights) represent a key case study in this regard. A third attempt to address the aforementioned problem of fit that I consider below involves efforts by special procedures of the Human Rights Council to exercise their standard-setting and monitoring functions in connection with the practices of large technology companies. The work of the Special Rapporteur on Freedom of Opinion and Expression in this area is particularly noteworthy. These three examples of expansion attempts provide useful insights into the potential of IHRL to serve as a legal framework to govern the operations of large technology companies, as well as the limits of that potential.

The article starts by discussing recent developments in the BHR agenda, including efforts to conclude an LBI. The extent to which this agenda represents a promising avenue for holding large technology companies accountable to IHRL norms is then considered in the second and third parts of the article, which discuss two normative initiatives that derive largely from the BHR agenda: the former specifically examines Meta’s espousal of IHRL as part of its corporate BHR policy, and the latter considers attempts by Human Rights Council special procedures to apply IHRL to technology companies.

Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press in association with the Faculty of Law, the Hebrew University of Jerusalem.

1. Introduction

Modern international human rights law (IHRL) developed after 1945 as an international legal framework designed to limit state power and prevent abuse of governmental authority.Footnote 1 It was created as part of a new world order heralded by the United Nations organisation against the backdrop of the atrocities of the Second World War and the extreme forms of persecution and oppression introduced by totalitarian regimes in the inter-war period. Extensive evidence for the state-centric configuration of IHRL is found in the numerous treaties that constitute this normative regime by placing legal obligations on states,Footnote 2 in the consignment of human rights to the state-centric normative and institutional framework of public international law,Footnote 3 and in the historical linkage between IHRL and domestic human rights norms and institutions, which introduced from the eighteenth century onwards political and legal constraints on governmental power.Footnote 4

Indeed, state-centrism runs deep and wide inside and throughout IHRL. The enforcement machinery developed under global and regional human rights treaties focuses on the legal responsibility of states for human rights violations. States are expected to report periodically to a variety of IHRL monitoring bodies, to reply to individual petitions brought against them by human rights victims or other complainants, and to strive to give effect to country-specific recommendations relating to their human rights records. Even when acknowledging the potential involvement of private and other non-state actors in violations of IHRL norms (such violations are typically referred to, when perpetrated by non-state actors, as ‘abuses’),Footnote 5 IHRL focuses on the obligations of states to prevent such violations/abuses, including on their duty to facilitate and promote the enjoyment of human rights by rights holders through regulating, when necessary, the operations of private and other non-state actors.Footnote 6

As a result of this structural state-centrist configuration, questions arise concerning the fit between IHRL and the vast and diverse array of interactions and applications involving a host of private actors using digital technology and impacted thereby (i.e., the digital ecosystem).Footnote 7 Arguably, the ability to enjoy digital human rights – that is, the human rights that are due to individuals in connection with their use of digital technology or the effects of such technology on themFootnote 8 – depends more on the power relations holding between individual digital technology users or ‘data subjects’ and large technology companies that provide digital services and products, than on the relationship between such individuals and the states in which they live or work. While questions of the fit of IHRL norms and institutions arise in many situations where private companies, through their activities, have an impact on the enjoyment of human rights by individuals, it looks as if big tech companies – i.e., the technology companies that occupy the largest market share or the largest stream of revenues in the technology sector – raise a unique institutional challenge for IHRL. This is because the relationship between large technology companies and their users differs in some important respects from that existing between individuals and other private companies, such as private security companies, banks or industrial factories.

First, large technology companies present a unique regulatory challenge. They operate across a wide variety of national jurisdictions, and their products and services typically involve an internationally diverse supply chain and/or value chain. Such transnational features are facilitated and exacerbated by the highly mobile nature of digital technology, and the global nature of the internet and other online platforms for the dissemination of digital goods and services. The international reach, as well as the sheer economic size, of leading technology companies – such as Google, Apple, Meta, Amazon, Microsoft, X (Twitter) and OpenAI – might render them both too global and too large to be regulated by most states, as traditional IHRL doctrines on positive state obligations envision.Footnote 9 In addition, the quick and dynamic nature of the development of digital technology itself – including its fast evolution from foundational models to applied products – creates serious challenges in timing any regulatory intervention by any specific state with regard to such moving technological targets, and in enabling states and other actors to evaluate the actual human rights impacts of new and emerging technology.Footnote 10

Second, the interface between the operations of large technology companies and individual users is exceptionally broad in terms of the numbers of individuals affected by digital products and services. The interaction between big tech companies and individual users is also exceptionally consequential for individuals and for societies as a whole, in view of the fundamental nature of the individual and group rights implicated thereby. In fact, certain technology companies have positioned themselves as key ‘gatekeepers’ in establishing or constraining the enjoyment, on a global scale, of fundamental human rights such as freedom of expression, the right to privacy and the right to equality.Footnote 11

Third, the seller–buyer or service provider–customer relationship, which exists between large technology companies and individual users, and the nature of the products and services the former provide, put big tech companies in a unique position to ensure, on an ongoing basis, the enjoyment of rights by individual users or data subjects.Footnote 12 In fact, with regard to several digital human rights – such as those involving the exercise of control by data subjects over personal data (e.g., the right to data portabilityFootnote 13 or the right to be forgottenFootnote 14) and algorithmic decision making (e.g., the right to a human decision makerFootnote 15) – large technology companies might be expected to uphold rights in ways that have no immediate offline equivalent, and which most states were never in a position to uphold.Footnote 16 The lack of offline IHRL experience from which to draw might mean that both states and companies would face numerous difficulties in implementing new digital human rights.

This article critically evaluates three prominent attempts to overcome the problem of fit between IHRL and the digital ecosystem, by way of an expansion of the existing IHRL framework to big tech companies. Through a discussion of these three attempts, I will identify some structural constraints that hamper the application of IHRL to big tech companies: for example, excessive reliance on voluntary commitments, the foregrounding of some digital rights and the backgrounding of others, limited access to effective remedies, and the inadequacy of international enforcement mechanisms to address the violation of digital human rights.

The expansions considered herein include standard-setting initiatives involving the imposition on states and companies – large technology companies and other business enterprises – of certain duties to apply IHRL in connection with rights-infringing business practices. As I discuss below, most of the duties identified and/or developed in this regard within the context of the Business and Human Rights (BHR) agenda of the UN Human Rights Council (HRC) constitute soft law for the time being. Negotiations for a Legally Binding Instrument (LBI) designed to strengthen the applicable legal framework are ongoing, but their prospects of success remain unclear. Another attempted expansion involves BHR-inspired self-regulation by big tech companies through corporate policies aimed at incorporating certain IHRL norms into their business practices. The efforts of Meta to incorporate IHRL into its corporate policies and to offer an IHRL grievance mechanism through the operation of the Meta Oversight Body (MOB) (which focuses mostly on protecting freedom of expression, as articulated in the International Covenant on Civil and Political Rights (ICCPR))Footnote 17 represent a key case study in this regard. A third attempt to address the aforementioned problem of fit that I consider below involves efforts by special procedures of the HRC to exercise their standard-setting and monitoring functions in connection with the practices of large technology companies. The work of the Special Rapporteur on Freedom of Opinion and Expression is particularly noteworthy in this area. These three examples provide useful insights into the potential of IHRL to serve as a legal framework to govern the operations of large technology companies, as well as the structural limits of this potential.

Section 2 of the article discusses recent developments in the BHR agenda, including efforts to conclude an LBI. The extent to which this agenda represents a promising avenue for holding large technology companies accountable to IHRL norms is considered mostly in Sections 3 and 4, which discuss two normative initiatives that largely derive from the BHR agenda: Section 3 specifically considers Meta’s espousal of IHRL as part of its corporate BHR policy, and Section 4 considers attempts by HRC special procedures to apply IHRL to technology companies. Section 5 concludes.

2. The Business and Human Rights agenda and the draft Legally Binding Instrument

The 2011 UN Guiding Principles on Business and Human Rights (UNGP)Footnote 18 constitute an important milestone in ongoing international efforts to hold international businesses accountable under international human rights standards. They were adopted on the basis of the 2008 Protect, Respect, Remedy Framework – a set of principles introduced by John Ruggie, the UN Secretary-General’s Special Representative on Human Rights and Transnational Corporations and Other Business Enterprises,Footnote 19 which were partially built, in turn, on lessons learnt from a previous attempt to promulgate norms of business and human rights.Footnote 20 The Sub-Commission on the Promotion and Protection of Human Rights adopted the said norms in 2003, but the Commission on Human Rights rejected them shortly thereafter.

Despite their significance as a conceptual framework for articulating desirable standards of conduct, the UNGP do not impose concrete legal obligations on business entities. Rather, they call on states to undertake a variety of regulatory and policy functions vis-à-vis business enterprises. The first pillar of the UNGP demands that states afford protection to individuals located within their territory or under their jurisdiction against human rights abuses committed by business enterprises. It also recommends that states set out normative expectations for business enterprises domiciled in their territory or subject to their jurisdiction, take additional steps to control the operations of business enterprises with which they have a special nexus and of business enterprises involved in conflict situations, and extend their legal obligations in the field of BHR to multilateral settings. States should also take appropriate steps, pursuant to the third pillar of the UNGP, to ensure that those affected by abuses within their territory or jurisdiction have access to an effective remedy. Parts of the UNGP reiterate existing legal obligations that states incur under IHRL, whereas other parts elaborate new soft law standards that should govern the relations between states and business enterprises. Still, in both normative configurations – i.e., when providing greater specificity to lex lata and complementing existing law with additional norms that can be regarded as lex ferenda – the UNGP arguably follow the traditional state-centric logic of IHRL.

The second pillar of the UNGP, which covers corporate responsibility to respect human rights, represents a minor deviation from the dominant state-centric IHRL framework. It calls on business enterprises directly to respect human rights, and proposes in this respect a number of soft law standards of conduct (such as to avoid and address adverse human rights impacts, have in place human rights policies and processes, apply human rights due diligence, and report on how human rights concerns are addressed). In the same vein, the third pillar calls on business enterprises to develop and participate in effective grievance mechanisms.

Particularly significant, for our purposes, is the statement found in the commentary to Principle 11 of the UNGP,Footnote 21 according to which:

The responsibility to respect human rights is a global standard of expected conduct for all business enterprises wherever they operate. It exists independently of States’ abilities and/or willingness to fulfil their own human rights obligations, and does not diminish those obligations. And it exists over and above compliance with national laws and regulations protecting human rights.

The UNGP framework does not identify explicitly the normative source of this independent standard of expected conduct, although Ruggie suggested that it stems from the ‘basic expectations of society’.Footnote 22 Another significant normative statement is found in Principle 12 of the UNGP, which provides:

The responsibility of business enterprises to respect human rights refers to internationally recognized human rights – understood, at a minimum, as those expressed in the International Bill of Human Rights and the principles concerning fundamental rights set out in the International Labour Organization’s Declaration on Fundamental Principles and Rights at Work.

The combined effect of these two statements is that the drafters of the UNGP appear to have considered the responsibility of business enterprises to respect IHRL to derive from the normative justifications or social expectations which underlie basic IHRL norms. Furthermore, the drafters appear to have presumed that IHRL serves as a suitable normative framework to govern broad categories of interaction between business enterprises and individuals impacted by their activities, notwithstanding the fact that these IHRL standards were originally developed for application by states and despite the great diversity of business enterprises and interactions covered by the UNGP. In any event, the responsibility of business enterprises to respect IHRL pursuant to the UNGP does not depend formally on the question of whether or not the states in which they operate or to which they are tied have adopted specific treaties comprising the International Bill of Human Rights, such as the ICCPR or the ICESCR. It is a direct, not derivative, form of responsibility.

Another notable aspect of the UNGP is that the second pillar focuses on the responsibility of business enterprises to respect IHRL (partly corresponding to negative obligations of states to uphold IHRL), and does not introduce a responsibility to protect or fulfil IHRL (which would have partly corresponded to positive obligations of states to uphold IHRL).Footnote 23 In fact, the Commentary to Principle 11 provides explicitly that ‘business enterprises may undertake other commitments or activities to support and promote human rights, which may contribute to the enjoyment of rights. But this does not offset a failure to respect human rights throughout their operations’. This language suggests that the responsibility to respect is qualitatively different from and normatively superior to any other IHRL responsibility that may be actually assumed by business enterprises.

The UNGP have been criticised for their over-reliance on voluntarism, given the lack of an identifiable legal source for the soft ‘responsibilities’ they introduce, the limited effectiveness of the soft enforcement methods for inducing compliance by business enterprises with IHRL standards,Footnote 24 and the analytical unclarity of the distinctions offered between duties of states and business enterprises and between the responsibility to respect IHRL and other potential responsibilities (e.g., to protect and fulfil).Footnote 25 The measures taken by certain domestic legal systems to impose legally binding reporting and due diligence obligations on transnational corporations in connection with certain IHRL norms,Footnote 26 and the willingness of some national courts to impose legal liability on corporations that have failed to do so,Footnote 27 have further nurtured perceptions about the relative weakness of the UNGP framework.Footnote 28 Such criticisms and perceptions have led the HRC and other international actors to undertake efforts to upgrade the UNGP through the adoption of an LBI on BHR.Footnote 29

The LBI drafting process has generated over the years a number of drafts. The updated third draft from July 2023 contains in Article 2(b) the following statement of aim for the LBI: ‘To clarify and ensure respect and fulfilment of the human rights responsibilities of business enterprises’. In the same vein, Article 2(c) provides as an aim: ‘To prevent the occurrence of human rights abuses in the context of business activities by effective mechanisms for monitoring, enforceability and accountability’. In order to implement these aims, the draft LBI strengthens the first and third pillars of the UNGP. It requires states to ‘adopt appropriate legislative, regulatory, and other measures’ in order to ‘prevent the involvement of business enterprises in human rights abuse’, ‘ensure respect by business enterprises for internationally recognized human rights and fundamental freedoms’ and ‘ensure the practice of human rights due diligence by business enterprises’.Footnote 30 It also provides victims of human rights abuses with the right to bring claims before the domestic courts and ‘non-judicial grievance mechanisms’,Footnote 31 and imposes on states:Footnote 32

[a duty to] adopt such measures as may be necessary to establish a comprehensive and adequate system of legal liability of legal and natural persons conducting business activities, within their territory, jurisdiction, or otherwise under their control, for human rights abuses that may arise from their business activities or relationships, including those of transnational character.

In this last connection, states must ensure that ‘the liability of a legal person is not contingent upon the establishment of liability of a natural person’.Footnote 33 Significantly, the updated third draft also requires states:Footnote 34

[to] take necessary measures to ensure that business enterprises take appropriate steps to prevent human rights abuse by third parties where the enterprise controls, manages or supervises the third party, including through the imposition of a legal duty to prevent such abuse in appropriate cases.

If adopted, the LBI would create extensive and legally binding obligations for states to regulate and control the operations of business entities that have an impact on the enjoyment of IHRL by individuals. While such legal obligations expand upon the first and third pillars of the UNGP, they remain based largely on existing IHRL standards (providing them, however, with more specific and concrete meaning).Footnote 35 Still, they would be monitored by a dedicated treaty body structured in accordance with the model available under other global IHRL treaties.Footnote 36

The LBI would also indirectly strengthen the second and third pillars of the UNGP as applicable to business enterprises. By requiring states to ensure that liability for IHRL abuses is imposed on business enterprises under domestic law and that effective remedies are available to victims, the LBI would contribute to the blurring of boundaries between social responsibilities and legal obligations. Still, this would not amount to direct application of IHRL to business entities, or to the provision of a direct right of access by victims of IHRL abuses to international enforcement mechanisms in order to bring legal proceedings against rights-infringing private corporations.

In other words, despite growing acceptance of the premise that business enterprises should comply with IHRL standards, the LBI retains a state-centric approach (in fact, the updated third draft explicitly reaffirms that states have the ‘primary obligation to respect, protect, fulfill and promote human rights and fundamental freedoms’).Footnote 37 The main difference between the UNGP and the LBI appears to be the latter’s introduction of stronger state obligations relating to the manner in which its domestic laws, regulations, policies, remedies and monitoring mechanisms apply to business enterprises.

One interesting normative development in the LBI, which might suggest a gradual transformation in the nature of corporate responsibility existing under the UNGP, is the aforementioned allusion in Article 6.5 to the duty of states to ensure that business enterprises undertake positive measures to prevent abuses by third parties they control, manage or supervise.Footnote 38 Such a development of positive responsibilities by business enterprises mirrors earlier developments in IHRL relating to positive state obligations intended to protect human rights victims from violations by private actors.Footnote 39 Given the dominant role of supply and value chains in the digital economy, such a development could have significant repercussions for the IHRL duties of large technology companies.

3. The Meta experience

Meta’s engagement with IHRL standards represents a unique case study for the application of the UNGP – that is, a high-profile attempt by a major technology company to voluntarily implement the BHR framework, through the direct application of certain IHRL standards to some of its operations, and the availability of an institutionalised and highly accessible grievance mechanism potentially providing remedies to millions of victims of IHRL abuses. The combined effect of direct application and enforcement of IHRL and the significant transnational governance functions actually exercised by Meta offers a potential model for regarding certain business enterprises as IHRL duty holders in a new way, which puts them in a status comparable with that held by states.

Like other online platforms, Meta’s Facebook, Instagram and Threads social media platforms self-regulate their customers’ interaction on and with the platform through a series of legal documents, such as terms of service,Footnote 40 privacy policyFootnote 41 and community standards.Footnote 42 The latter refer to Meta’s ‘commitment to voice’ and expression, but note that these can be limited in the service of company values – authenticity, safety, privacy and dignity. The Facebook community standards also affirm that when making decisions about content removal, ‘[w]e look to international human rights standards to make these judgments’. Specific policies have been published by Meta over the years with regard to violence and criminal behaviour, safety, objectionable content, integrity and authenticity, intellectual property, content-related requests and decisions, and other policy areas.Footnote 43

In view of concerns raised about the adequacy of Facebook’s content moderation policies and about their manner of implementation, the company’s CEO – Mark Zuckerberg – announced in 2018 his decision to establish an independent body to review some of the company’s content moderation decisions.Footnote 44 The MOB began its operations in May 2020, with its functions and competencies spelt out in a number of documents, including a CharterFootnote 45 and Bylaws.Footnote 46 A number of provisions in these instruments allude to human rights standards: Article 2.2 of the Charter provides that ‘[w]hen reviewing decisions, the board will pay particular attention to the impact of removing content in light of human rights norms protecting free expression’; Article 1.4.1 of the Bylaws requires the MOB to include in its annual reports ‘[a]n analysis of how the board’s decisions have considered or tracked the international human rights implicated by a case’; and Article 2.2.3.2 of the Bylaws requires Facebook to notify posting and reporting users about MOB decisions in a manner ‘guided by relevant human rights principles’.

In parallel with these developments, Meta published, in March 2021, its Corporate Human Rights Policy.Footnote 47 The first segment, dealing with ‘our commitments’, provides the following language, in part lifted directly from the UNGP:

We are committed to respecting human rights as set out in the United Nations Guiding Principles on Business and Human Rights (UNGPs). This commitment encompasses internationally recognized human rights as defined by the International Bill of Human Rights — which consists of the Universal Declaration of Human Rights; the International Covenant on Civil and Political Rights; and the International Covenant on Economic, Social and Cultural Rights — as well as the International Labour Organization Declaration on Fundamental Principles and Rights at Work. Depending on circumstances, we also utilize other widely accepted international human rights instruments, including the International Convention on the Elimination of All Forms of Racial Discrimination; the Convention on the Elimination of All Forms of Discrimination Against Women; the Convention on the Rights of the Child; the Convention on the Rights of Persons with Disabilities; the Charter of Fundamental Rights of the European Union; and the American Convention on Human Rights … We recognize the diversity of laws in the locations where we operate, and where people use our products. We strive to respect domestic laws. When faced with conflicts between such laws and our human rights commitments, we seek to honor the principles of internationally recognized human rights to the greatest extent possible. In these circumstances we seek to promote international human rights standards by engaging with governments, and by collaborating with other stakeholders and companies.

Other sections of the Policy cover implementation (including internalising human rights into relevant policy instruments), governance, oversight and accountability processes, conducting human rights due diligence, providing access to remedies and protecting human rights defenders.

The upshot of these normative developments has been the emergence within one large technology company – Meta – of a thick net of IHRL standards that purport to shape its approach to content moderation, corporate governance, oversight and accountability, and set normative expectations for Meta clients. These standards consist of major IHRL treaties, which Meta has undertaken to respect. Significantly, Meta has agreed, in line with the UNGP,Footnote 48 to follow IHRL standards, to the greatest extent possible, even in the face of incompatible domestic legal standards that govern its operations in different jurisdictions. This suggests that IHRL occupies a uniquely high status in the normative hierarchy of sources to which the company strives to adhere. Notably, unlike many corporate commitments and human rights policies, whose legal status is less than clear – they are general representations about standards applicable to business operations rather than contractual arrangements – Meta’s community standards, which are forms of contractual arrangement,Footnote 49 reflect IHRL standards in part and stipulate that they should be construed in accordance with IHRL.

A review of the practice of the MOB in applying these normative standards confirms that IHRL holds pride of place in the process of the quasi-judicial review that it exercises over Meta content moderation decisions. Almost all MOB decisions are structured in a way that alludes to IHRL as relevant standards, alongside Meta’s Community Standards and Values, and the analysis of compliance with Meta’s IHRL responsibilities occupies a significant part of the decision. Such analysis follows, as a rule, a typical human rights analysis methodology: identification of the right allegedly infringed (most cases implicate freedom of expression issues), and consideration of whether the limitation of rights was permissible on the basis of its legality (including clarity and accessibility of the applicable Meta instruments), legitimate aim, necessity and proportionality. The decisions often cite normative outputs of IHRL bodies, such as Human Rights Committee general commentsFootnote 50 and views in individual communications,Footnote 51 UN Human Rights Council resolutionsFootnote 52 and UN Special Rapporteur reports.Footnote 53 The decisions also cross-cite one another, contributing thereby to the creation of a significant body of MOB jurisprudence on IHRL.

The experience of Meta offers a possible model for the application of IHRL beyond the state. Parts of the normative framework it applies remain state-centric – the domestic laws of the different states in which it operates, and regional and international standards that such states created. Still, significant parts of its transnational governance structures are independent of states, and directly apply autopoietic norms (community standards, company values, MOB case law), which mirror IHRL. With regard to IHRL, it appears as if the MOB treats Meta’s exercise of governance power over speech as analogous to state government power. In fact, the Board sometimes notes weaknesses in the application of IHRL by a relevant state as a reason for affording users a higher level of protection by Meta. In the Öcalan Isolation case, for example, the MOB wrote: ‘The Board is particularly concerned about Facebook removing content on matters in the public interest in countries where national, legal and institutional protections for human rights, in particular freedom of expression, are weak’.Footnote 54 This suggests a reversal of roles. Whereas the UNGP envisioned states as the primary IHRL duty holders, which ought to develop and put in place safeguards against IHRL abuses by business enterprises, the Meta experience proposes that there could be situations where the business enterprise might be expected to serve as a corrective device for IHRL violations committed by states.Footnote 55 Furthermore, given the almost global reach of companies like Meta, their impact on the global interpretation and application of IHRL, and especially on the application of freedom of expression, exceeds that of most states.

While the Meta model is interesting and promising (albeit not free from criticism),Footnote 56 it is important to acknowledge that it remains voluntary in nature, with the scope of decisions and policies reviewed determined unilaterally by Meta. The voluntariness of the model also implies that Meta may decide in the future to stop replenishing the trust fund that facilitates the long-term operations of the MOB,Footnote 57 or stop providing it with the necessary staffing or logistical support.Footnote 58 In addition, Meta retains broad discretion over whether or not to adopt certain general policy recommendations made by the Board.Footnote 59

Furthermore, it remains to be seen to what extent the model will be replicated by other large technology companies. So far, no other major big tech company has set up an MOB-like structure, and no company has followed up on Meta’s suggestion that they consider joining the MOB itself.Footnote 60 To be sure, many other technology companies have also introduced codes of conduct or community guidelines that internalise IHRL standards, such as freedom of expression and the prohibition of hate speech,Footnote 61 right to privacy,Footnote 62 non-discrimination,Footnote 63 protection of personal security,Footnote 64 child safetyFootnote 65 and intellectual property rights.Footnote 66 Some of these companies also regularly publish detailed human rights policy statements.Footnote 67 While the scope, extent, independence and permanence of the different internal oversight mechanisms resorted to by other technology companies vary,Footnote 68 they do not offer an IHRL-applying grievance mechanism that resembles the MOB; nor do they allow for ongoing normative engagement with IHRL in the same way that the MOB facilitates.

The diversity in the IHRL practices of big tech companies underscores the continued voluntariness of much of the BHR framework that governs their engagement with IHRL – effectively allowing them to select the level of desirable engagement with IHRL – and the absence of strong and effective mechanisms that operate across the board for applying IHRL norms and for providing remedies to victims. So, while human rights due diligence proceduresFootnote 69 and compliance or transparency reporting on certain IHRL-related practicesFootnote 70 are gaining traction across large technology companies, and although some companies do involve third parties in addressing complaints concerning ethical and compliance issues,Footnote 71 the creation of a fully fledged independent IHRL-applying complaint mechanism – à la the MOB – has not yet attained a ‘gold standard’ status.

4. The special procedures mechanisms

Another development that sheds more light on the fit between IHRL and the responsibilities of large technology companies is the increasing attempt by special mandate holders or special procedures operating alongside the HRC to extend their standard-setting and compliance monitoring mechanisms to such companies. The special procedures of the HRC consist of some 60 working groups and special rapporteurs entrusted with reviewing thematic IHRL issues and specific country situations. In their official capacity, they produce annual reports, conduct country visits and respond to individual complaints (or communications), as well as produce other public statements and documents, including comments on pending legislation and policy measures. As ‘Charter Bodies’ – drawing their legal authority ultimately from the UN Charter – they can apply specific treaties binding on the states subject to their review, or other universal IHRL standards.Footnote 72

There are four principal ways in which special mandate holders can extend their procedures to large technology companies through issuing recommendations to them or to relevant states: (i) the elaboration in thematic reports of IHRL standards relevant to the technology sector, (ii) the inclusion of the private technology sector in country visits, (iii) the issuance of comments to big tech companies on specific measures that were undertaken or should be undertaken by the latter, and (iv) the processing of individual communications against specific technology companies. The mandate of the Special Rapporteur on Freedom of Opinion and Expression stands out as a particularly interesting example of an attempt to extend its normative and institutional purview to the technology sector. As a result, the focus of this section is on the activities of this mandate in that regard.

4.1. Thematic reports

The work of special procedures in elaborating IHRL standards applicable to big tech corporations appears to be an extension of the work of the HRC and other international and multi-stakeholder actors in the BHR field. For example, in his 2015 report on encryption and anonymity in digital communications, the Special Rapporteur on Freedom of Opinion and Expression, David Kaye, discussed the role of corporations and recommended that companies should implement BHR due diligence and transparency practices, embrace certain technological solutions and conduct themselves, in certain respects, in the same ways in which states are expected to behave:Footnote 73

[C]orporate actors should review the adequacy of their practices with regard to human right norms … Companies, like States, should refrain from blocking or limiting the transmission of encrypted communications and permit anonymous communication… Corporate actors that supply technology to undermine encryption and anonymity should be especially transparent as to their products and customers.

In a subsequent report on freedom of expression in the digital age, from 2016, the Special Rapporteur stated his intention to use the mandate to monitor the implementation of IHRL policies by technology companies, using procedures originally developed for monitoring IHRL compliance by states:Footnote 74

Beyond adoption of policies, private entities should also integrate commitments to freedom of expression into internal policymaking, product engineering, business development, staff training and other relevant internal processes. The Special Rapporteur will aim to explore policies and the full range of implementation steps in a number of ways, including through company visits.

In a 2017 report on internet and telecommunication access,Footnote 75 the Special Rapporteur elaborated the positive duties of technology companies, moving beyond the focus on responsibility to respect found in the UNGP – underscoring thereby the interconnectedness of negative and positive duties – and portraying companies as counterweights to states, especially when the latter engage in potential IHRL violations:

In this spirit, in addition to high-level policy commitments to human rights, the industry should allocate appropriate resources towards the fulfilment of these commitments, including due diligence, rights-oriented design and engineering choices, stakeholder engagement, strategies to prevent or mitigate human rights risks, transparency and effective remedies. … [W]hen States request corporate involvement in censorship or surveillance, companies should seek to prevent or mitigate the adverse human rights impacts of their involvement to the maximum extent allowed by law. In any event, companies should take all necessary and lawful measures to ensure that they do not cause, contribute or become complicit in human rights abuses.

The Special Rapporteur’s 2018 report on content moderation further called on information and communication technology (ICT) companies to rely on IHRL in their policies and processes so as to ‘respect democratic norms and counter authoritarian demands’.Footnote 76 As part of their transparency requirements, he called on them to develop a ‘case law’ that would frame the interpretation of their content moderation standards and implementation practices,Footnote 77 thereby foreshadowing the work of the MOB (which started its operations two years later). A later report on online hate speech explicitly mentions the IHRL standards that companies are expected to apply in their relevant content moderation policies.Footnote 78

In the 2019 report on surveillance and human rights, Special Rapporteur Kaye called for a multi-stakeholder approach to develop rights-based regulatory standards and implementation initiatives.Footnote 79 He also called on private surveillance companies to introduce robust safeguards against IHRL violations, including suitable contractual clauses, technological features against misuse, and effective grievance and remedial mechanisms.Footnote 80 In the same vein, the next Special Rapporteur, Irene Khan, in 2021 called on ICT companies to ‘establish internal appeals mechanisms for a broader range of content moderation decisions and types of content, such as coordinated inauthentic behaviour’ and to ‘explore the creation of external oversight mechanisms such as social media councils’.Footnote 81 In the same report she criticised technology companies for failing ‘to apply their policies consistently across all geographical areas or to uphold human rights in all jurisdictions to the same extent’.Footnote 82

The upshot of these thematic reports is that the Special Rapporteur on Freedom of Opinion and Expression has made extensive use of his/her mandate to define and develop the IHRL obligations of technology companies, to identify shortcomings in their policies and actual practices and to recommend improvements in policy design, implementation measures, and accountability and transparency mechanisms. The harnessing of IHRL mandates for advancing standard setting for, and review of the practices of, technology companies can be found, albeit to a lesser extent, in the work of certain other special mandate holders as well.Footnote 83

4.2. Country visits

As indicated above, the Special Rapporteur on Freedom of Opinion and Expression expressed in 2016 an interest in holding a company visit – presumably along lines similar to those used in country visits, that is, visits dedicated to investigating human rights conditions in specific countries. The Rapporteur, David Kaye, did visit Silicon Valley, albeit not in the form of a ‘country visit’ but as part of research he conducted in connection with a 2018 report on content regulation.Footnote 84 This followed an earlier visit to San Francisco by the Working Group on Business and Human Rights, which took place as part of its US country visit, and was dedicated to reviewing the ICT sector’s engagement with IHRL in managing supply chains and in addressing issues relating to the right to privacy and freedom of expression.Footnote 85 At least one other mandate holder expressed an interest in conducting a separate ‘country visit’ to Silicon Valley with a view to assessing the IHRL practices of technology companies based there.Footnote 86

4.3. Comments on policies and measures

To date, the Special Rapporteur on Freedom of Opinion and Expression has sent at least seven letters to large technology companies, alerting them to IHRL concerns about recent policies or measures that they have allegedly undertaken or considered undertaking. These include letters concerning Apple’s removal of VPN applications in China,Footnote 87 Meta’s need to ensure the independence of the MOB and its ability to rely on IHRL,Footnote 88 a comment on the MOB Al Jazeera decision relating to arbitrary/biased content moderation in the Israeli–Palestinian context,Footnote 89 a letter on Meta’s policy relating to cases involving online gender violence,Footnote 90 a joint letter with the Special Rapporteur on Freedom of Assembly and of Association to ICANN concerning the transfer of the Public Interest Registry to a private equity firm, letters to the NSO Group concerning potential abuse of their spyware products and their new whistleblower policy,Footnote 91 and a letter written together with the Working Group on Business and Human Rights to TikTok concerning the need to integrate IHRL in their new content moderation policy.Footnote 92 Whereas most of the letters remain unanswered, in a few cases the companies responded and provided more information, suggesting some interest in engaging with the Special Rapporteur around the concerns raised.Footnote 93

4.4. Communications

A similar letter-writing procedure – which partly overlaps with the policies and measures comments procedure – exists in connection with communications submitted to special procedures by individual complainants. They too have resulted on occasion in the writing of letters by the Special Rapporteur on Freedom of Opinion and Expression to large technology companies (almost always jointly with other special procedures), expressing concern about the practices in question and/or asking for clarification. In 2023, for example, the Special Rapporteur was involved in writing letters to Google in relation to cyber attacks against a human rights defender,Footnote 94 TikTok and Omegle relating to the streaming of online sexual activity,Footnote 95 Telegram in relation to the dissemination of hate speech in MyanmarFootnote 96 and the NSO Group with regard to the application of spyware against human rights defenders.Footnote 97 By 2023, at least 14 letters had been sent by the Special Rapporteur on Freedom of Opinion and Expression (the first letter on the Special Rapporteur’s website dates from 2018).Footnote 98 Replies by technology companies to special procedures communication letters, however, have been sporadic.Footnote 99

The experience of the Special Rapporteur on Freedom of Opinion and Expression and other special procedures involved in elaborating standards and monitoring the IHRL practices of large technology companies suggests that there are now enough legal sources and conceptual building blocks to provide meaningful normative guidance to such companies with regard to certain areas of IHRL, and that special procedures have some capacity to monitor their activities. Going forward, the UN Working Group on Business and Human Rights could serve as a focal point within the UN for corporate accountability for human rights abuses, including in the digital sector.

Still, the looseness of the BHR legal framework, the paucity of tailor-made IHRL norms addressing the rights of users of digital technology, and the overall weakness of the special procedures as enforcement mechanisms raise the concern that this procedural avenue offers a rather limited basis for controlling the IHRL practices of large technology companies. One may recall in this regard that, for states, the special procedures are one among numerous legal controls, alongside treaty bodies, regional human rights systems, domestic constitutional law, etc. Most of these legal controls are unavailable or only partly available for large technology companies: treaty bodies and regional courts remain focused on state responsibility for IHRL violations, and domestic law treats the legal responsibility of technology companies primarily through private law, rather than public law, lenses.

5. Conclusions

The question of fit between IHRL and the operations of large technology companies that affect the enjoyment of rights by users of digital technology remains a vexing problem in law and in practice. Driven by an ethos of effective protection of IHRL, international actors have been able to develop in the twenty-first century a BHR framework that gives concrete content to the positive obligations of states to ensure the enjoyment of IHRL by individual rights holders and to provide access to effective remedies, and that supplements existing IHRL with certain soft law norms that apply both to states and to business enterprises. This framework, codified in the UNGP and expected to be upgraded some day by the LBI, however, remains state-centric in its orientation. Still, it applies some, albeit indirect, pressure on companies, including big tech companies, to conform their activities to IHRL standards, and to undertake substantive and procedural measures to give effect to the normative justifications and social expectations that undergird the UNGP. The rise of IHRL due diligence obligations has been particularly noteworthy in this regard.

Despite representing a significant normative and conceptual development, the BHR framework suffers from a number of weaknesses. It is voluntary in nature, allowing companies, in practice, to select the level of engagement with IHRL they wish to assume. It also lacks, for the time being, effective oversight mechanisms. Such lack of ‘teeth’ is particularly problematic when one confronts the need to control the operations of large technology companies that conduct business on the global level, far beyond the regulatory reach and capacity of most states. Still, recent advances in the regulation of technology by the European Union (EU) – including the Digital Services Act,Footnote 100 the Digital Markets ActFootnote 101 and the AI ActFootnote 102 (and before that, the General Data Protection RegulationFootnote 103) – suggest that the regulation of large companies, including very large online platforms and search engines,Footnote 104 is feasible, provided that states muster the required political will to cooperate in creating effective regulatory structures. Although such EU regulatory instruments do not focus on the IHRL responsibilities of big tech companies, some of their contents – in areas such as privacy protection, curbing hate speech and disinformation, and ensuring child safety – have clear IHRL dimensions.

The case of Meta, discussed in Section 3, illustrates that the BHR framework does have some potential for guiding the effective integration of IHRL into self-regulation instruments and grievance mechanisms operated by big tech companies. This experience is built, however, around the voluntary assumption of IHRL commitments by Meta, coupled with the voluntary subjection of its content moderation decisions to monitoring by an independent review body – the MOB. The relative success of the MOB in developing a considerable body of IHRL jurisprudence is impressive. However, it is ultimately linked to the voluntary nature of the entire enterprise, which to date remains limited in scope and unique in the big tech sector. It is also worth noting that, with respect to freedom of expression, the analogy between Meta’s content moderation decisions and state regulation of speech is rather straightforward. It remains to be seen how more complex configurations of power holding between large technology companies and individuals that have no immediate parallel in state–individual relations – such as different forms of AI-based decision making or the administration of cloud or blockchain services – can be addressed by the existing IHRL framework and/or an MOB-like process, which would mirror judicial or quasi-judicial review available at the national, regional or international level.

Indeed, one key problem of the BHR framework is that it addresses only to a limited degree the conceptual and practical adaptations required to facilitate the application to private companies of IHRL obligations, which were created with states as prototypical duty holders in mind. The focus of the UNGP under the second pillar on the responsibility of business enterprises to respect IHRL is too crude a measure of adaptation given the interdependence between negative and positive IHRL obligations, and indeed the LBI appears to be moving away from such sharp distinctions. More fine tuning of IHRL obligations will need to be introduced in the future by law-interpreting and law-applying bodies, probably on a sector-by-sector basis, taking into account any governance or quasi-governance functions fulfilled by private sector entities, including big tech companies. In any event, the BHR framework does not appear to provide a clear theoretical basis for direct application of IHRL to business enterprises.

The last part of this article (Section 4) discussed the potential of special procedures of the UN Human Rights Council to provide normative guidance and compliance monitoring to technology companies. Such a review is couched primarily in the BHR framework, but can provide an institutional context for identifying new sector-specific IHRL obligations, as well as help to shape voluntary commitments and grievance mechanisms developed and operated by large technology companies. Still, like other parts of the BHR framework, the special procedures have no binding authority over technology companies, and seem to generate only limited compliance pull through the non-judicial monitoring measures they pursue. The special procedures also have not articulated to date a theoretical foundation for the direct application of IHRL to private companies, nor have they promulgated clear guidelines for the process of adaptation of IHRL from states to business enterprises, and from offline to online environments.

The upshot of this analysis is that much normative and institutional work remains to be done in this area of IHRL in order to effectively utilise IHRL norms and institutions to address any rights-infringing practices of large technology companies. As it currently stands, IHRL, when applied to big tech companies, suffers from a series of structural constraints that hamper its ability to afford effective protection to digital users. These include excessive reliance on voluntary commitments by companies under the BHR framework; the foregrounding of some digital rights (such as freedom of expression) and the backgrounding of others (such as limits on automated decision making and control over personal data); limited access to remedies (the MOB being a glaring exception to the general practice of not establishing strong complaint mechanisms); and the inadequacy of international enforcement machinery, including the Human Rights Council special procedures, for effectively monitoring and curbing abuses by big tech companies.

In order to address these difficult constraints and challenges, we need to do more thinking outside the box, including engaging in a frank discussion about the adequacy of the box itself – IHRL – to the contents inside it – the operations of big tech companies. Such a discussion may lead to different approaches to business regulation – direct regulation, indirect regulation or self-regulation – which could perhaps protect more effectively the basic rights and interests of digital users than the current BHR-driven agenda.

Acknowledgements

The author thanks participants in a workshop on the human rights obligations of big tech companies, which took place in January 2024 at the Center for Transnational Law at King’s College London (UK), for their useful comments on an earlier draft of the article.

Financial statement

The research for this article was conducted with the support of ERC Grant No. 101054745: the Three Generations of Digital Human Rights (DigitalHRGeneration3), https://3gdr.huji.ac.il.

Competing interests

The author declares none.

References

1 For a history of IHRL see, eg, Christopher NJ Roberts, The Contentious History of the International Bill of Human Rights (Cambridge University Press 2014); Steven Wheatley, The Idea of International Human Rights Law (Oxford University Press 2019); Ed Bates, ‘History’ in Daniel Moeckli and others (eds), International Human Rights Law (4th edn, Oxford University Press 2014) 15.

2 See, eg, International Covenant on Civil and Political Rights (entered into force 23 March 1976) 999 UNTS 171 (ICCPR), art 2(1) (‘Each State Party to the present Covenant undertakes to respect and to ensure to all individuals within its territory and subject to its jurisdiction the rights recognized in the present Covenant, without distinction of any kind, such as race, colour, sex, language, religion, political or other opinion, national or social origin, property, birth or other status’); International Covenant on Economic, Social and Cultural Rights (entered into force 3 January 1976) 999 UNTS 3 (ICESCR), art 2(1) (‘Each State Party to the present Covenant undertakes to take steps, individually and through international assistance and co-operation, especially economic and technical, to the maximum of its available resources, with a view to achieving progressively the full realization of the rights recognized in the present Covenant by all appropriate means, including particularly the adoption of legislative measures’); European Convention for the Protection of Human Rights and Fundamental Freedoms (entered into force 3 September 1953) 213 UNTS 221 (ECHR), art 1 (‘The High Contracting Parties shall secure to everyone within their jurisdiction the rights and freedoms defined in Section I of this Convention’).

3 See, eg, Charter of the United Nations (entered into force 24 October 1945) 1 UNTS XVI (UN Charter), arts 55–56 (‘With a view to the creation of conditions of stability and well-being which are necessary for peaceful and friendly relations among nations based on respect for the principle of equal rights and self-determination of peoples, the United Nations shall promote: … universal respect for, and observance of, human rights and fundamental freedoms for all without distinction as to race, sex, language, or religion’; ‘All Members pledge themselves to take joint and separate action in co-operation with the Organization for the achievement of the purposes set forth in Article 55’).

4 Declaration of the Rights of Man and the Citizen (20–26 August 1789) (Preamble: ‘in order that the acts of the legislative power, and those of the executive power, since they may continually be compared with the aim of every political institution, may thereby be the more respected’; art 2: ‘The aim of every political association is the preservation of the natural and imprescriptible rights of man. These rights are liberty, property, security, and resistance to oppression’; art 3: ‘The principle of all sovereignty resides essentially in the nation. No body and no individual may exercise any authority which does not emanate expressly from it’); US Declaration of Independence (4 July 1776) (‘We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the Pursuit of Happiness. That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed’).

5 See, eg, UN Human Rights Council (HRC), ‘Human Rights Council Opens Special Session on the Human Rights Situation in Iraq’, 1 September 2014, https://www.ohchr.org/en/press-releases/2014/09/human-rights-council-opens-special-session-human-rights-situation-iraq (alluding to ISIS ‘abuses’ of IHRL).

6 See, eg, Committee on Economic, Social and Cultural Rights, General Comment No. 24 on State Obligations under the International Covenant on Economic, Social and Cultural Rights in the Context of Business Activities (10 August 2017), UN Doc E/C.12/GC/24, paras 23–24 (‘The obligation to fulfil requires States parties to take necessary steps, to the maximum of their available resources, to facilitate and promote the enjoyment of Covenant rights, and, in certain cases, to directly provide goods and services essential to such enjoyment. Discharging such duties may require the mobilization of resources by the State, including by enforcing progressive taxation schemes. It may require seeking business cooperation and support to implement the Covenant rights and comply with other human rights standards and principles. This obligation also requires directing the efforts of business entities towards the fulfilment of Covenant rights’).

7 The term ‘fit’ was used by Dworkin to describe the relationship between rules, theory and political philosophy: Ronald Dworkin, Taking Rights Seriously (Duckworth 1977) 131–34.

8 The term ‘digital rights’ has recently been used in the EU Declaration on Digital Rights and Principles for the Digital Decade [2023] OJ C 23/1 (as an elaboration of how ‘values and fundamental rights applicable offline should be applied in the digital environment’).

9 See, eg, Kalev Leetaru, ‘As the Privacy Regulators Circle Facebook Is It Already Unstoppable?’, Forbes, 26 April 2019, https://www.forbes.com/sites/kalevleetaru/2019/04/26/as-the-privacy-regulators-circle-facebook-is-it-already-unstoppable; Anu Bradford, ‘What Is at Stake if Antitrust Regulation Fails?’, Network Law Review, 6 September 2023, https://www.networklawreview.org/anu-authoritarian-governments.

10 See, eg, Daniel J Gervais, ‘The Regulation of Inchoate Technologies’ (2010) 47 Houston Law Review 665, 683–84; Catalina Goanta, ‘The Proof Is in the Digital Enforcement Pudding’, Network Law Review, 27 April 2023, https://www.networklawreview.org/digiconsumers-three.

11 See, eg, Filippo Santoni de Sio, Human Freedom in the Age of AI (Taylor and Francis 2024) 238. Note that the European Union has recently designated a number of big tech companies as ‘gatekeepers’ of market access under the Digital Markets Act: European Commission, Press Release, ‘Digital Markets Act: Commission Designates Six Gatekeepers’, 6 September 2023, https://ec.europa.eu/commission/presscorner/detail/en/ip_23_4328. Although such a designation is aimed at fostering commercial competition, it also implies certain restrictions on privacy-infringing practices. Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on Contestable and Fair Markets in the Digital Sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act) [2022] OJ L 265/1, art 5.

12 Agnès Callamard, ‘The Human Rights Obligations of Non-State Actors’ in Rikke Frank Jørgensen (ed), Human Rights in the Age of Platforms (The MIT Press 2019) 191, 215–16.

13 See, eg, Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with regard to the Processing of Personal Data and on the Free Movement of Such Data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119/1, art 20.

14 CJEU, Case C-131/12 Google Spain SL and Google Inc v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, Judgment, 13 May 2014, ECLI:EU:C:2014:317 (CJEU Grand Chamber).

15 See, eg, General Data Protection Regulation (n 13) art 22.

16 Note that similar challenges also arise, from time to time, in connection with small and medium-sized enterprises (eg, local data processing and cyber security companies). Still, their modest size is expected to allow for effective (at times, over-burdensome) government regulation of their activities and for stronger consumer push-back against business excesses.

17 ICCPR (n 2) arts 19–20.

18 Office of the High Commissioner for Human Rights, ‘Guiding Principles on Business and Human Rights’ (2011) (UNGP). The UNGP were adopted by HRC Resolution 17/4, Human Rights and Transnational Corporations and Other Business Enterprises (6 July 2011), UN Doc A/HRC/RES/17/4.

19 John Ruggie, Report of the Special Representative of the Secretary-General on the Issue of Human Rights and Transnational Corporations and Other Business Enterprises – Protect, Respect and Remedy: A Framework for Business and Human Rights (7 April 2008), UN Doc A/HRC/8/5.

20 UN Sub-Commission on the Promotion and Protection of Human Rights, Norms on the Responsibilities of Transnational Corporations and Other Business Enterprises with regard to Human Rights (26 August 2003), UN Doc E/CN.4/Sub.2/2003/12/Rev.2.

21 UNGP (n 18) Principle 11 (‘Business enterprises should respect human rights. This means that they should avoid infringing on the human rights of others and should address adverse human rights impacts with which they are involved’).

22 Ruggie (n 19) para 9. For criticism see Surya Deva, ‘Treating Human Rights Lightly: A Critique of the Consensus Rhetoric and the Language Employed by the Guiding Principles’ in Surya Deva and David Bilchitz (eds), Human Rights Obligations of Business: Beyond the Corporate Responsibility to Respect? (Cambridge University Press 2013) 78, 94.

23 For discussion see Florian Wettstein, ‘Normativity, Ethics, and the UN Guiding Principles on Business and Human Rights: A Critical Assessment’ (2015) 14 Journal of Human Rights 162, 169–77.

24 See, eg, Peter Muchlinski, ‘The Impact of the UN Guiding Principles on Business Attitudes to Observing Human Rights’ (2021) 6 Business and Human Rights Journal 212, 221–22; Sarah Joseph and Joanna Kyriakakis, ‘From Soft Law to Hard Law in Business and Human Rights and the Challenge of Corporate Power’ (2023) 36 Leiden Journal of International Law 335, 341–42.

25 See, eg, Florian Wettstein, Business and Human Rights: Ethical, Legal, and Managerial Perspectives (Cambridge University Press 2022) 202–04.

26 See, eg, Loi n° 2017-399 du 27 mars 2017 relative au devoir de vigilance des sociétés mères et des entreprises donneuses d’ordre (France), https://www.legifrance.gouv.fr/affichTexte.do?cidTexte=JORFTEXT000034290626&categorieLien=id; Modern Slavery Act 2015 (UK), s 54, https://www.legislation.gov.uk/ukpga/2015/30/contents/enacted; Wet van 24 oktober 2019 houdende de invoering van een zorgplicht ter voorkoming van de levering van goederen en diensten die met behulp van kinderarbeid tot stand zijn gekomen (Wet zorgplicht kinderarbeid) 2019 (The Netherlands) (Act of 24 October 2019 Introducing a Duty of Care to Prevent the Supply of Goods and Services Created with the Help of Child Labour) (Child Labour Duty of Care Act); Directive (EU) 2024/1760 of the European Parliament and of the Council of 13 June 2024 on Corporate Sustainability Due Diligence and amending Directive (EU) 2019/1937 and Regulation (EU) 2023/2859 [2024] PE-CONS 9/1/24 REV 1, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ:L_202401760. See also Kishanthi Parella, ‘Hard and Soft Law Preferences in Business and Human Rights’ (2020) 114 AJIL Unbound 168, 171–72; Ludovica Chiussi Curzi and Camille Malafosse, ‘A Public International Law Outlook on Business and Human Rights’ (2022) 24 International Community Law Review 11, 21–29.

27 See, eg, Vedanta v Lungowe [2019] UKSC 20; Four Nigerian Farmers and Milieudefensie v Shell, Court of Appeal of The Hague (The Netherlands), 29 January 2021, ECLI:NL:GHDHA:2021:132, English translation at https://uitspraken.rechtspraak.nl/details?id=ECLI:NL:GHDHA:2021:1825; but see Shell v Milieudefensie, Court of Appeal of The Hague (The Netherlands), 12 October 2024, ECLI:NL:GHDHA:2024:2100, English translation at https://uitspraken.rechtspraak.nl/details?id=ECLI:NL:GHDHA:2024:2100.

28 See, eg, Tara J Melish, ‘Putting “Human Rights” Back into the UN Guiding Principles on Business and Human Rights: Shifting Frames and Embedding Participation Rights’ in César Rodriguez-Garavito (ed), Business and Human Rights: Beyond the End of the Beginning (Cambridge University Press 2017) 76, 83–85.

29 The process was launched by UN HRC Res 26/9, Elaboration of an International Legally Binding Instrument on Transnational Corporations and Other Business Enterprises with respect to Human Rights (14 July 2014), UN Doc A/HRC/RES/26/9.

30 ‘Updated Draft Legally Binding Instrument (clean version) to Regulate, in International Human Rights Law, the Activities of Transnational Corporations and Other Business Enterprises’, July 2023, art 6.2 (LBI), https://www.ohchr.org/sites/default/files/documents/hrbodies/hrcouncil/igwg-transcorp/session9/igwg-9th-updated-draft-lbi-clean.pdf.

31 ibid art 4.2(d).

32 ibid art 8.1.

33 ibid art 8.4(a).

34 ibid art 6.5.

35 Wettstein (n 25) 187.

36 LBI (n 30) art 15.

37 ibid Preamble.

38 ibid art 6.5 (‘Each Party shall take necessary measures to ensure that business enterprises take appropriate steps to prevent human rights abuse by third parties where the enterprise controls, manages or supervises the third party, including through the imposition of a legal duty to prevent such abuse in appropriate cases’).

39 cf UN HRC, General Comment No. 31: The Nature of the General Legal Obligation on States Parties to the Covenant (26 May 2004), UN Doc CCPR/C/21/Rev.1/Add.13, para 8 (‘[T]he positive obligations on States Parties to ensure Covenant rights will only be fully discharged if individuals are protected by the State, not just against violations of Covenant rights by its agents, but also against acts committed by private persons or entities that would impair the enjoyment of Covenant rights in so far as they are amenable to application between private persons or entities’).

40 Facebook, ‘Terms of Service’, https://www.facebook.com/legal/terms; Instagram, ‘Terms of Use’, https://help.instagram.com/581066165581870; Threads, ‘Threads Terms of Use’, https://help.instagram.com/769983657850450.

41 Facebook, ‘Privacy Policy’, 26 June 2024, https://www.facebook.com/privacy/policy; Instagram, ‘Privacy Policy’, https://privacycenter.instagram.com/policy; Threads, ‘Threads Supplemental Privacy Policy’, 13 November 2023, https://help.instagram.com/515230437301944; Threads, ‘Threads Policies and Terms’, https://help.instagram.com/280495901606863/?helpref=hc_fnav.

42 Facebook, ‘Facebook Community Standards’, https://transparency.fb.com/en-gb/policies/community-standards; Instagram, ‘Community Guidelines’, https://help.instagram.com/477434105621119?ref=igtos&helpref=faq_content.

44 Mark Zuckerberg, ‘A Blueprint for Content Governance and Enforcement’, 15 November 2018, https://perma.cc/ZK5C-ZTSX. A revised version of the blueprint can be found at https://www.facebook.com/notes/751449002072082/?hc_location=ufi.

46 Facebook, ‘Oversight Board Bylaws’, February 2022, https://about.fb.com/wp-content/uploads/2020/01/Bylaws-Feb-2022.pdf. The first draft of the bylaws was published in 2020, https://about.fb.com/news/2020/01/facebooks-oversight-board.

48 Wettstein (n 25) 188.

49 cf Associazione di Promozione Sociale Casapound Italia v Meta Platforms Ireland Ltd, Tribunale Ordinario di Roma, 5 December 2022, 39, https://globalfreedomofexpression.columbia.edu/wp-content/uploads/2023/08/TribunaleRoma_CasaPoundvMeta_2022.pdf; for an English language summary see https://globalfreedomofexpression.columbia.edu/cases/casapound-v-meta-platforms-ireland-ltd.

50 See, eg, 2022-012-IG-MR, ‘India Sexual Harassment Video’, MOB Decision of 14 December 2022, https://oversightboard.com/decision/IG-KFLY3526 (citing Human Rights Committee, General Comment No. 34 (11–29 July 2011), UN Doc CCPR/C/GC/34).

51 See, eg, 2022-009/10-IG-UA, ‘Gender Identity and Nudity’, MOB Decision of 17 January 2023, https://oversightboard.com/decision/BUN-IH313ZHJ (citing Human Rights Committee views in Nepomnyashchiy v Russia, UN Doc CCPR/C/123/D/2318/2013 (2018) and Toonen v Australia, UN Doc CCPR/C/50/D/488/1992 (1994)).

52 See, eg, 2022-013-FB-UA, ‘Iran Protest Slogan’, MOB Decision of 9 January 2023, https://oversightboard.com/decision/FB-ZT6AJS4X (citing HRC Resolution 23/2 (24 June 2013), UN Doc A/HRC/RES/23/2).

53 See, eg, 2022-011-IG-UA, ‘Video after Nigeria Church Attack’, MOB Decision of 14 December 2022, https://oversightboard.com/decision/IG-OZNR5J1Z (citing UN Special Rapporteur for Freedom of Expression Report (6 April 2018), UN Doc A/HRC/38/35).

54 2021-006-IG-UA, ‘Öcalan’s Isolation’, MOB Decision of 8 July 2021, https://oversightboard.com/decision/IG-I9DP23IB.

55 An acknowledgement of the possibility that business enterprises might be more committed than certain states to complying with IHRL is found, however, in UNGP (n 18) commentary on para 23 (‘Where the domestic context renders it impossible to meet this responsibility fully, business enterprises are expected to respect the principles of internationally recognized human rights to the greatest extent possible in the circumstances, and to be able to demonstrate their efforts in this regard’).

56 See, eg, Alia Al Ghussain, ‘Meta’s Human Rights Report Ignores the Real Threat the Company Poses to Human Rights Worldwide’, Amnesty International, 22 July 2022, https://www.amnesty.org/en/latest/campaigns/2022/07/metas-human-rights-report-ignores-the-real-threat-the-company-poses-to-human-rights-worldwide; Paul M Barrett, ‘Meta’s Oversight Board and the Need for a New Theory of Online Speech’, Lawfare, 9 November 2023, https://www.lawfaremedia.org/article/meta-s-oversight-board-and-the-need-for-a-new-theory-of-online-speech.

57 See Oversight Board, ‘Securing Ongoing Funding for the Oversight Board’, 22 July 2022, https://www.oversightboard.com/news/1111826643064185-securing-ongoing-funding-for-the-oversight-board.

58 See, eg, Kate Irwin, ‘Meta’s Oversight Board Confirms Layoffs Are Coming’, MSN News, 29 April 2024, https://www.msn.com/en-us/news/other/meta-s-oversight-board-confirms-layoffs-are-coming/ar-AA1nSyNf?ocid=BingNewsSearch.

59 See Meta, ‘Oversight Board Recommendations’, updated 12 July 2024, https://transparency.meta.com/en-gb/oversight/oversight-board-recommendations.

60 See Karissa Bell, ‘Facebook Wants “Other Companies” to Use the Oversight Board, Too’, Engadget, 17 May 2021, https://www.engadget.com/facebook-oversight-board-other-companies-202448589.html.

61 See, eg, X, ‘Hateful Conduct’, X Help Center, April 2023, https://help.twitter.com/en/rules-and-policies/hateful-conduct-policy (‘Free expression is a human right – we believe that everyone has a voice, and the right to use it … We recognize that if people experience abuse on X, it can jeopardize their ability to express themselves … We are committed to combating abuse motivated by hatred, prejudice or intolerance, particularly abuse that seeks to silence the voices of those who have been historically marginalized. For this reason, we prohibit behavior that targets individuals or groups with abuse based on their perceived membership in a protected category’).

62 See, eg, X, ‘Private Content’, X Help Center, March 2024, https://help.twitter.com/en/rules-and-policies/personal-information; OpenAI, ‘Usage Policies’, 10 January 2024, https://openai.com/policies/usage-policies; Google AI, ‘Our Principles’, https://ai.google/responsibility/principles (‘we will not design or deploy AI in the following application areas: … Technologies that gather or use information for surveillance violating internationally accepted norms’).

63 Apple, ‘Third Party Code of Conduct’, February 2022, https://www.apple.com/compliance/pdfs/third-party-code.pdf (‘Third parties may not discriminate against any worker based on race, color, age, gender, sexual orientation, ethnicity, disability, pregnancy, religion, political affiliation, union membership, national origin, or marital status in hiring and employment practices such as applications for employment, promotions, rewards, access to training, job assignments, wages, benefits, discipline, termination, and retirement. In addition, third parties may not require workers or potential workers to undergo medical tests that could be used in a discriminatory way, except where required by applicable law or regulation or prudent for workplace safety’).

65 TikTok, ‘Youth Safety and Well-Being’, 17 April 2024, https://www.tiktok.com/community-guidelines/en/youth-safety; Reddit, ‘Reddit Content Policy’, https://www.redditinc.com/policies/content-policy (‘Do not share or encourage the sharing of sexual, abusive, or suggestive content involving minors. Any predatory or inappropriate behavior involving a minor is also strictly prohibited’).

66 Microsoft, ‘Standards of Business Conduct: Integrity in Everything We Do’, 14 July 2014, https://www.caseiq.com/wp-content/uploads/2016/01/Microsoft-Standards-of-Business-Conduct-EN-US.pdf (‘We comply with the laws and regulations that govern the rights to and protection of our own and others’ intellectual property including copyrights, patents and trade secrets’).

67 See, eg, Amazon, ‘Amazon Global Human Rights Principles’, https://sustainability.aboutamazon.com/human-rights/principles#:~:text=We%20are%20committed%20to%20ensuring,way%20that%20respects%20human%20rights (‘We are committed to ensuring the people, workers, and communities that support our entire value chain are treated with fundamental dignity and respect. We strive to ensure that the products and services we provide are produced in a way that respects human rights’); Apple, ‘Our Commitment to Human Rights’, https://s2.q4cdn.com/470004039/files/doc_downloads/gov_docs/2020/Apple-Human-Rights-Policy.pdf (‘We’re deeply committed to respecting internationally recognized human rights in our business operations, as set out in the [United Nations International Bill of Human Rights] and the International Labour Organization’s Declaration on Fundamental Principles and Rights at Work. Our approach is based on the UN Guiding Principles on Business and Human Rights. Everywhere we operate, we seek to conduct business in compliance with applicable laws and in accordance with our commitment to respect internationally recognized human rights. When faced with conflicting requirements, in keeping with the UN Guiding Principles, we seek to comply with applicable law and also seek ways to honor our commitment to respect principles of internationally recognized human rights … We conduct robust human rights due diligence to identify salient human rights risks’); TikTok, ‘Upholding Human Rights’, https://www.tiktok.com/transparency/en-us/upholding-human-rights; Intel, ‘Intel Global Human Rights Principles and Approach’, updated December 2023, https://www.intel.com/content/www/us/en/policy/policy-human-rights.html; Oracle, ‘Oracle Human Rights Statement’, December 2020, https://www.oracle.com/assets/human-rights-statement-3208823.pdf.

68 See, eg, Graeme Massie, ‘Elon Musk Fires Twitter’s Human Rights Team as Part of Sweeping Layoffs at Platform’, Independent, 4 November 2022, https://www.independent.co.uk/tech/elon-musk-twitter-employees-layoffs-b2218097.html; TikTok, ‘Community Guidelines Enforcement Report’, December 2023, https://www.tiktok.com/transparency/en-us/community-guidelines-enforcement-2023-3 (‘More than 40,000 trust and safety professionals work alongside innovative technology to maintain and enforce our robust Community Guidelines, Terms of Service and Advertising Policies, which apply to all content on our platform’); Intel, ‘Intel Integrity Line: Ethics and Compliance Reporting Portal’, https://secure.ethicspoint.com/domain/media/en/gui/31244/index.html.

69 See, eg, Steve Crown, Vice President and Deputy General Counsel, Human Rights, ‘Taking on Human Rights Due Diligence’, 20 October 2021, https://blogs.microsoft.com/on-the-issues/2021/10/20/taking-on-human-rights-due-diligence/#:~:text=Respecting%20human%20rights%20is%20a,use%20in%20real%2Dworld%20deployments; Apple, ‘2022 Statement on Efforts to Combat Modern Slavery in Our Business and Supply Chains’, https://www.apple.com/legal/more-resources/Apple-Combat-Human-Trafficking-and-Slavery-in-Supply-Chain-2022.pdf; NVIDIA, ‘Human Rights Policy’, updated 7 July 2022, https://www.nvidia.com/content/dam/en-zz/Solutions/about-us/documents/HumanRightsPolicy.pdf. For a discussion of the application of due diligence obligations to the AI sector and to digital supply chains see Marco Fasciglione, ‘Business and Human Rights in the Age of Artificial Intelligence’ (2022) 2 Federalismi.it 164, 176–81; Christine Kaufmann, ‘Responsible Business in a Digital World – What’s International Law Got to Do With It?’ (2021) 81 Zeitschrift für ausländisches öffentliches Recht und Völkerrecht (Heidelberg Journal of International Law) 781, 804–07.

70 See, eg, Reddit, ‘Transparency Report: January to June 2023’, https://www.redditinc.com/policies/2023-h1-transparency-report; Intel, ‘2022–23 Corporate Responsibility Report’, https://csrreportbuilder.intel.com/pdfbuilder/pdfs/CSR-2022-23-Full-Report.pdf.

71 See, eg, Intel (n 68); NVIDIA, ‘How to Report a Violation of Our Code’, https://secure.ethicspoint.com/domain/media/en/gui/25599/index.html.

72 See, generally, Elvira Domínguez-Redondo, In Defense of Politicization of Human Rights: The UN Special Procedures (Oxford University Press 2020) 39–43; UN HRC Res 5/2, Code of Conduct for Special Procedures Mandate-holders of the Human Rights Council (18 June 2007), UN Doc A/HRC/RES/5/2, art 6(c).

73 David Kaye, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (22 May 2015), UN Doc A/HRC/29/32, paras 27–28, 62 (emphasis added).

74 David Kaye, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (11 May 2016), UN Doc A/HRC/32/38, para 90 (emphasis added).

75 David Kaye, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (30 March 2017), UN Doc A/HRC/35/22, paras 82–83 (emphasis added).

76 David Kaye, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (6 April 2018), UN Doc A/HRC/38/35, para 70.

77 ibid para 71.

78 David Kaye, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (9 October 2019), UN Doc A/74/486, para 58(b) (‘[all companies in the ICT sector should:] Adopt content policies that tie their hate speech rules directly to international human rights law, indicating that the rules will be enforced according to the standards of international human rights law, including the relevant United Nations treaties and interpretations of the treaty bodies and special procedure mandate holders and other experts, including the Rabat Plan of Action’).

79 David Kaye, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (28 May 2019), UN Doc A/HRC/41/35, para 69.

80 ibid para 67.

81 Irene Khan, Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression (13 April 2021), UN Doc A/HRC/47/25, para 101.

82 ibid para 65.

83 See, eg, Joseph A Cannataci, Report of the Special Rapporteur on the Right to Privacy (16 October 2019), UN Doc A/HRC/40/63, paras 89–98, 106; Joseph A Cannataci, Report of the Special Rapporteur on the Right to Privacy (23 July 2021), UN Doc A/76/220, para 114(c)–(d); Philip Alston, Report of the Special Rapporteur on Extreme Poverty and Human Rights (11 October 2019), UN Doc A/74/493, paras 72–74.

84 See Kaye (n 76). The Special Rapporteur also obtained one written submission from a technology company ahead of the report, https://www.ohchr.org/sites/default/files/Documents/Issues/Opinion/ContentRegulation/Github.pdf.

85 UN Working Group on Business and Human Rights, Statement at the End of Visit to the United States, Washington D.C., 1 May 2013 (2 May 2013), https://www.ohchr.org/en/statements/2013/05/statement-end-visit-united-states-un-working-group-business-and-human-rights?LangID=E&NewsID=13284. The Working Group has conducted visits to other industrialised countries where it reviewed concerns about the IHRL practices of companies, including technology companies, and issued recommendations to such corporations; see, eg, UN HRC, Report of the Working Group on the Issue of Human Rights and Transnational Corporations and Other Business Enterprises on Its Visit to the Republic of Korea (1 May 2017), UN Doc A/HRC/35/32/Add.1, paras 80–90.

86 See Tomoya Obokata, Report of the Special Rapporteur on Contemporary Forms of Slavery, including Its Causes and Consequences (16 July 2020), UN Doc A/75/166, para 11 (the visit was cancelled for COVID-19 related reasons).

87 David Kaye, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to Apple Inc. (4 August 2017), OL OTH 16/2017, https://www.ohchr.org/sites/default/files/Documents/Issues/Opinion/Legislation/OLOTH.pdf.

88 Business & Human Rights Resource Centre, UN Special Rapporteur on Freedom of Expression Calls for Oversight Board’s Review Standards to Integrate Human Rights Law, 1 May 2019, https://www.business-humanrights.org/en/latest-news/un-special-rapporteur-on-freedom-of-expression-calls-for-oversight-boards-review-standards-to-integrate-human-rights-law.

89 Irene Khan, Public Comment by UN Special Rapporteur on Freedom of Opinion and Expression Irene Khan on Facebook Oversight Board Case No. 2021-009, ‘Uneven Content Moderation in the Middle East’ (9 September 2021), https://www.ohchr.org/sites/default/files/Documents/Issues/Opinion/Legislation/Case_2021_009-FB-UA.pdf.

90 Irene Khan, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to Oversight Board LLC (5 July 2023), OL OTH 90/2023, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=28221.

91 David Kaye, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to NSO Group (20 February 2020), OL OTH 2/2020, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=25079; David Kaye, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to NSO Group (18 October 2019), OL OTH 52/2019, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=24905.

92 David Kaye, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, and Githu Muigai, Chair-Rapporteur of the Working Group on the Issue of Human Rights and Transnational Corporations and Other Business Enterprises, Letter to TikTok (15 May 2020), OL OTH 37/2020, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=25243.

93 See, eg, Göran Marby, President and Chief Executive Officer, Internet Corporation for Assigned Names and Numbers (ICANN), ‘RE: Joint Communication from Special Procedures’, 7 March 2020, https://spcommreports.ohchr.org/TMResultsBase/DownLoadFile?gId=35242; Shalev Hulio, Chief Executive Officer for NSO Group Technologies, ‘RE: NSO Human Rights and Whistleblower Policies Response to February 20, 2020 Letter’, 1 June 2020, https://spcommreports.ohchr.org/TMResultsBase/DownLoadFile?gId=35326.

94 Irene Khan, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to Google LLC (24 March 2023), JAL OTH 19/2023, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=27926. The letter was also sent in the name of the Special Rapporteur on the Situation of Human Rights Defenders and the Working Group on Arbitrary Detention.

95 Irene Khan, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to TikTok (13 March 2023), JAL OTH 10/2023, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=27882; Irene Khan, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to Omegle LCC (13 March 2023), JAL OTH 11/2023, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=27883. The letters were also sent in the name of the Special Rapporteur on the sale and sexual exploitation of children, including child prostitution, child pornography and other child sexual abuse material; the Working Group on the issue of human rights and transnational corporations and other business enterprises; the Special Rapporteur on the right to education; the Special Rapporteur on contemporary forms of slavery, including its causes and consequences; the Special Rapporteur on trafficking in persons, especially women and children; the Special Rapporteur on violence against women and girls, its causes and consequences; and the Working Group on discrimination against women and girls.

96 Irene Khan, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to Telegram Messenger LLP (9 March 2023), JAL OTH 12/2023, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=27891. The letter was also sent in the name of the Special Rapporteur on the situation of human rights in Myanmar; the Working Group on the issue of human rights and transnational corporations and other business enterprises; the Special Rapporteur on the rights to freedom of peaceful assembly and of association; the Special Rapporteur on the situation of human rights defenders; the Special Rapporteur on the right to privacy; the Special Rapporteur on violence against women and girls, its causes and consequences; and the Working Group on discrimination against women and girls.

97 Irene Khan, Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Letter to NSO Group Technologies (3 July 2023), JAL OTH 62/2023, https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=28135. The letter was also sent in the name of the Special Rapporteur on the situation of human rights defenders; the Working Group on the issue of human rights and transnational corporations and other business enterprises; the Special Rapporteur on the rights to freedom of peaceful assembly and of association; and the Special Rapporteur on the right to privacy.

98 The list of communications for the mandate is found at https://spcommreports.ohchr.org/TmSearch/Mandates?m=24. The Working Group on Business and Human Rights has also addressed a fair number of communications against technology companies, including Apple, Google and 4Sale, see https://spcommreports.ohchr.org/TmSearch/Mandates?m=45.

99 See, eg, UN HRC, Communications Report of Special Procedures (13 September 2021), UN Doc A/HRC/48/3, 12–17 (listing, eg, replies by TikTok and Nintendo and no replies from Apple and Amazon).

100 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) [2022] OJ L 277/1.

101 Digital Markets Act (n 11).

102 Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down Harmonised Rules on Artificial Intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act) [2024] OJ L 2024/1689, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ:L_202401689.

103 General Data Protection Regulation (n 13).

104 See, eg, Digital Services Act (n 100) s 5.