
Platform power and regulatory capture in digital governance

Published online by Cambridge University Press:  28 October 2024

Katharina Kausche
Affiliation:
LMU Munich, München, Germany
Moritz Weiss*
Affiliation:
LMU Munich, München, Germany
Corresponding author: Moritz Weiss; Email: moritz.weiss@gsi.lmu.de

Abstract

Digital governance is a public concern, yet under private control. After numerous scandals, all stakeholders in the European Union (EU) agreed to establish a “novel constitution for the internet” that would effectively constrain the power of large platforms. Yet the Digital Services Act (DSA) ultimately legitimized and institutionalized their position as the gatekeepers of the internet. Why? We argue that platforms prevailed thanks to their ability as intermediaries to quietly shape the available policy options. Our “platform power mechanism” combines institutional and ideational sources of business power to show how big tech drew on its entrenched position as an indispensable provider of essential services and promulgated the idea of itself as a responsible and neutral intermediary. We follow the unfolding of platform power through a process-tracing analysis of Google and Meta’s activities with respect to DSA legislation from its announcement (2020) to its adoption (2022). Besides contributing a reconceptualization of the DSA as an instance of regulatory capture, we integrate the notion of platform power into a “regulator–intermediary–target” model and demonstrate how gatekeepers have exploited information asymmetries to share “the public space.” Our analysis thus supplements established approaches that have derived regulators’ deference to platforms from the tacit allegiance of consumers.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of Vinod K. Aggarwal

Introduction

The digital revolution has made internet regulation an increasingly urgent political task, as more and more scandals caused by failed data protection have come to light over the past decade. While the Snowden revelations were still fresh in mind, the next scandal erupted in 2018, when the British consulting firm Cambridge Analytica was revealed to have systematically collected personal data belonging to millions of Facebook users without their consent and employed them for political advertising in the 2016 US presidential elections. In addition to these data protection scandals, the global spread of disinformation and hate speech has increasingly been regarded as a major threat to democratic governance,Footnote 1 with some scholars even warning of a “weaponization of social media in a so-called LikeWar.”Footnote 2 This growing political demand has pressed regulators worldwide to reform the existing institutional arrangements governing the exchange of data and the moderation of content on the internet.

The European Union (EU) appeared to be the most formidable candidate to supply tightened internet regulation in general and to trim the powerful position of large platforms in particular. Recent policy research found that the EU has comprehensively shifted towards stricter control of the private sector across several digital policy areas.Footnote 3 Also, prominent legal scholars stressed the strength of Europe’s “rights-driven regulatory model,” which needs to be differentiated from the United States’ and China’s by virtue of its protection of individual users against both platform power and state interference.Footnote 4

Beyond scholars, practitioners and regulators emphasized the success and impact of the EU’s existing General Data Protection Regulation (GDPR); and upon her first appointment as EU Commission President in 2019, Ursula von der Leyen designated digital governance as one of her priorities,Footnote 5 and Commissioner Vestager declared a tougher stance vis-à-vis big tech.Footnote 6 Likewise, powerful member states such as Germany and France joined in and vehemently called for stricter rules on data protection and content moderation, supported, not surprisingly, by civil rights organizations such as the European Digital Rights (EDRi) network.Footnote 7 Finally, even the platforms themselves supported efforts to reform the existing self-regulation. For instance, Meta’s Mark Zuckerberg voiced the belief that “we need a more active role for governments and regulators,”Footnote 8 and the Wall Street Journal eventually published a report under the headline “Big Tech Braces for a Wave of Regulation.”Footnote 9 As a result, this new Brussels consensus among all stakeholders made the EU the most likely regulator to meet the growing demand for strict internet regulation that would protect the rights of individual users and robustly constrain the leverage of large platforms.

This overwhelming push to supply tight regulation formed the point of departure when EU regulators updated their relationships with both the intermediaries (i.e., large platforms) and the targets (i.e., individual users) at the end of 2019. The regulatory status quo was defined by the more than 20-year-old e-commerce directive,Footnote 10 and the EU’s proclaimed objective was nothing less than a new “constitution” for the internet.Footnote 11 Yet the actual outcome, the EU’s Digital Services Act (DSA) of 2022, hardly lived up to this ambition. If it was a constitution, it was a “captured”Footnote 12 one, as its implementation and enforcement will undoubtedly entail an “uphill battle to rein in big tech.”Footnote 13

We draw on the regulator–intermediaries–target (RIT) modelFootnote 14 to conceptualize the institutional design that follows from the DSA reforms. Although the EU as regulator has, admittedly, supplemented the existing self-regulation of intermediaries (i.e., platforms), their autonomy and thus their power position has not been weakened but rather reinforced vis-à-vis both the regulator and the target (i.e., individual users). Platforms now not only share rule-setting with the regulator but also act as the main enforcers of the digital space. Despite some new but vague regulatory constraints on large platforms, the DSA has institutionalized intermediaries’ indispensable, responsible, and neutral position in internet regulation in Europe and beyond. The question thus arises of why the combination of an overwhelming demand and a straightforward supply option did not result in strict regulation of large internet platforms: Why have the intermediaries ultimately prevailed over the regulator?

We argue that platform power enabled intermediaries to capture institutional reforms. The powerful position of large data platforms has allowed them to quietly shape the availability of policy options, defend their autonomy, and advocate their interests. Our “platform power mechanism” demonstrates—both theoretically and empirically—how big tech drew on its entrenched position as an indispensable provider of essential services in order to guarantee platforms’ autonomy from future regulatory interference. Most crucially, platforms succeeded in removing the idea of “liability” from the reform agenda and replacing it with the notion of responsibility, which ultimately discharged them from legal obligations. They also advanced an ideational strategy of presenting themselves as neutral intermediaries that promote merely technical solutions and help to provide public goods such as freedom and innovation. We forecast that their interests will prevail further, as the DSA has not only legitimized their former behavior but also institutionalized their position for the future. Having reinforced their entrenched position, Meta and Google, for instance, will necessarily be involved in any “quiet”Footnote 15 attempt to reform the rules and implementation of internet regulation in the future. This paper demonstrates the unfolding of platform power through a process-tracing analysisFootnote 16 of the DSA’s legislative process from its announcement by the Commission in January 2020 to its adoption in the Council in October 2022. Our explanatory strategy focuses on two of the main players, Meta and Google, which has allowed us to reduce the corpus of analyzed documents to 214 (esp. documents from legislation, internal lobby meetings, trilogues, and the media).

This paper makes two main contributions. First, we critically question the widespread—and insistent—emphasis on the EU’s alleged public interventionism in internet regulation.Footnote 17 Our investigation suggests, by contrast, that, rather than being tamed, large platforms ultimately received what they initially preferred. Most significantly, they were able to preserve their autonomy as intermediaries by securing their own “terms of use” as the most relevant rules for individual users, the ultimate target of regulation. By drawing on the RIT model as our conceptualization of the DSA’s new institutional design (i.e., our explanandum), we highlight the soft nature of the ex post control instruments that will shape platforms’ future opportunities to moderate internet content. What users ultimately glimpse online remains under the control of private platforms rather than public regulators. The platforms’ entrenched position has been reinforced and legitimized, and their powers of enforcement have even been institutionalized with respect to the moderation of digital content. Unlike conventional vested interests, which stand somewhat outside the political decision-making system,Footnote 18 the RIT model sheds light on the fact that large platforms will now “share the public space”Footnote 19 with regulators and targets. This ultimately allows them to exert significant influence on the space’s “rules of the game.” Although their entrenched position and ultimately their very business model were seriously threatened at the outset of the DSA’s legislative process in late 2019, they prevailed and regained the driver’s seat in regulating the moderation of digital content.

Second, our explanation contributes not only to conceptualizing platform powerFootnote 20 but also to investigating it empirically as it unfolds within an RIT setting. We integrate two often separated sources of platform power—the pre-existing self-regulatory institutions and the importance of ideas—into a generalizable mechanism to explain how regulators, via intermediaries, make collectively binding decisions for the digital domain. This allows us to trace how entrenched gatekeepers exploit ideational asymmetries and leverage their power position into desired outcomes.Footnote 21 Our approach helps to explain how these private firms influence the policy agenda by tracing their lobbying activities, mostly in quiet politics; that is, platforms’ influence partly depends on the extent of an issue’s salience within the policy process. “The more the public cares about an issue, the less managerial organizations will be able to exercise disproportionate influence over the rules governing that issue.”Footnote 22 Given that the two-year regulatory design process was relatively quiet, our “backdoor perspective” complements scholarship on platform power, which primarily derives regulators’ deference to platforms from the “tacit allegiance of consumers.”Footnote 23 Moreover, empirical studies of the DSA found that platforms had hardly applied these indirect strategies, drawing on direct forms of policy influence instead.Footnote 24 Our mechanism therefore supplements this seminal line of theorizing platform power through consumers by investigating platforms’ more direct forms of shaping regulation on their terms. This makes it theoretically applicable to policy-making processes in which public regulators need to interact with platforms whose preferences diverge from their own—be it in the EU or beyond.

The paper proceeds as follows. First, we draw on the RIT model to conceptualize and empirically map the regulatory outcome, namely the EU’s Digital Services Act. Second, we introduce our theoretical framework that shows how platform power ties in with the RIT model to derive an explanatory mechanism focusing on the interactions between the regulator and intermediaries in the DSA. Third, we set out our methodology and present the findings of our process-tracing analysis, which showcase how Google and Meta constrained institutional reforms and pushed the available policy options closer to their own interests. Finally, we discuss the implications of our paper as well as the future of the EU’s regulation of digital services.

The RIT Model and the EU’s reform of internet regulation

The institutional design of internet regulation in the EU used to be based on the 20-year-old e-commerce directive.Footnote 25 Yet, given that “Internet sites often serve as platforms for disinformation, bullying, hatred, and repulsive content, undermining the safety and dignity of individuals while dividing societies and destabilizing democracies,”Footnote 26 EU policy-makers were called upon to reform these mainly self-regulatory arrangements and to design safeguards against those threats.Footnote 27

The regulator (R), the intermediaries (I), and the target (T)

The recent institutional reform (i.e. the DSA)—jointly with the Digital Markets Act—seeks “to create a safer digital space in which the fundamental rights of all users of digital services are protected.”Footnote 28 In accordance with these objectives, the EU as a regulator makes rules; individual users as targets take rules. Yet, the DSA also “foresees an important role for private entities both when it comes to the further elaboration of the regulatory framework applicable to intermediary service providers and to its enforcement.”Footnote 29 Given the prominent position of large platforms such as Google and Meta,Footnote 30 this form of regulation “operates indirectly via chains of intermediation” so that “regulators and targets can expand their capacities by selecting, [and] engaging intermediaries.”Footnote 31 These intermediaries perform specific functions and relevant roles as they possess and operate capacities that regulators normally lack or could only provide at higher cost.

We draw on this RIT modelFootnote 32 —the chain from the regulator (R) via intermediaries (I) towards a target (T)—to conceptualize the DSA as an outcome of the EU’s institutional reform of internet regulation. The power position of intermediaries (i.e., large platforms) depends on the extent of political intervention. The more control a regulator exercises over intermediaries, the less autonomy intermediaries have to act and the less power they have to impose their preferences on outcomes. While “power works in various forms and has various expressions that cannot be captured by a single formulation,”Footnote 33 we will draw on the working definition: A causes B (i) to do or (ii) to know something that B otherwise would not.Footnote 34 We distinguish between three conceivable designs of our RIT model and thus three distinct power positions of intermediaries.Footnote 35

First, we start out from an RIT design characterized by the near absence of direct control and involvement by the regulator (i.e., the EU), one in which targets (i.e., individual users) operate mostly under rules set by intermediaries (i.e., large platforms). Such forms of self-regulation are normally initiated by private actors and characterized by voluntary and non-binding engagement. In such an RIT design, intermediaries perform an extremely powerful and thus influential role almost without any interference by the regulator. Yet the political level is not fully excluded. Policy-makers and regulators often informally encourage private actors to take action, a setup that has sometimes been described as “regulation by raised eyebrow.”Footnote 36

Second, the regulator may also share responsibility for rule setting and enforcement with intermediaries,Footnote 37 though the specific balance between them may vary from case to case.Footnote 38 For instance, one might think of public regulators setting the general rules, while intermediaries oversee the operational dimensions of implementation and monitoring.Footnote 39 The intermediaries’ power position is more constrained by the regulator’s control than in the first instance, but there are still numerous opportunities for them to impose their interests on both regulators and targets. Given its inherent vagueness, this design suggests a relatively unstable equilibrium, as it will arguably tilt the balance more towards those in charge of implementing the rules than those setting them.

Third, the regulator may also prevent intermediaries as far as possible from engaging with the targets. For instance, statutory regulation largely implies a logic of command and control and thus hierarchical direction, even though intermediaries are often involved.Footnote 40 Despite some flexibility within this type, the common ground is that “the government sets the regulations and enforces them.”Footnote 41 This results in maximal control by the regulator and thus minimal autonomy as well as a weak power position for the intermediaries.

The RIT design of the DSA

Against the backdrop of these three conceivable RIT designs, this step explores the EU’s recent institutional reforms to internet regulation in the DSA. How does the EU, as regulator, interact with large platforms, as intermediaries, in both rule setting and rule enforcement? We start out from the regulator and then turn to the intermediaries.

First, the EU as a regulator has set several standards in the DSA that are binding on private actors.Footnote 42 However, these hierarchically set standards are mostly prescriptive about what is to be achieved rather than about how precisely it is to be achieved.Footnote 43 As a result, intermediaries are granted the right to draft many of the standards that define how the regulatory objectives of the DSA are to be met. Take, for instance, the supposedly far-reaching rule prohibiting advertising targeted at minors.Footnote 44 At first glance, this is highly interventionist, as it directly controls how platforms address different targets. It may even challenge the very business models of intermediaries such as Google and Meta.Footnote 45 Yet, given that the ban only applies if platforms know “with reasonable certainty” (Art. 28)Footnote 46 that a user is under age, it may turn out to be hollow. The reason is that the DSA does not even oblige platforms to gather information about this characteristic.Footnote 47 The general and merely prescriptive nature of rules of this kind thus grants intermediaries substantial discretion in how to implement the standards set by the EU. The devil is ultimately in the detail.

In a similar vein, the Commission is, at first glance, designated as the key enforcer of the DSA. Because the DSA is an asymmetric regulation, some specific elements apply only to designated intermediaries. This is the case, for instance, for the systemic risk assessment and risk mitigation measures, which apply only to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) and which are primarily overseen and enforced by the Commission. Jointly with the so-called Digital Services Coordinators (who are appointed by the member states), the Commission is entitled to conduct on-site inspections and interviews and to request data from platforms,Footnote 48 on whom it can impose extensive fines of up to 6 percent of their worldwide turnover.Footnote 49 The Commission is also designated to supervise VLOPs and VLOSEs, that is, platforms with more than 45 million users per month in the EU. However, assessing their compliance is left to independent auditors, which—under the DSA—are supposed to define and operationalize the auditing criteria themselves. Several parties have therefore noted that such a design might lead to “audit-washing.”Footnote 50 Furthermore, “the mere expectation to provide auditing services to the same provider in the future might influence the auditor’s objectivity.”Footnote 51 In addition, legacies of prior arrangements, such as non-binding codes of conduct, continue to play an important role in key areas, such as targeted advertising or systemic risk assessment (Arts. 35–37). They have been designed by the intermediaries themselves. Here, the Commission, as regulator, is referred to “in weak terms,” through such expressions as “facilitate,” “invite,” and “aim to ensure.”Footnote 52
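To convey the scale of this sanctioning power, a back-of-the-envelope illustration may help; the turnover figure is hypothetical and chosen only to fix orders of magnitude:

$$\text{maximum fine} \;=\; 0.06 \times \underbrace{\text{€}100\ \text{billion}}_{\text{assumed annual worldwide turnover}} \;=\; \text{€}6\ \text{billion}.$$

By comparison, the supervision fee discussed below, capped at 0.05 percent of turnover, would amount to at most $0.0005 \times \text{€}100\ \text{billion} = \text{€}50\ \text{million}$ for the same hypothetical platform, that is, less than a hundredth of the maximum fine.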

Second, large platforms, as intermediaries, have preserved autonomy and control over their business models thanks to the DSA. They remain largely immune from extensive regulatory intervention. Most significantly, Art. 6 provides that “the service provider shall not be liable for the information stored at the request of a recipient of the service,”Footnote 53 but providers are encouraged to take voluntary measures (e.g., monitoring technology, indexing content) to prevent hate speech and fake news from spreading. As long as they act in good faith and with due diligence (Art. 7), they are relatively free as regards achieving the general objectives of the DSA. For example, if providers find harmful content and delete it, they have to provide a transparent and accessible complaint mechanism to enable the decision to be questioned. Buri and van HobokenFootnote 54 have stressed that the responsibility for pursuing this lies principally with the platforms themselves. It is based on their terms of use, rather than on specified policies set out in the DSA—for instance, “through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress [Art. 17 (3) (f)].”Footnote 55 As a result, there is a “tendency to outsource primary decisions on fundamental rights and speech governance to platforms.”Footnote 56 In other words, intermediaries have sweeping discretion over how individuals (i.e., targets) exercise their rights and obligations in digital space.

In a similar vein, intermediaries are relatively autonomous in implementing the enforcement of the DSA.Footnote 57 This becomes particularly clear with regard to the asymmetric approach of the DSA. So-called Very Large Online Platforms and Very Large Online Search Engines mainly monitor themselves (i.e., their compliance with the DSA and with (self-set) codes of conduct). They are directly supervised by the Commission and regularly provide reports, risk assessments, and mitigations (Arts. 34–35). Their supervision is “relatively light touch.”Footnote 58 While the platforms are obliged to evaluate the systemic risk stemming from their services, they do this via self-assessment and impose mitigation measures on that basis. This allows them to design mitigation in accordance with their own terms of service as well as already established internal practices.

These procedures then serve as the foundation for the European Board for Digital Services and the Commission to further identify “best practices,” which may ultimately be translated into general guidelines (Art. 27). The DSA also designates an important role for civil rights organizations by integrating their experts into consultations and recommendations.Footnote 59 Yet it remains vague what the future role of these experts will be with regard to drafting guidelines and standards (e.g., regular round tables).Footnote 60 Moreover, the auditor that conducts the annual audit is chosen by the intermediaries rather than by the regulator. While the Commission retains the right to impose fines, it must first establish that intermediaries have failed to comply with the obligations imposed by way of risk assessments and audits.Footnote 61 Due to its scarce financial resources for adequate enforcement, the Commission aims to impose a supervision fee (up to 0.05 percent of annual turnover) on the VLOPs and VLOSEs.Footnote 62 Given that it is seeking to enforce the DSA with approximately “123 full-time employees” while “Meta and TikTok … each had more than 1,000 people working on DSA implementation at the time,” it will remain an “uphill battle”Footnote 63 at best. Figure 1 summarizes the findings of our analysis:

Figure 1. The regulator–intermediaries–target model and the European Union’s internet regulation.

In sum, the RIT model helps to conceptualize what is at stake in internet regulation. We have introduced three baseline RIT designs that set out how the intermediaries’ power position may be institutionalized in setting and enforcing the rules governing digital space. They range from essential autonomy in self-regulatory arrangements, via a sharing of tasks with the regulator, up to the virtual exclusion of intermediaries in statutory regulatory design. Contrary to the initial expectations of 2019, the so-called Brussels consensus on a strong demand for tight controls on intermediaries has not materialized into institutional reforms. Instead, intermediaries’ autonomy and thus their power position has been institutionalized vis-à-vis both the EU as regulator and individual users as target. To some extent, it has even been strengthened, since their role as co-regulator and main enforcer of digital space has been formally legitimized. Two characteristics of the DSA’s design are of particular importance. First, the idea of liability—that is, who is ultimately in charge of illegal and harmful content—was strongly contested, as this rule-setting directly touches upon the degree of ex post control between the regulator and the intermediary. Second, the idea of specifying technological instruments—that is, who is in charge of future implementation—was another contested issue, as it speaks to the future response to technological innovation and thus also defines the relationship between the regulator and the intermediary. Had both ideas prevailed, the result would have been tight control of platforms. As it stands today, however, the DSA’s design has clearly fallen short of these initial ambitions. Platforms will largely continue to dominate the day-to-day practices of internet regulation and content moderation in the EU and beyond.Footnote 64

Theorizing the platform power of intermediaries in the RIT model

This section theorizes how intermediaries shape institutional reforms by employing platform power within the legislative process, which will then serve as the basis for specifying a causal mechanism to explain the RIT design of the EU’s DSA.

Intermediaries in institutional reforms

How can an intermediary that has been formally excluded from the legislative reform process (i.e., through lack of veto) nevertheless shape the outcome? Our starting point in tackling this question is “how ideas and institutions limit the range of possible solutions that policymakers are likely to consider when trying to resolve policy problems.”Footnote 65 The premise is that any institutional reform like the DSA involves distributional conflicts in general, and a potential challenge to vested interests of intermediaries in particular.Footnote 66 All intermediaries are thus expected to employ their power to move the outcome of institutional reform in a preferred direction.

Unlike conventional business firms, however, platforms can draw on an additional asset, namely the support of consumers (i.e., regulatory targets), which they can combine with their scale advantages as classic monopolies. Public decision-makers anticipate the “political fallout to which overeager regulators would expose themselves by messing with the infrastructure of people’s lives.”Footnote 67 This reluctance to antagonize consumers gives rise to a permissive environment for platforms to influence institutional reforms. Enacting platform power within the legislative process then rests on both their entrenched capacity to leverage information asymmetries and their ideational expertise as digital gatekeepers.Footnote 68 We theorize two distinct, but mutually reinforcing, processes whereby platforms can shape desired institutional reforms of digital governance.

First, institutional platform power derives from the “entrenched position” of intermediaries, which enables them to contribute to the basic provision of those services that are essential for a given society in the form of “infrastructural goods.”Footnote 69 More specifically, the institutional status quo of internet regulation has mostly been one in which platforms regulate themselves (e.g., through non-binding codes of conduct).Footnote 70 Therefore, their power does not derive exclusively from economic markets themselves or from lobbying activities, but from certain institutional arrangements through which regulators “invite or allow private interests to play a central role in providing crucial collective goods on which society depends.”Footnote 71 Feedback effects from both formalFootnote 72 and informalFootnote 73 institutions structure interactions between regulators and intermediaries. Rather than deriving from a strict separation of their respective domains, institutional platform power unfolds by “sharing public space.”Footnote 74

Put briefly, the more entrenched intermediaries are in the provision of public (infrastructural) services, the more they will be able to appear indispensable and the better they will be able to impose their preferences on institutional reforms.

Second, ideational platform power derives from the fact that regulators, intermediaries, and targets engage in activities on the basis of interpreting their surroundings, which are constituted by narratives and ideas.Footnote 75 For instance, intermediaries “seek to influence the beliefs of others by promoting their own ideas at the expense of others.”Footnote 76 Hence, they may employ ideational elements as tools to shape the definition of the policy problem as well as the resulting solution, which makes their power, at least partly, traceable in policy-making processes.Footnote 77 More specifically, intermediaries draw on different ideational strategies to shape the reform agenda by framing and enhancing desired options. For instance, rather than being one idea among several alternatives, a preferred idea may be framed as the only objective and neutral one. It is not an idea; it is the idea. It does not serve some particular interest; it is in the public interest instead. Within the context of any technological revolution, it is helpful to be credited with responsibility and techno-solutionist expertise.Footnote 78 If regulators essentially subscribe to these ideas, intermediaries may succeed in “influencing public actors’ beliefs both in the necessity and effectiveness of the solutions offered by platforms and in the compatibility of public and private interests.”Footnote 79 Both institutional change and resistance to reforms may be based on strategically employing ideas:

Put briefly, the better intermediaries succeed in appearing to be responsible experts and in putting forward (neutral) techno-solutionist arguments, the more they will be able to frame the reform agenda in the way they desire and the better they will be able to impose their preferences on institutional reforms.

If we hypothesize that these two processes might mutually reinforce one another, the design of an RIT model of this kind would indicate “weak regulatory capture”: that is to say that the institutional reform has intentionally created a regulation that serves the regulated intermediaries rather than the public interest.Footnote 80 This form of regulatory capture is to be understood as weak rather than strong because the public will still benefit from the institutional reform more than from maintaining the status quo. Yet, it seems not to be in the public interest for intermediaries, as gatekeepers, to be able to decide on access to and the distribution of information. The same applies to information exploitation. Institutional reforms to tackle that part of platform power from a public interest angle would include, for instance, “ending surveillance-based business models (by requiring platforms to spin off their ad networks).”Footnote 81 Hence, the new regulation serves the regulated intermediaries first and foremost—even though the public may be better off than under the e-commerce directive of 2000.

This form of regulatory capture is fully conceivable from an RIT perspective: “Intermediaries often have unusual influence in these settings: their expertise, informational advantages, and experience put them in a better position than the regulator in terms of understanding what modifications are needed. In such cases, intermediaries become the leaders in regulation, with the ostensible rule-makers following them.”Footnote 82 The next section draws on these two processes to develop a “platform power mechanism,” which helps us to link these theoretical expectations with more specific empirical observations of the DSA legislative process.

The “platform power mechanism” in the DSA

Applied to the DSA, the trigger for institutional reform is the perceived demand for more effective internet regulation in general, and better protection of private data in particular. This demand unites an overwhelming majority of stakeholders, who all prefer a tighter RIT design with less autonomy for, and closer control of, platforms. As there was no essential opposition to such an approach, we can even speak of a Brussels consensus among regulators and intermediaries in January 2020, when the legislative process for the DSA began.

The outcome, then, is an institutional design of shared co-regulation. The DSA is neither an instance of tight regulation nor will it effectively control intermediaries. As we have shown above, it reinforces and institutionalizes a powerful position for large platforms as intermediaries in both rule-setting and rule enforcement. This indicates a weak form of regulatory capture. The two theorized processes help us to formulate the “actors and activities” that interact in a causal pathway leading from the trigger to the outcome.Footnote 83

First, large platforms have informally been integrated as intermediaries into the legislative process as they have been recognized as indispensable providers of essential services. More than merely powerful firms, they “exercise broad control over the terms of access to crucial services on which a wide range of other actors depend.”Footnote 84 Yet their explicit consent is not necessary for legislation. Given the strong trigger of the Brussels consensus, future regulation may involve more liability for platforms, reduce the autonomy of intermediaries, and thus strengthen the regulator’s control.Footnote 85 This is the most conceivable regulatory outcome, as platforms have no formal veto power, in spite of their entrenched institutional position.

Consequently, when interactions between the regulator and intermediaries unfold as part of the legislative hearing process, platforms need to focus their efforts on shaping the decision-making agenda.Footnote 86 Accordingly, we expect them to stress their indispensability and to opt for a quiet politics strategy—rather than pushing, for instance, the liability question into the public sphere. The “preferred arena of conflict for business is typically a regulatory conference room rather than a public parliamentary hearing”Footnote 87—somewhere where informal meetings can be held with key decision-makers to address the regulatory proposals. These meetings serve as venues for intermediaries to present themselves as indispensable providers of public goods and to articulate their definition of the problem in a less formal and less public setting. By contrast, extensive media coverage of legislation is undesired as a “prime indicator of loudness.”Footnote 88

Second, intermediaries will advance an ideational strategy to shape a policy agenda that highlights the responsibility of large platforms, on the one hand, and their technical expertise and thus neutrality when addressing problems, on the other. On this account, the regulator’s intervention becomes unnecessary and possibly harmful to future innovation. Therefore, intermediaries will fiercely (i) contest platforms’ liability for content and highlight soft responsibility instead; and (ii) stress uncertainty about future policy problems, thus suggesting a flexible technological fix by the experts themselves. The objective of this ideational strategy is to shape the regulator’s available policy options in a desired way so as to preserve the intermediaries’ autonomy under the self-regulatory status quo.

The first part starts out with a self-designation as an impartial platform rather than a media or telecommunications company, as “the term is a valuable and persuasive token in legal environments.”Footnote 89 This notion of a neutral host of content allows platforms to circumvent the regulations that typically govern the traditional media and telecommunications sectors (especially liability for the content that is distributed via their technological infrastructure). As a consequence, liability for content rests with the targets (i.e., the individual users who produce it), rather than with the platforms.Footnote 90 Yet an ideational strategy that consists only of opposing liability may damage the intermediaries’ reputation for innovation, so a more constructive approach may become necessary. When the regulator emphasizes the demand for platform liability, intermediaries may counter with a proposal of platform responsibility for the content they host, even though “the shape of this responsibility is by no means clear.”Footnote 91

The second part of the strategy involves framing most of the regulatory challenges, such as hate speech or misinformation, as problems whose nature is uncertain but likely, ultimately, to be technical rather than political. This idea needs to draw on technological solutionism—that is, the belief that there is a technical solution for any problem and that there is always an expert capable of designing such a solution.Footnote 92 Since intermediaries are the ones with expertise in technical issues, they are indispensable and should have flexible discretion with regard to how these issues are to be addressed, without being hampered by control and tight prescriptions. Given their technical rather than political nature, these measures will arguably have few distributional implications. They are based on the idea of neutral expertise.

In sum, this section has theorized and empirically specified how large platforms, as intermediaries, respond to the regulator’s willingness to tighten regulation in digital governance. They draw on their entrenched position to employ ideational strategies that may help to preserve some of their autonomy and prevent tighter regulation. Figure 2 summarizes our “platform power mechanism” in the DSA’s legislative process:

Figure 2. The “platform power mechanism” at work in Digital Services Act legislation.

Large platforms as intermediaries in the DSA legislative process

Our analysis draws on Beach and Pedersen’s theory-building process-tracing.Footnote 93 We seek to identify observable manifestations that are empirical fingerprints of our “platform power mechanism.” This will enhance the degree of confidence in our theorizing efforts. We therefore focus on the activities of the regulator and the intermediaries (i.e., the two entities) in the legislative process and explore how the trigger is translated through our “platform power mechanism” into the outcome (i.e., the relative autonomy of intermediaries and weak regulatory capture). As far as large platforms are concerned, we have chosen to focus on Google and Meta.Footnote 94 Google dominates the realm of search engines, and Meta owns several of the most relevant social media platforms (e.g., Facebook).Footnote 95 Both, moreover, are key to issues of content moderation, targeted advertising, and thus private data protection more broadly.

Given that our theoretical approach also stresses informal practices of quiet politics, we need to focus on the use of internal channels—outside the public sphere—to trace the path of platform power and how it contributed to the watering down of policy options.Footnote 96 Relevant documents to investigate therefore include emails, memos, and minutes of meetings as well as legislative documents and media reports. Internal documents from lobbying meetings, however, are not automatically published by the EU. Under Regulation (EC) No 1049/2001, citizens have a right of access to documents held by the EU institutions.Footnote 97 Several documents relating to the DSA have thus been published through the website “Ask the EU.” As these documents do not cover all the lobbying meetings identified, we filed further requests via the platform and received additional material. As a result of our data-generation efforts, the corpus eventually consisted of 214 documents (esp. legislative documents, internal lobby meeting documents, trilogue documents, media, and research reports) that we analyzed.Footnote 98

The prologue

From a procedural perspective, the DSA followed the ordinary legislative procedure (COD), and thus a co-decision procedure involving the Commission, Parliament, and Council.Footnote 99 Before the Commission initiated a proposal, it consulted various stakeholders for feedback and conducted an impact assessment. Then the co-legislators, that is, the Council and Parliament, acted on the proposal. Afterward, the negotiations moved on to the trilogues, which are inter-institutional negotiations between the Council and the Parliament, mediated by the Commission. They are normally highly secretive and a crucial step in the ordinary legislative procedure. The Commission and the co-legislators ultimately reached a provisional agreement that still had to be formally adopted by the Parliament and the Council.Footnote 100

From an institutional perspective, the DSA did not emerge from an empty page. The pre-existing setting had predominantly been designed as one of self-regulation under the EU’s e-Commerce Directive of 2000. While the latter imposed some general obligations on providers with regard to the disclosure of information and the protection of regulatory targets (i.e., users), it also laid down rules that granted certain intermediaries immunity for two decades as far as third-party content on their platforms was concerned.Footnote 101 It was intermediaries—rather than public regulators—that set, enacted, and enforced most rules of digital governance. However, the EU had taken an important legislative step before the DSA in 2016.Footnote 102 The General Data Protection Regulation (GDPR) to some extent represented a turnaround with regard to the regulation of data privacy, as it affected the collection and processing of data.Footnote 103 The DSA, by contrast, relates primarily to content moderation and also to liability.Footnote 104 Despite overlaps with the GDPR, the DSA was primarily characterized by its general take on content regulation, which also included “political advertising,” whereas former initiatives tackled more sector-specific forms of (illegal) online content, such as terrorist activities.Footnote 105

In sum, the ordinary legislative procedure for the DSA was initiated within an institutional setting of self-regulation that involved strong positions for large platforms as the predominant intermediaries. Yet tighter rules for the private sector, which would have constrained the autonomy of intermediaries, were not only the Zeitgeist;Footnote 106 they also dominated the initial agenda.

The legislative process

By the end of the 2010s, after constantly recurring instances of systematic data protection violations as well as the voter manipulation by Cambridge Analytica, the public demand for tightening the rules for large platforms had become salient for policy-makers at different levels. The Commission wrote in a leaked preparatory document for the DSA that “specific obligations should be examined for cross-border online advertising services, including for rules around political advertising.”Footnote 107 Commentators agreed that these ideas would directly challenge internet platforms such as Google and Meta,Footnote 108 as indicated by the New York Times headline “A Global Tipping Point for Reining In Tech Has Arrived.”Footnote 109 Due to the unfolding Brussels consensus that “some of the darker sides of digital technologies have become visible,”Footnote 110 Commission President Ursula von der Leyen announced in 2019 that better data protection measures, as provided by the GDPR, were only the “first steps.”Footnote 111 Although industry self-regulation had been relied on for decades, EU officials, member states, and civil society players were now calling for stricter regulations. And even the companies themselves allegedly asked to be regulated as intermediaries.Footnote 112

As a result, an indisputable demand for internet regulation helped to form a new Brussels consensus among stakeholders and thus triggered the drafting of a strict design for the DSA in order to tame the large platforms.

Institutional platform power shapes influence…

In accordance with our mechanism, this first step will investigate whether and how institutional platform power unfolded in this situation. Process-tracing shows the entrenched position from which large platforms, as intermediaries, interacted with the EU, as regulator, in the course of the DSA’s legislative procedure. First, it was undisputed that they were deeply involved in delivering essential public services. Second, neither the regulator nor any other public actor could realistically substitute for these services on its own.Footnote 113

The role of online intermediaries as providers of essential goods and services is based on their backbone function in modern societies: “Google and Facebook are increasingly part of our information infrastructure, shaping the distribution of and access to news, ideas and information on which our economy, culture and increasingly our politics depend.”Footnote 114 This infrastructure is crucial beyond digital markets. Many firms depend on these services, as they can only reach potential consumers through Google or Facebook. Both users and businesses thus have to accept the platforms’ “terms of service” when using this infrastructure, which turns the platforms into gatekeepers.Footnote 115 In addition, this enables platforms to collect personal data from users as well as from businesses that operate on their platforms and to make use of it—a procedure referred to as “information exploitation.”Footnote 116 Their entrenched position goes back at least to the 2010s, when public regulators encouraged private intermediaries to draft and enforce standards. From this time onwards, large platforms increasingly contributed to public functions.

For instance, when mushrooming hate speech threatened social peace, curbing it was undoubtedly a public task. Yet the EU encouraged large platforms to make those quasi-collectively binding decisions, albeit in a politically objective way. Hence, platforms innovated by establishing in-house regulatory bodies, such as the Meta Oversight Board, to “independently judge” and distinguish hate speech from free speech.Footnote 117 Farrand has recently demonstrated how “platforms have increasingly been brought into the regulatory structures for combating hate speech online.”Footnote 118 At the same time, intermediaries have left no doubt that their self-set rules (i.e., “terms of service”) would ultimately trump the non-binding codes of conduct agreed on at the EU level.Footnote 119 As a result, platforms unilaterally defined the standards and conditions under which the regulatory targets (i.e., users) could access information and content. In other words, public regulators for a long time relied on private platforms’ self-initiatives and self-commitments to prevent hate speech.

Entrenching intermediaries in a position of providing public services (e.g., reducing hate speech) reinforced existing information asymmetries and strengthened their gatekeeper position.Footnote 120 The regulator simply did not know how to implement measures against such things as hate speech. By contrast, intermediaries were able to carry out this task and then feed their experiences back into the legislative process of the DSA. In addition to information about the technologies they were using and knowledge of their own business models and algorithms, the large platforms thus also possessed detailed information about the substance of regulatory initiatives.

Given the political salience of some of these challenges, however, the regulator obviously did not fully surrender to Meta and Google. Instead, informal arrangements became increasingly prevalent. Since 2015, the annual EU Internet Forum, a “collaborative environment for governments in the EU, the internet industry, and other partners to (…) address the challenges posed by the presence of malicious and illegal content online,”Footnote 121 has provided an informal setting for discussing rules of data governance. Intermediaries have taken charge of helping to provide public functions. Informality has increasingly shaped the interactions of Google and Meta with the regulator. Politics has become quieter and quieter.

This entrenchment pattern has been reinforced by a number of non-binding codes of conduct as well as further informal “meetings.”Footnote 122 During the legislative process for the DSA, for instance, informal coordination also extended to the member-state level. In a highly critical phase in November 2021, Google and Meta informally met up with Irish policy-makers, and “parties were granted confidentiality and assured that no detailed notes would be taken on their discussions, according to redacted emails released under FoI.”Footnote 123

Moreover, Google and Meta’s entrenchment found expression in their frequent exchanges with the Commission.Footnote 124 Over the entire period of our analysis, Google had 13 and Meta 11 meetings with representatives of the Commissioners Thierry Breton (DG CONNECT) and Margrethe Vestager (DG COMP).Footnote 125 As trilogue meetings approached, the number of Commission meetings would increase. Google and Meta also had regular exchanges with parliamentarians (est. Google 23, Meta 16) and member state representatives, though data concerning these exchanges are clearly less reliable.Footnote 126 The publicly available position of the large platforms was generally status-quo-oriented and thus typical of vested interests that benefit strongly from the existing institutional order. In their public responses to the Inception Impact Assessment as well as in their Public Consultation answers, Meta and Google both stressed that while new regulations were welcome, the basic framework of the previous regulations should remain in place. Emphasis was put on the idea of updating existing frameworks by identifying gaps and shortcomings rather than redrafting them from scratch.Footnote 127 Up to this point, process-tracing suggests three main findings:

First, the EU’s prior internet regulation was ultimately one of self-regulation by intermediaries—or regulated self-regulation, as some authors have framed it.Footnote 128 Platforms had a privileged institutional status, as they were not liable for content distributed through their infrastructure. This regulatory design reinforced their gatekeeping position, since the platforms were the ones defining the terms and conditions for the use of infrastructure crucial for information flows as well as digital markets.Footnote 129 This strongly empowered the large platforms, which drew on the entrenched positions they maintained thanks to holding critical information on the technological as well as the regulatory side. Interactions between the regulator and intermediaries were mainly informal and, despite recurring public outrage, politics evolved in a relatively quiet way.

Second, the intermediaries’ provision of regulatory functions is indisputable. For example, the Commission had considered changes with regard to the prohibition of a general monitoring obligation but discarded the idea at an early stage, as is evident from the impact assessment. Changes to the provision might lead to obligations that “could disproportionately limit users’ freedom of expression and freedom to receive information, or could burden service providers excessively and disproportionately, and thus unduly interfere with their freedom to conduct a business.”Footnote 130 This reflects the “permissive consensus” par excellence.Footnote 131 Users, rather than merely the platforms themselves, supposedly supported the regulatory status quo as platforms served a hinge function between sellers and customers.

Third, the extent to which public actors might substitute for private services was, at least, questionable. Whether it was more a case of the EU, as regulator, being unwilling to step in or being incapable of doing so cannot be definitely clarified. What is, however, clear is that large platforms were entrenched in the EU’s institutional setting of internet regulation. Despite their lack of formal veto, they were ultimately the indispensable partner and thus had manifold opportunities at their disposal to shape the legislative agenda.

To sum up the first part of our mechanism, platforms were clearly entrenched in the provision of public (infrastructural) services, which made them indispensable for any reform. They were in a largely autonomous position without strong control exercised by the regulator. The institutional opportunities for transforming potential platform power into power over outcomes were undoubtedly given.

Ideational platform power drives entrenched interests home…

Again in accordance with our mechanism, this second step will investigate how Google and Meta employed their ideational platform power to shape the design of the DSA. The intermediaries’ baseline approach was to position themselves as the technological experts whose solutions would help Europe to become innovative and competitive in digital markets.Footnote 132 Platforms thus aligned themselves with small and medium-sized enterprises (SMEs),Footnote 133 which were “the backbone of Europe’s economy”Footnote 134 and represented 99 percent of businesses in the EU. Google emphasized in its public consultation submission that its services “helped SMEs to enter and expand rapidly in new markets by improving their ability to find and connect with potential new customers.”Footnote 135 This is enacting platform power—that is, implicitly pointing out how stricter regulations might ultimately antagonize the majority of consumers. Platform representatives therefore argued that a ban on targeted advertising “would be detrimental also to SMEs.”Footnote 136 These ads were allegedly in the public interest and only coincidentally accounted for more than two-thirds of most platforms’ revenues. As the research groups Corporate Europe Observatory and LobbyControl summed up the situation as regards Google: “the tech giant wanted to push for new narratives that focused less on the company itself and more on the alleged ‘unintended impact’ of well-meaning regulatory policies.”Footnote 137 While platforms did not always succeed in building a coalition with SMEs, their ideas about providing the foundations of innovation shaped the reform agenda. Having set out the platforms’ baseline approach, we distinguish between two main pillars of the DSA that the large platforms targeted with their ideational strategies, namely non-liability and technical flexibility.

First, from the outset, large platforms fought fiercely against the idea of becoming legally liable—unlike media companies—for the content posted via their channels. Nor were they in favor of strict and precise regulations governing how harmful content should be moderated. According to Politico and other news outlets, however, the preparations for the DSA in 2019 had been moving in precisely these directions. This new idea would have tied in critically with evolving case law. Most significantly, the European Court of Human Rights addressed the balance between freedom of expression and illegal speech in Delfi v Estonia (2013). It concluded that an online intermediary may be held liable for third-party content, even if it had no actual knowledge of that content. The Court thus opted for an expansive interpretation of the liability of online intermediaries. In light of some opposition to this ruling, the Court took a more nuanced perspective in MTE v Hungary in 2016, adding refinements regarding the illegal nature of speech and thus the criteria under which intermediaries might be held liable.Footnote 138 Yet Pandora’s box had been opened, and this severely threatened both the entrenched power position of the intermediaries and their business models. Unlike under the status quo, platforms under legal pressure to moderate content adequately would forfeit autonomy and no longer be able to decide unilaterally how information was to be accessed and distributed.

However, Google and Meta vehemently insisted on the premise of not being liable—with some pre-specified exceptions—and instead introduced the legally non-binding notion of responsibility. Meta continuously resisted all such attempts, stating that a “strict liability regime holding online intermediaries directly liable would have prevented a whole range of innovative services from entering the market and would have resulted in overremoval of content.”Footnote 139 Interestingly, large platforms not only stood up for the public good of innovation but also stressed that there was no technical solution to the political problem. Large platforms “do not possess the capacity and requisite knowledge to comprehensively assess the legality of most content on their platforms.”Footnote 140 Instead, they endorsed a non-binding notion of responsibility: “The platform then has a responsibility to take appropriate action on that content. (…) We take our responsibility seriously.”Footnote 141

When the Commission was still drawing up the first legislative proposal in 2020, platforms did not merely oppose reform; in direct interactions with members of the Breton and Vestager cabinets, they were already arguing strongly for the constructive idea of responsibility. Internal notes show that representatives of Meta advocated for “responsibility for the content that is being published, secondary liability.”Footnote 142 That is to say, they wanted to keep things as they were, according to the notes of a meeting with Google: “Important to preserve the main principles of the E-Commerce Directive (country of origin, liability exemption and incentives for voluntary action).”Footnote 143 While the notion of responsibility was not completely novel to the digital realm, platforms rapidly took it up and re-interpreted it to avert a stricter liability regime, advancing new ideas onto the policy agenda. In its Public Consultation Response, Meta argued for establishing two distinct ideas rather than one: “[A]ny new framework needs to clearly distinguish between the liability and responsibility of online intermediaries.”Footnote 144

Platforms fought forcefully for their new idea of responsibility. When Google’s lobbying strategy was leaked, however, its resolve to defend the status quo of non-liability could no longer be questioned. The intermediary was reportedly taking a multi-pronged approach—via both third parties and events. As the Corporate Europe Observatory noted, Big Tech’s “message is amplified by a wide network of think tanks and other third parties.”Footnote 145 This suggests an interest coalition between the platforms and the business organizations of which they are members and which they fund in significant part.Footnote 146 Google and Meta, for instance, are prominent members of the trade association Dot.Europe (formerly known as EDiMA).Footnote 147 The association proactively sent its so-called “Responsibility Framework” to DG CONNECT and to Commissioners Breton and Vestager. Again, it sought to add responsibility to liability with the objective of weakening the latter’s strict form, emphasizing “that a new framework should be created which clearly distinguishes between the principles of responsibility and liability.”Footnote 148 According to the leaked document, the focus was on informal meetings with these two DGs; and the objective was ultimately to seed conflicts within the Commission: “The document singled out Mr Breton, who has been one of the EU’s chief proponents of breaking up big tech companies, listing among its objectives to ‘increase pushback’ on the French commissioner, while also ‘weakening support’ for the proposed legislation within Brussels.”Footnote 149 Our process-tracing analysis demonstrates that—despite the regulator’s initial willingness and the evolving case law—platforms succeeded in reframing the agenda and removing the idea of liability step by step.

At the outset of the legislative process, tightened control of intermediaries was no longer reflected in the Commission’s first “official” proposal. As stated in its Impact Assessment, the Commission had initially considered making the liability exemption conditional on the due diligence regime. However, this option was “discarded, as failing to achieve the objectives of the intervention, placing disproportionate burdens on authorities, and introducing further legal uncertainty on service providers.”Footnote 150 Platforms that do not comply with the due diligence obligations thus need not fear losing their liability exemption; instead, they face fines that can, however, amount to up to 6 percent of their annual turnover.Footnote 151

During the legislative process, the JURI committee’s parliamentary report amended the Commission’s initial proposal by making the liability exemption conditional on specific time frames for platforms to remove flagged content. The Committee on the Internal Market and Consumer Protection even invited the prominent whistleblower Frances Haugen to a public hearing on “Whistleblowers’ testimonies on the negative impact of big tech companies’ products on users.” The committee’s final report also amended the Commission’s initial text by imposing deadlines for the removal of illegal content.

Towards the end of the DSA’s legislative process, the idea of responsibility eventually arrived in the European Parliament, as was increasingly evident in several debates between 2020 and 2022. For instance, Rapporteur Andreas Schwab stated in the debate immediately before the vote on the first reading: “For the first time, platforms will have to take responsibility for the legality of the content posted there.”Footnote 152 Parliamentarians repeatedly drew on the notion of “responsibility” rather than that of liability. In short, platforms had evaded tighter control by watering down the available ideas for content moderation.

Second, platforms put forward the idea that the DSA should not by any means become “over-prescriptive” with regard to technology. Again, they defended the status quo. Google even stated: “That is, they should avoid mandating specific technological fixes” (emphasis added).Footnote 153 Instead, the gatekeepers, Google and Meta themselves, should decide what the appropriate solution would be in the event of a problem. Meeting notes show that Meta emphasized to members of the Commission that “platforms and AI knows [sic] what to look for.”Footnote 154 The same applied to rules for moderating harmful content. Rather than precise, predefined rules, Google and Meta argued for flexible and non-binding “codes of conduct.”Footnote 155 “Any regulation looking at harmful content should focus on ways to hold internet companies accountable for having certain systems and procedures in place to address harmful content rather than holding them liable for specific content.”Footnote 156 In other words, platforms promoted ideas of autonomy and rejected those of control.

For instance, the three legislators—Commission, Parliament, and Council—had initially discussed a complete ban on targeted advertising. The platforms, whose business model this would have threatened, responded by suggesting technological fixes against “information exploitation.” Their autonomy in employing targeted-advertising technology could thus be preserved. According to the meeting memo, Google noted that it was “increasingly able to do the same with less data or using synthetic data.”Footnote 157 In any case, “regulation that is overly prescriptive or rigid could interfere with development and uptake of digital services.”Footnote 158 Tracing the meeting notes demonstrates that Google and Meta pointed out initiatives and technological developments to the Commission. In doing so, they proposed a variety of technological solutions and recommended themselves as the holders of the requisite technological expertise.

At the same time, Meta and Google repeatedly insisted that not even technologically innovative firms such as themselves could predict innovation paths. Therefore, technologically specific tools should be avoided. Again: “relying on automatic filtering and detection leads to significant over-removals of content.”Footnote 159 They continued: “While breakthroughs in machine learning and other technology are impressive, the technology is far from perfect, and less accurate on more nuanced or context-dependent content.”Footnote 160 In particular, Google stressed that regulation should not prescribe specific technologies to combat problems: “Flexibility to accommodate new technology: Given the fast-evolving nature of the sector, laws should be technologically neutral” (emphasis added).Footnote 161 Policy needed to remain flexible: “In any event, we believe regulatory reform of any kind should aim to be flexible and future-proof to adapt to technological change and accommodate the diverse European tech ecosystem.”Footnote 162 In short, they vehemently advanced the idea of future platform autonomy rather than increased regulatory control.

In sum, our process-tracing analysis shows how large platforms such as Google and Meta adopted two ideational strategies. They foregrounded their reputation as technical experts and sought to comment on alternative ideas from a third-party expert perspective. On that basis, they fiercely challenged any ideas of binding liability, proposing a commitment to responsibility in its place. Following a similar argumentative pattern, they recast most political problems as technical challenges. Since these challenges could not be predicted, they promoted the idea that the experts should—autonomously—deal with them as they arise. Technologies needed to remain general, flexible, or simply open. These ideational strategies imply autonomy for intermediaries and a loss of control for the regulator: it will be the “technological experts” who resolve future problems. As a result, big tech would come to dominate the day-to-day practice of internet regulation in the EU and beyond.

Conclusion

Our paper contributes, on the one hand, a reconceptualization of the EU’s recent DSA as a (weak) regulatory capture by large platforms: the act is clearly closer to the status quo than to a new and groundbreaking constitution for the internet. The DSA enshrines the platforms’ liability exemptions, preserving the foundation of the previous regulatory regime. The intermediaries’ autonomy from tight EU control was reinforced, as platforms remained institutionalized as providers and enforcers of last resort. This is especially visible in the technological openness of the DSA, which positions them as autonomous experts and as the hub for future deliberations. At the same time, the enforcement framework relies heavily on platforms’ assessments when it comes to building comprehensive supervision. On the other hand, we have integrated the notion of platform power into the RIT model and fleshed out how entrenched gatekeepers and the exploitation of ideational asymmetries helped to move regulatory outcomes closer to the preferences of intermediaries. This explanation not only proved robust in our process-tracing analysis of the EU’s legislative process for the DSA, but is also potentially applicable to platforms’ exertion of regulatory influence in other areas of content moderation, such as “political advertising” or even “terrorist content.”

We also demonstrated that—even under the least-likely conditions of the EU’s digital governance regimeFootnote 163—the operation of platform power led to regulatory capture of the DSA, albeit in a weak form. It is weak given that the “net social benefits of regulation are diminished as a result of special interest influence, but remain positive overall.”Footnote 164 The regulatory targets, individual users in the EU, will certainly be better off than they would have been under a self-regulatory regime run by the platforms on the basis of the 20-year-old e-commerce directive. In light of the huge demand for tighter regulation and the initial Brussels consensus, however, the DSA is undoubtedly a missed opportunity. Our process-tracing has demonstrated how the regulatory terms lost their focus on the public interest (especially the protection of private data, adequate content moderation, and the prevention of democratic destabilization) and moved toward guaranteeing the intermediaries’ autonomy (especially as ultimate enforcers). Platforms successfully advanced an ideational shift from strict legal liability toward a vague notion of responsibility. Nor could the legislative process show why the preservation of targeted advertising would be in the public interest. Instead, what became clear was the “permissive consensus” that operates when platforms shape legislative processes. Regulators seem to fear—or, at least, take into account—the users’ diffuse support for platforms, which they do not wish to antagonize. This underscores how our “platform power mechanism” supplements scholarship that stresses the importance of consumers for the exertion of policy influence.

The explanation of this (weak) regulatory capture draws on the RIT model, which directed our focus to the interactions between the EU as regulator and large platforms as intermediaries in an institutional reform process. Our “platform power mechanism” helped us to trace how the influence of intermediaries unfolded systematically. We argued that the entrenched position of large platforms enabled them to move the policy process into a quiet mode, which allowed them to entangle the regulator in an ideational narrative of their indispensability, responsibility, and neutrality. Intermediaries’ autonomy from tight EU control was thus maintained and even reinforced for the future. As a result, Meta, Google, and other big tech platforms will necessarily be involved in any attempt to reform internet regulation. They have indirectly reserved their seats at a future quiet table.

Finally, the effectiveness of “platform power” prompts several normative concerns. The paper has shown that the initial Brussels consensus could have been expected to constrain the unfolding of the large platforms’ influence. The specific conditions of DSA legislation, together with the more general constraints on business actors, might have been expected to produce a different regulatory outcome. Scholarship on interest group success has demonstrated that, whenever the European Parliament is involved in legislation, it is the preferences of citizen groups rather than business that prevail.Footnote 165 Yet platform power seems to operate at a level beyond that of conventional business groups. When we integrate further scholarly insights with our findings, we may derive some important suggestions about how platforms exercise this expanded influence. Heike KlüverFootnote 166 has, first, revealed the importance of coalition politics, something we support and supplement with our findings on the essentially joint approach of Google and Meta. Second, she stresses that information asymmetries form the basis of business influence. When we combine this explanation with the two steps of our “platform power mechanism,” the unfolding of platform power becomes comprehensible. The intermediaries’ entrenched position provides them not only with a reputation for indispensability but also with informational advantages, which they then exploited by combining an ideational narrative of responsibility with one of technically neutral expertise. To some extent, the regulator trades influence for information supply; or, in the words of Abbott et al.,Footnote 167 the EU gains competence at the expense of control over platforms. Public regulators can either engage with competent intermediaries that are, however, hard to control, or exercise tight control over less competent intermediaries. One general lesson from our study is thus to shed light on how highly competent intermediaries persistently succeed in averting future public control. The key political question for regulators will be where to draw the line—and this applies to digital platforms as much as, for instance, to the banking or semiconductor industries.

In conclusion, the rise of “platform capitalism” has arguably tilted the balance between regulators and intermediaries in favor of the latter.Footnote 168 Digital networks are mostly owned and operated by private providers. They strive for profits and market shares; yet they have become entrenched in providing some of the most essential functions of modern societies. Regulators worldwide therefore necessarily need to collaborate with them on infrastructure, but they also need to work with those that run the exchange of information, such as search engines and social networks.Footnote 169 Today’s internet regulations are thus not the end of the story but merely a stopover, since digitalization will advance further on the back of ever more intrusive artificial intelligence, which similarly awaits effective regulatory measures taken on behalf of the public interest.

Acknowledgements

We would like to thank numerous participants of research colloquia, panels, and workshops for their constructive comments. We are especially grateful to our Munich colleagues, Benjamin Dassler, Andreas Kruck, Berthold Rittberger, and Bernhard Zangl.

Funding

We gratefully acknowledge funding from the Fritz Thyssen Foundation (research project “The Making of National Security: From Contested Complexity to Types of Security States,” Az. 10.23.2.001.PO).

Competing interests

The authors declare none.

Footnotes

2 Singer and Brooking (2018).

4 Bradford (2020, pp. 131–170; 2023); see also Heidebrecht (2024).

10 European Commission (2020).

12 We characterize this “future uphill battle” of the DSA as a “weak form of regulatory capture” given that it serves platforms’ interests better than the public’s. From the latter’s perspective, these institutional reforms are beneficial compared to the former status quo (i.e. weak regulatory capture). Yet they institutionalize platforms as indispensable, but non-liable, gatekeepers and leave future enforcement to the technical experts (i.e. the platforms). By contrast, the initial agenda would have strengthened individual users, the ultimate regulatory targets, far more. On weak regulatory capture, see, in particular, Carpenter and Moss (2013, p. 13).

15 Culpepper (2010).

17 Bradford (2023); Heidebrecht (2024); see also Farrand and Carrapico (2022); Heermann (2023); Falkner et al. (2024).

19 Busemeyer and Thelen (2020).

20 van Dijck et al. (2019); see also Cohen (2016); Khan (2018).

21 See also Bradford (2023, p. 2).

22 Culpepper (2010, p. 17).

23 Culpepper and Thelen (2020).

25 European Commission (2020).

26 Bradford (2023, p. 1).

27 Farrand (2023b).

28 European Commission (n.d.).

29 Caufmann and Goanta (2021, p. 767).

30 Culpepper and Thelen (2020).

31 Abbott et al. (2017, p. 16).

33 Barnett and Duvall (2005, p. 41).

34 Fuchs and Lederer (2007, pp. 4–8).

35 See also Scharpf (1997, pp. 46–47).

37 Hirsch (2011, p. 441); Marsden (2011, p. 46).

38 Bartle and Vass (2005).

39 Marsden (2011, p. 46).

40 Doyle (1997).

41 van der Heijden and Jong (2009, p. 1046).

42 EUR-Lex Access to European Union Law (2022).

43 Farrand (2023a).

44 Esp. Heidebrecht (2024).

45 For instance, Google’s advertising revenues amounted to about $238 bn in 2023, as part of its overall annual turnover of about $305 bn. Statista (2024).

46 EUR-Lex Access to European Union Law (2022, p. 60).

47 Duivenvoorde and Goanta (2023).

48 Caufmann and Goanta (2021).

49 Farrand (2023c); Genç-Gelgeç (2022). In these matters, the Commission may call in the European Board of Digital Services as an advisor.

50 Morandini, Anna (2023, November 28). “DSA Audits: Procedural Rules Leave Some Uncertainties.” DSA Observatory. https://dsa-observatory.eu/2023/11/28/dsa-audits-procedural-rules-leave-some-uncertainties/.

52 Caufmann and Goanta (2021, p. 768).

53 EUR-Lex Access to European Union Law (2022, p. 45) (emphasis added).

54 Buri and van Hoboken (2021).

55 EUR-Lex Access to European Union Law (2022, p. 52). See also Caufmann and Goanta (2021).

56 Buri and van Hoboken (2021, p. 10).

57 van Loo (2020).

58 Farrand (2023a, p. 486).

59 Mantelero (2023, p. 114).

60 Eder (2023).

62 EUR-Lex Access to European Union Law (2024).

64 Bradford (2020, pp. 131–170).

65 Campbell (1998, p. 378).

67 Culpepper and Thelen (2020, p. 293).

70 We have deliberately excluded the “tacit allegiance of consumers” as a source of platform power (Culpepper and Thelen 2020) for two reasons. First, we seek to study the legislative process of the DSA, which is more strongly influenced by institutional and ideational sources of platform power (see also Gorwa et al. 2024). Second, we seek to supplement—rather than to test or compete with—existing theories of the political relevance of platform power.

71 Busemeyer and Thelen (2020, p. 454).

73 Culpepper (2010, p. 12); Weiss (2021).

74 Busemeyer and Thelen (2020, p. 454).

76 Carstensen and Schmidt (2016, p. 322).

77 Selling (2021, p. 51); Weiss and Biermann (2022); see also Albareda et al. (2023).

78 Hansen and Nissenbaum (2009); Nachtwey and Seidl (2023); Kruck and Weiss (2023); Slayton and Clarke (2020).

79 Obendiek and Seidl (2023, p. 1311).

80 Carpenter and Moss (2013, p. 13).

81 Khan (2018, p. 333).

82 Abbott et al. (2017, p. 30).

83 “A mechanism can be understood as ‘a system that produces an outcome through the interaction of a series of parts (…). Each part is composed of entities that engage in activities.’” Beach and Pedersen (2013, p. 39).

84 Culpepper and Thelen (2020, p. 289).

85 Katzenbach (2021).

86 See also Lynskey (2017).

87 Culpepper and Thelen (2020, p. 302).

88 Rommerskirchen and van der Heide (2023, p. 1154).

89 Gillespie (2010, p. 348).

90 Gillespie (2010).

91 Katzenbach (2021, p. 3).

94 Birch and Bronson (2022).

97 EUR-Lex Access to European Law (2001).

98 As part of our appendix, we provide an overview of when and how the two intermediaries engaged in activities with the EU’s regulators from the Commission, the Parliament, and the Council. The tables also indicate our (non-)access to some relevant documents.

99 EUR-Lex Access to European Law (n.d. b).

100 EUR-Lex Access to European Union Law (n.d. c).

101 Genç-Gelgeç (2022).

102 Bradford (2020, pp. 131–170).

103 Hert and Papakonstantinou (2016).

104 Heldt (2022). Further legislative steps, such as the Audiovisual Media Service Directive, supplemented existing rules. With the Framework Decision on combating certain forms and expressions of racism and xenophobia by means of criminal law (2008/913/JHA), the EU also laid out its approach to illegal hate speech and held member states responsible for prosecuting certain illegal content (EUR-Lex Access to European Law, Document 32008F0913 2008).

105 Buri and van Hoboken (2021).

106 Several recent publications confirm that the EU has recently experienced a shift towards taking much tighter control of both private and foreign actors across numerous digital policy fields. See, for instance, Farrand and Carrapico (2022); Heermann (2023); Falkner et al. (2024).

107 Netzpolitik.org (2019).

109 Mozur and McCabe (2021).

110 Margrethe Vestager, cited from: The New York Times, 19 November 2019, https://www.nytimes.com/2019/11/19/technology/tech-regulator-europe.html.

113 See also Culpepper and Thelen (2020).

114 Rahman (2018, pp. 1669–1670).

115 Khan (2018); Lynskey (2017).

116 Khan (2018).

117 Oversight Board (n.d.).

118 Farrand (2023a, p. 483).

119 Gan (2017, p. 113).

120 “What we need right here from gatekeepers is changing behaviour,” the bloc’s competition commissioner Margrethe Vestager told AFP in an interview on the eve of the law coming into force. The Guardian with the Press Association, 30 March 2019, https://www.theguardian.com/technology/2019/mar/30/mark-zuckerberg-calls-for-stronger-regulation-of-internet.

121 European Commission (n.d. a).

123 Business Post (2021, p. 2).

124 See also YouTube (2020).

125 These findings relate to the meetings that we have identified as relevant and which clearly identified the DSA as a topic. We do not rule out the possibility that there were other lobby meetings of this kind.

126 Corporate Observatory Report (2021).

127 This is evident, for example, in an excerpt from Google’s answer. Submission Google (2020).

128 Farrand (2023a); Newman and Bach (2004).

129 This gatekeeping position also partly derives from the direct line connecting intermediaries to vast numbers of users, as Culpepper and Thelen note. They argue that a large part of the intermediaries’ power also stems from this “consumer–platform alliance” (Culpepper and Thelen 2020, p. 290). When it comes to intermediaries’ gatekeeping role, however, we show that the question of liability was an essential component.

130 Impact Assessment (2020, p. 50).

131 Culpepper and Thelen (2020).

132 “Google highlighted that its advertising system was already very advanced in giving users transparency and control. The trend was moving away from a need for big data and toward a need for smart data. Google noted its increasing ability to do the same jobs with less data or using synthetic data.” Internal Documents 2/12/2021 Vestager Document 2 (2021, p. 1). A similar reference was made by Meta in a lobby meeting: “With respect to the DSA, Meta is working on having leaner systems permitting storing and processing less data while still delivering targeted advertising.” Internal Documents 22/9/2021 Document 1 (2021, p. 1).

133 Corporate Observatory Report (2021).

134 European Commission (2024).

135 Submission Google (2020, p. 21).

136 Internal Documents 22/9/2021 Document 1 (2021, p. 1).

137 Corporate Observatory Report (2021, p. 27). Reports and a survey showed that political interests were less aligned than the platforms were arguing. Most SMEs favored stricter regulation of the big platforms. Yet, if that was to come at the expense of reaching customers, the SMEs’ position would certainly have been more ambivalent.

138 Brunner (2016, p. 167); Maroni (2022, p. 274).

139 Response Meta (2020, p. 3).

140 Response Meta (2020, p. 3).

141 Blogpost Google (2020, pp. 1–2).

142 Internal Documents 16/11/2020 Document 1 (2020, p. 1).

143 Emphasis added. Internal Documents 4/5/2020 Document 4 (2020).

144 Response Meta (2020, p. 5). This idea was echoed by other stakeholders affiliated with Meta and Google. See, for instance, Corporate Europe Observatory (2022), Corporate Observatory Report (2021), Internal Documents EDiMA (2020, p. 2).

145 Corporate Observatory Report (2021, p. 4).

146 Corporate Europe Observatory (2022).

147 Dot.Europe (2024).

148 Internal Documents EDiMA (2020, p. 2).

149 Financial Times (p. 2).

150 Impact Assessment (2020, p. 166).

151 See Buiten (2022, p. 378).

152 Translated from German to English. Verbatim report of proceedings – Digital Services Act – Digital (2020, p. 1).

153 Response Google (2020).

154 Internal Documents 16/11/2020 Document 1 (2020, p. 1).

155 Blogpost Google (2020); Submission Google (2020).

156 Response Meta (2020, p. 4). See also Internal Documents 14/7/2020 Document 1 (2020).

157 Internal Documents 2/12/2021 Vestager Document 2 (2021, p. 1); see also: “With respect to the DSA, Facebook is working on having leaner systems permitting storing and processing less data while still delivering targeted advertising.” Internal Documents 22/9/2021 Document 1 (2021, p. 1).

158 Blogpost Google (2020, p. 6).

159 Internal Documents 22/10/2020 Document 2 (2020, p. 1).

160 Submission Google (2020, p. 6); Submission Meta (p. 40).

161 Response Google (2020, p. 3).

162 Submission Google (2020, p. 9). The same idea was pushed forward by interest groups close to the platforms: “The scope of the new framework should be broadly defined, technology-neutral, and principles-based, applying proportionately to a variety of different online services rather than a specific list – which can become outdated or inapplicable in time” (Internal Documents EDiMA 2020, p. 3).

163 By contrast to our findings on the DSA, the EU has recently moved towards tighter public control of the private sector in the digital domain. Farrand and Carrapico (2022); Heermann (2024); Falkner et al. (2024).

164 Carpenter and Moss (2013, p. 12).

166 Klüver (2013).

168 Andrew and Baker (2021); Culpepper and Thelen (2020); West (2019).

169 Glen (2017, pp. 121–142).

References

Abbott, Kenneth W., Levi-Faur, David, and Snidal, Duncan. 2017. “Theorizing Regulatory Intermediaries.” The Annals of the American Academy of Political and Social Science, 670(1), 14–35.
Abbott, Kenneth W., Zangl, Bernhard, Snidal, Duncan, and Genschel, Philipp. 2020. The Governor’s Dilemma: Indirect Governance Beyond Principals and Agents. Oxford: Oxford University Press.
Albareda, Adrià, Saz-Carranza, Angel, Van Acoleyen, Michiel, and Coen, David. 2023. “Lobbying the Executive Branch: Unpacking Access to Political Heads, Political Advisers, and Civil Servants.” Business and Politics, 25(1), 1–16.
Andrew, Jane, and Baker, Max. 2021. “The General Data Protection Regulation in the Age of Surveillance Capitalism.” Journal of Business Ethics, 168(3), 565–578.
Barnett, Michael, and Duvall, Raymond. 2005. “Power in International Politics.” International Organization, 59(1), 39–75.
Bartle, Ian, and Vass, Peter. 2005. Self-regulation and the Regulatory State: A Survey of Policy and Practice. Centre for the Study of Regulated Industries, University of Bath School of Management.
Beach, Derek, and Pedersen, Rasmus Brun. 2013. Process-tracing Methods: Foundations and Guidelines. Ann Arbor: University of Michigan Press.
Beach, Derek, and Pedersen, Rasmus Brun. 2016. “Selecting Appropriate Cases When Tracing Causal Mechanisms.” Sociological Methods & Research, 47(4), 837–871.
Birch, Kean, and Bronson, Kelly. 2022. “Big Tech.” Science as Culture, 31(1), 1–14.
Blogpost Google. 2020. “A More Responsible, Innovative and Helpful Internet in Europe.”
Blyth, Mark. 2002. “A Theory of Institutional Change.” In Blyth, M. (Ed.), Great Transformations: Economic Ideas and Institutional Change in the Twentieth Century (pp. 17–46). Cambridge University Press.
Bradford, Anu. 2020. The Brussels Effect: How the European Union Rules the World. New York: Oxford University Press.
Bradford, Anu. 2023. Digital Empires: The Global Battle to Regulate Technology. New York, NY: Oxford University Press.
Brunner, Lisl. 2016. “The Liability of an Online Intermediary for Third Party Content. The Watchdog Becomes the Monitor: Intermediary Liability after Delfi v Estonia.” Human Rights Law Review, 16(1), 163–174.
Buiten, Miriam. 2022. “The Digital Services Act: From Intermediary Liability to Platform Regulation.” JIPITEC, 12, 361–380.
Buri, Ilaria, and van Hoboken, Joris. 2021. “The DSA Proposal’s Impact on Digital Dominance.” In Heiko, R. (Ed.), Max Planck Institute for Innovation & Competition Research Paper, 10–15.
Busemeyer, Marius R., and Thelen, Kathleen. 2020. “Institutional Sources of Business Power.” World Politics, 72(3), 448–480.
Business Post. 2021. “Tech Companies were Granted Confidentiality before Meeting Minister.”
Campbell, John L. 1998. “Institutional Analysis and the Role of Ideas in Political Economy.” Theory and Society, 27(3), 377–409.
Carpenter, Daniel, and Moss, David A. 2013. Preventing Regulatory Capture: Special Interest Influence and How to Limit It. Cambridge: Cambridge University Press.
Carstensen, Martin B., and Schmidt, Vivien A. 2016. “Power through, Over and in Ideas: Conceptualizing Ideational Power in Discursive Institutionalism.” Journal of European Public Policy, 23(3), 318–337.
Caufmann, Caroline, and Goanta, Catalina. 2021. “A New Order: The Digital Services Act and Consumer Protection.” European Journal of Risk Regulation, 12(4), 758–774.
Cohen, Julie E. 2016. “The Regulatory State in the Information Age.” Theoretical Inquiries in Law, 17(2), 1–36.
Corporate Europe Observatory. 2022. “How Corporate Lobbying Undermined the EU’s Push to Ban Surveillance Ads.”
Corporate Observatory Report. 2021. “The Lobby Network: Big Tech’s Web of Influence in the EU.” Corporate Europe Observatory.
Cremer, David de, Narayanan, Devesh, Deppeler, Andreas, Nagpal, Mahak, and McGuire, Jack. 2022. “The Road to a Human-centred Digital Society: Opportunities, Challenges and Responsibilities for Humans in the Age of Machines.” AI and Ethics, 2(4), 579–583.
Culpepper, Pepper D. 2010. Quiet Politics and Business Power: Corporate Control in Europe and Japan. Cambridge: Cambridge University Press.
Culpepper, Pepper D., and Thelen, Kathleen. 2020. “Are We All Amazon Primed? Consumers and the Politics of Platform Power.” Comparative Political Studies, 53(2), 288–318.
Dot.Europe. 2024. “Members.” Accessed 13 February 2024.
Doyle, Chris. 1997. “Self Regulation and Statutory Regulation.” Business Strategy Review, 8, 35–42.
Duivenvoorde, Bram, and Goanta, Catalina. 2023. “The Regulation of Digital Advertising under the DSA: A Critical Assessment.” Computer Law & Security Review, 51, 1–14.
Dür, Andreas, Bernhagen, Patrick, and Marshall, David. 2015. “Interest Group Success in the European Union.” Comparative Political Studies, 48(8), 951–983.
Eder, Niklas. 2023. “Making Systemic Risk Assessments Work: How the DSA Creates a Virtuous Loop to Address the Societal Harms of Content Moderation.” German Law Journal, forthcoming.
EDRi. 2019. “European Digital Rights: Content Regulation – What’s the (Online) Harm?”
EUR-Lex Access to European Law. 2001. “Regulation (EC) No 1049/2001 of the European Parliament and of the Council of 30 May 2001 regarding public access to European Parliament, Council and Commission documents.”
EUR-Lex Access to European Law. n.d. b. “Procedure 2020/0361/COD.”
EUR-Lex Access to European Union Law. 2022. “Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).”
EUR-Lex Access to European Union Law. 2024. “Digital Services Act – Supervisory Fees on Providers of Very Large Online Platforms and Search Engines.”
EUR-Lex Access to European Union Law. n.d. c. “Trilogue.”
European Commission. 2020. “Legislative Train Schedule European Parliament: Commission Work Programme 2020. A Union that strives for more.”
European Commission. 2024. “Internal Market, Industry, Entrepreneurship and SMEs.”
European Commission. n.d. “Shaping Europe’s Digital Future: The Digital Services Act Package.”
European Commission. n.d. a. “European Union Internet Forum (EUIF).”
Falkner, Gerda, Heidebrecht, Sebastian, Obendiek, Anke, and Seidl, Timo. 2024. “Digital Sovereignty – Rhetoric and Reality.” Journal of European Public Policy, 31(8), 2099–2120.
Farrand, Benjamin, and Carrapico, Helena. 2022. “Digital Sovereignty and Taking Back Control: From Regulatory Capitalism to Regulatory Mercantilism in EU Cybersecurity.” European Security, 31(3), 435–453.
Farrand, Benjamin. 2023a. “‘Is This a Hate Speech?’ The Difficulty in Combating Radicalisation in Coded Communications on Social Media Platforms.” European Journal on Criminal Policy and Research, 23, 477–493.
Farrand, Benjamin. 2023b. “Regulating Misleading Political Advertising on Online Platforms: An Example of Regulatory Mercantilism in Digital Policy.” Policy Studies, 1–20.
Farrand, Benjamin. 2023c. “The Ordoliberal Internet? Continuity and Change in the EU’s Approach to the Governance of Cyberspace.” European Law Open, 2(1), 106–127.
Financial Times. “Internal Google Document Reveals Campaign against EU Lawmakers.”
Fuchs, Doris, and Lederer, Markus M. L. 2007. “The Power of Business.” Business and Politics, 9(3), 1–17.
Gambrell, Dorothy. 2023. “Google’s Dominance Over the Internet, Visualized.”
Gan, Hui Zhen. 2017. “Corporations: The Regulated or the Regulators – The Role of IT Companies in Tackling Online Hate Speech in the EU.” Columbia Journal of European Law, 24(1), 111–155.
Genç-Gelgeç, Berrac. 2022. “Regulating Digital Platforms: Will the DSA Correct Its Predecessor’s Deficiencies?” Croatian Yearbook of European Law and Policy, 18, 25–60.
Gillespie, Tarleton. 2010. “The Politics of ‘Platforms’.” New Media & Society, 12(3), 347–364.
Glen, Carol M. 2017. Controlling Cyberspace: The Politics of Internet Governance and Regulation. Santa Barbara: Bloomsbury Publishing USA.
Gorwa, Robert, Lechowski, Grzegorz, and Schneiß, Daniel. 2024. “Platform Lobbying: Policy Influence Strategies and the EU’s Digital Services Act.” Internet Policy Review, 13(2), 1–26.
Hansen, Lene, and Nissenbaum, Helen. 2009. “Digital Disaster, Cyber Security, and the Copenhagen School.” International Studies Quarterly, 53(4), 1155–1175.
Heermann, Max. 2024. “Undermining Lobbying Coalitions: The Interest Group Politics of EU Copyright Reform.” Journal of European Public Policy, 31(8), 2287–2315.
Heidebrecht, Sebastian. 2024. “From Market Liberalism to Public Intervention: Digital Sovereignty and Changing European Union Digital Single Market Governance.” Journal of Common Market Studies, 62(1), 205–223.
Heldt, Amélie. 2022. “EU Digital Services Act: The White Hope of Intermediary Regulation.” In Flew, T. & Martin, F. R. (Eds.), Digital Platform Regulation. Palgrave Global Media Policy and Business (pp. 69–84). Palgrave Macmillan.
Héritier, Adrienne, and Eckert, Sandra. 2008. “New Modes of Governance in the Shadow of Hierarchy: Self-Regulation by Industry in Europe.” Journal of Public Policy, 28(1), 113–138.
Hert, Paul de, and Papakonstantinou, Vagelis. 2016. “The New General Data Protection Regulation: Still a Sound System for the Protection of Individuals?” Computer Law & Security Review, 32(2), 179–194.
Hirsch, Dennis D. 2011. “The Law and Policy of Online Privacy: Regulation, Self-regulation, or Co-regulation.” Seattle University Law Review, 34(2), 439–480.
Impact Assessment. 2020. “Impact Assessment of the Digital Services Act.”
Internal Documents 14/7/2020 Document 1. 2020. “Report of the 14/07 call with Google.”
Internal Documents 16/11/2020 Document 1. 2020. “BTO Meeting 16 November 2020 on the Digital Service Package.”
Internal Documents 2/12/2021 Vestager Document 2. 2021. “Meeting between Cabinet Vestager and Google.”
Internal Documents 22/10/2020 Document 2. 2020. “Report of the 22/10/2020 call with Google.”
Internal Documents 22/9/2021 Document 1. 2021. “Report of Video Conference between A. Whelan and Facebook, 22 September 2021.”
Internal Documents 4/5/2020 Document 4. 2020. “BTO CAB meeting with Google, 4 May 2020.”
Internal Documents EDiMA. 2020. “Responsibility Online.”
Katzenbach, Christian. 2021. “‘AI will fix this’ – The Technical, Discursive, and Political Turn to AI in Governing Communication.” Big Data & Society, 8(2), 1–8.
Khan, Lina M. 2018. “Sources of Tech Platform Power.” Georgetown Law Technology Review, 2(10), 325–335.
Klüver, Heike. 2013. Lobbying in the European Union: Interest Groups, Lobbying Coalitions, and Policy Change. Oxford: Oxford University Press.
Kruck, Andreas, and Weiss, Moritz. 2023. “The Regulatory Security State in Europe.” Journal of European Public Policy, 30(7), 1205–1229.
Laux, Johann, Wachter, Sandra, and Mittelstadt, Brent. 2021. “Taming the Few: Platform Regulation, Independent Audits, and the Risks of Capture Created by the DMA and DSA.” Computer Law & Security Review, 43, 1–12.
Lynskey, Orla. 2017. “Regulating ‘Platform Power’.” LSE Legal Studies Working Paper No. 1/2017. Available at SSRN: https://ssrn.com/abstract=2921021 or http://dx.doi.org/10.2139/ssrn.2921021.
Mantelero, Alessandro. 2023. “Fundamental Rights Impact Assessment in the DSA.” In van Hoboken, Joris et al. (Eds.), Putting the DSA into Practice: Enforcement, Access to Justice and Global Implications (pp. 211–226). Verfassungsbooks.
Maroni, Marta. 2022. “The Liability of Internet Intermediaries and the European Court of Human Rights.” In Petkova, B. & Ojanen, T. (Eds.), Fundamental Rights Protection Online: The Future Regulation of Intermediaries (pp. 255–278). Edward Elgar Publishing Limited.
Marsden, Christopher T. 2011. “Internet Co-Regulation and Constitutionalism.” In Marsden, C. T. (Ed.), Internet Co-Regulation: European Law, Regulatory Governance and Legitimacy in Cyberspace (pp. 46–70). Cambridge University Press.
McGee, Patrick. 2022. “Meta and Alphabet Lose Dominance Over US Digital Ads Market.”
McLaughlin, S. 2013. “Regulation and Legislation.” In O’Neill, B., Staksrud, E., and McLaughlin, S. (Eds.), Towards a Better Internet for Children? Policy Pillars, Players and Paradoxes (pp. 77–91). Nordicom.
Moe, Terry M. 2019. The Politics of Institutional Reform: Katrina, Education, and the Second Face of Power. Cambridge: Cambridge University Press.
Morandini, Anna. 2023. “DSA Audits: Procedural Rules Leave Some Uncertainties.” DSA Observatory. https://dsa-observatory.eu/2023/11/28/dsa-audits-procedural-rules-leave-some-uncertainties/.
Mozur, Paul, and McCabe, David. 2021. “A Global Tipping Point for Reining In Tech Has Arrived.” The New York Times. https://www.nytimes.com/2021/04/20/technology/global-tipping-point-tech.html.
Nachtwey, Oliver, and Seidl, Timo. 2023. “The Solutionist Ethic and the Spirit of Digital Capitalism.” Theory, Culture & Society, Article 02632764231196829. Advance online publication.
Netzpolitik.org. 2019. “Leaked Document: EU Commission Mulls New Law to Regulate Online Platforms.”
Newman, Abraham L., and Bach, David. 2004. “Self-Regulatory Trajectories in the Shadow of Public Power: Resolving Digital Dilemmas in Europe and the United States.” Governance, 17(3), 387–413.
Obendiek, Anke S., and Seidl, Timo. 2023. “The (False) Promise of Solutionism: Ideational Business Power and the Construction of Epistemic Authority in Digital Security Governance.” Journal of European Public Policy.
Oversight Board. n.d. “Ensuring Respect for Free Expression, Through Independent Judgement.”
Podstawa, Karolina. 2019. “Hybrid Governance or… Nothing? The EU Code of Conduct on Combatting Illegal Hate Speech Online.” In Carpanelli, E. & Lazzerini, N. (Eds.), Use and Misuse of New Technologies (pp. 167–184). Springer.
Rahman, K. Sabeel. 2018. “The New Utilities: Private Power, Social Infrastructure, and the Revival of the Public Utility Concept.” Cardozo Law Review, 59(5), 1621–1692.
Response Google. 2020. “European Commission – Have Your Say.”
Response Meta. 2020. “European Commission – Have Your Say.”
Rommerskirchen, Charlotte, and van der Heide, Arjen W. 2023. “The Quiet Side of Debt: Public Debt Management in Advanced Economies.” Socio-Economic Review, 21(2), 1151–1170.
Scharpf, Fritz W. 1997. Games Real Actors Play: Actor-centered Institutionalism in Policy Research. Boulder: Westview Press.
Selling, Niels. 2021. “The Long Shadow of Lobbying: Ideational Power of Lobbying as Illustrated by Welfare Profits in Sweden.” Interest Groups & Advocacy, 10, 47–67.
Singer, P. W., and Brooking, Emerson T. 2018. LikeWar: The Weaponization of Social Media. Boston; New York: Houghton Mifflin Harcourt.
Slayton, Rebecca, and Clarke, Brian. 2020. “Trusting Infrastructure: The Emergence of Computer Security Incident Response, 1989–2005.” Technology and Culture, 61(1), 173–206.
Statista. 2024. “Google – Advertising Revenue 2023.” Accessed 13 February 2024.
Streeck, Wolfgang, and Thelen, Kathleen. 2009. “Institutional Change in Advanced Political Economies.” In Hancké, B. (Ed.), Debating Varieties of Capitalism: A Reader (pp. 95–134). Oxford University Press.
Submission Google. 2020. “Googles_submission_on_the_Digital_Services_Act_package_1.”
Submission Meta. “FINAL-FB-Response-to-DSA-Consultations.”
van der Heijden, Jeroen, and Jong, Jitske de. 2009. “Towards a Better Understanding of Building Regulation.” Environment and Planning B: Planning and Design, 36(6), 1038–1052.
van Dijck, José, Nieborg, David, and Poell, Thomas. 2019. “Reframing Platform Power.” Internet Policy Review, 8(2).
van Loo, Rory. 2020. “The New Gatekeepers: Private Firms as Public Enforcers.” Virginia Law Review, 106(2), 467–522.
Verbatim report of proceedings – Digital Services Act – Digital. 02.02.2020. “CRE 04/07/2022-15.”
Weiss, Moritz. 2019. “From Wealth to Power? The Failure of Layered Reforms in India’s Defense Sector.” Journal of Global Security Studies, 4(4), 560–578.
Weiss, Moritz. 2021. “Varieties of Privatization: Informal Networks, Trust and State Control of the Commanding Heights.” Review of International Political Economy, 28(3), 662–689.
Weiss, Moritz, and Biermann, Felix. 2022. “Networked Politics and the Supply of European Defence Integration.” Journal of European Public Policy, 29(6), 910–931.
Weiss, Moritz, and Heinkelmann-Wild, Tim. 2020. “Disarmed Principals: Institutional Resilience and the Non-enforcement of Delegation.” European Political Science Review, 12(4), 409–425.
West, Sarah Myers. 2019. “Data Capitalism: Redefining the Logics of Surveillance and Privacy.” Business & Society, 58(1), 20–41.
YouTube. 2020. “Aura Salla, Facebook: The Digital Services Act.” https://www.youtube.com/watch?v=qe6_5N5pygA.
Zingales, Nicolo. 2023. “The DSA as a Paradigm Shift for Online Intermediaries’ Due Diligence: Hail to Meta-Regulation.” In van Hoboken, Joris, et al. (Eds.), Putting the DSA into Practice: Enforcement, Access to Justice and Global Implications (pp. 211–226). Verfassungsbooks.
Figure 1. The regulator–intermediaries–target model and the European Union’s internet regulation.

Figure 2. The “platform power mechanism” at work in Digital Services Act legislation.