One of the key challenges of regulating internet platforms is international cooperation. This chapter offers some insights into platform responsibility reforms by relying on forty years of experience in regulating cross-border financial institutions. Internet platforms and cross-border banks have much in common from a regulatory perspective. They both operate in an interconnected global market that lacks a supranational regulatory framework. And they also tend to generate cross-border spillovers that are difficult to control. Harmful content and systemic risks – the two key regulatory challenges for platforms and banks, respectively – can be conceptualized as negative externalities.
One of the main lessons learned in regulating cross-border banks is that, under certain conditions, international regulatory cooperation is possible. We have witnessed that in the successful design and implementation of the Basel Accord – the global banking standard that regulates banks’ solvency and liquidity risks. In this chapter, I will analyze the conditions under which cooperation can ensue and what the history of the Basel Accord can teach platform responsibility reforms. In the last part, I will discuss what can be done when cooperation is more challenging.
Like information disseminated through online platforms, infectious diseases can cross international borders as they track the movement of people (and sometimes animals and goods) and spread globally. Hence, their control and management have major implications for international relations and international law. Drawing on this analogy, this chapter looks to global health governance to formulate suggestions for the governance of online platforms. Successes in global health governance suggest that the principle of tackling low-hanging fruit first to build trust and momentum towards more challenging goals may extend to online platform governance. Progress beyond the low-hanging fruit appears more challenging. First, disagreement on the issue of resource allocation in the online platform setting may lead to “outbreaks” of disinformation being relegated to regions of the world that are not at the top of online platforms’ market priorities. Second, while there may be wide consensus on the harms of infectious disease outbreaks, the harms from the spread of disinformation are more contested. Relying on national definitions of disinformation would hardly yield coherent international cooperation. Global health governance would thus suggest that an internationally negotiated agreement on standards relating to disinformation may be necessary.
This chapter examines China’s approach to platform responsibility for content moderation. It notes that China’s approach is rooted in its overarching goal of public opinion management, which requires platforms to proactively monitor, moderate, and sometimes censor content, especially politically sensitive content. Despite its patchy and iterative development, China’s platform regulation is consistent and marked by distinct characteristics, embodied in its definition of illegal and harmful content, its heavy platform obligations, and its strong reliance on administrative enforcement measures. China’s approach reflects its authoritarian nature and the asymmetrical power relations between the government and private platforms. This chapter also provides a nuanced understanding of China’s approach to platform responsibility, including Chinese platforms’ “conditional liability” for tort damages and the regulators’ growing emphasis on user protection and personal information privacy. The chapter includes a case study on TikTok that shows the interplay between the Chinese approach, overseas laws and regulations, and the Chinese online platform’s content moderation practices.
Platform governance and regulation have been salient political issues in Brazil for years, particularly as part of Congress’ response to democratic threats posed by former President Bolsonaro. The question became even more important after the January 8th attempted insurrection in Brasília, which many blame on social media. This includes the newly installed Lula administration. In a letter read at the February 2023 UNESCO “Internet for Trust” global conference, the President, now in his third (non-consecutive) term in office, wrote that the attack on the nation’s seats of power was “the culmination of a campaign, initiated much before, and that used, as ammunition, lies and disinformation,” which “was nurtured, organized, and disseminated through several digital platforms and messaging apps.” The new administration has made platform regulation a policy priority, with regulatory and administrative pushes across the board. Brazil has thus been a battleground where proposals for platform responsibility have been advanced – and disputed.
Global platforms present novel challenges. They serve as powerful conduits of commerce and global community. Yet their power to influence political and consumer behavior is enormous. Their responsibility for the use of this power – for their content – is statutorily limited by national laws such as Section 230 of the Communications Decency Act in the US. National efforts to demand and guide appropriate content moderation, and to avoid private abuse of this power, are in tension with the concern in liberal states to avoid excessive government regulation, especially of speech. Diverse and sometimes contradictory national rules responding to these tensions threaten to splinter platforms and reduce their utility to both wealthy and poor countries. This edited volume sets out to respond to the question of whether a global approach can be developed to address these tensions while maintaining or even enhancing the social contribution of platforms.
The world has muddled through with limited and ambiguous understandings of the scope of national jurisdiction in a number of private and public law areas. To reduce the barriers posed by legal difference in the field of platform responsibility, states may begin by reducing areas of overlapping application of law, agreeing on rules of exclusive jurisdiction. They may also agree on rules of national treatment, most-favored-nation treatment, and proportionality, or they may agree to harmonize rules. These incursions on national regulatory autonomy will require detailed, sector-specific negotiations, recognizing both the importance of global communications and the importance of national regulatory autonomy.
Global platforms present novel challenges. They are powerful conduits of commerce and global community, and their potential to influence behavior is enormous. Defeating Disinformation explores how to balance free speech and dangerous online content to reduce societal risks of digital platforms. The volume offers an interdisciplinary approach, drawing upon insights from different geographies and parallel challenges of managing global phenomena with national policies and regulations. Chapters also examine the responsibility of platforms for their content, which is limited by national laws such as Section 230 of the Communications Decency Act in the US. This balance between national rules and the need for appropriate content moderation threatens to splinter platforms and reduce their utility across the globe. Timely and expansive, Defeating Disinformation develops a global approach to address these tensions while maintaining, and even enhancing, the social contribution of platforms. This title is also available as open access on Cambridge Core.