
Dodging the autocratic bullet: enlisting behavioural science to arrest democratic backsliding

Published online by Cambridge University Press:  10 December 2024

Christoph M. Abels*
Affiliation:
Department of Psychology, University of Potsdam, Potsdam, Germany
Kiia Jasmin Alexandra Huttunen
Affiliation:
School of Psychological Science, University of Bristol, Bristol, United Kingdom
Ralph Hertwig
Affiliation:
Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany
Stephan Lewandowsky
Affiliation:
Department of Psychology, University of Potsdam, Potsdam, Germany School of Psychological Science, University of Bristol, Bristol, United Kingdom
*
Corresponding author: Christoph M. Abels; Email: christoph.maximilian.abels@uni-potsdam.de

Abstract

Despite a long history of research on democratic backsliding, the process itself − in which the executive branch amasses power and undermines democratic processes and institutions − remains poorly understood. We seek to shed light on the underlying mechanisms by studying democratic near misses: cases in which a period of autocratic governance is quickly reversed or full backsliding is prevented at the last minute. Building on the literature on near misses in sociotechnical systems such as nuclear power plants, we adapt the drift-to-danger model to the study of democratic systems. Two key findings emerge: First, democratic backsliding is often triggered by political elites pushing the boundaries of their power by violating norms, which are crucial yet vulnerable safeguards for democracy. Second, democratic backsliding is unpredictable and non-linear, being driven by the interaction between political elites and the public. Norm-violating elites may feel legitimized by a supportive public that sees norm violations as justified. At the same time, political elites may signal that norm-violating behaviour is acceptable, potentially leading the public to adopt anti-democratic beliefs and behaviours. We identify risk factors that make norm violations more likely and outline behavioural sciences-based interventions to address these violations.

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press.

Introduction

When democracies fail, they rarely crash and burn in an instant. In most cases, their demise is slow. Failing democracies drift through a period of backsliding, in which the executive branch amasses power and undermines democratic processes and institutions. In some cases, a period of autocratic governance is quickly reversed or a full backsliding is prevented at the last minute. These ‘near misses’ (Ginsburg and Huq, Reference Ginsburg and Huq2018) are at the heart of our investigation. While near misses are a comparatively new concept in democracy studies, the field of human factors has long distinguished between accidents and near-accidents in sociotechnical systems such as oil rigs (Jones et al., Reference Jones, Kirchsteiger and Bjerke1999). We adapt the drift-to-danger model of sociotechnical accidents developed by Rasmussen (Reference Rasmussen1997) to conceptualize democratic instability. This approach helps to understand how democratic systems can gradually erode, often in plain sight. It highlights that incremental deviations from a liberal democratic equilibrium follow a non-linear dynamic: Once a tipping point or threshold has been reached, reversing democratic backsliding becomes extremely difficult or impossible, and the transition to an authoritarian regime can be swift (e.g., Hitler’s establishment of a one-party dictatorship and a repressive police state within months of his appointment as Chancellor; Weber, Reference Weber2022).

In this article, we analyse several near misses to identify both enabling risk factors and protective interventions that transcend the particularities of each near-miss episode (Lührmann et al., Reference Lührmann, Marquardt and Mechkova2020). By synthesising insights and methods from the behavioural sciences that can be recruited to strengthen democratic systems and prevent backsliding, the article serves as a conceptual review with empirical aspects. First, we adapt the drift-to-danger model and apply it to democratic near misses, identifying elite norm violations as a key cause of backsliding in all cases. Second, we draw on experimental, survey and empirical data to examine the consequences of elite norm violations on public attitudes and behaviours. Third, we outline classes of behavioural science-based interventions suitable for addressing risk factors identified as facilitating or amplifying elite norm violations.

Democracy and its erosion

Discriminating between democratic and authoritarian regimes is becoming increasingly difficult, as most states hold elections (Lührmann et al., Reference Lührmann, Tannenberg and Lindberg2018) and have learned to mimic various other attributes of liberal democracies. Only a few regimes (e.g., Belarus, Iraq, North Korea and Russia) are openly authoritarian, relying on a repressive security apparatus and coercion to control their citizens. ‘In the modern era, authoritarian wolves rarely appear as wolves. They are now clad, at least in part, in sheep’s clothing’ (Varol, Reference Varol2015, p. 1677). To illustrate, the democratic system in Hungary − a European Union (EU) member state − has been seriously eroded by measures such as gerrymandering, hijacking of state institutions, constitutional changes that weaken democratic checks and balances, and control of the media and public discourse (Polyák, Reference Polyák, Połońska and Beckett2019; Szelényi, Reference Szelényi2022).

Following Lindberg et al. (Reference Lindberg, Coppedge, Gerring, Teorell, Pemstein, Tzelgov, Wang, Glynn, Altman, Bernhard, Fish, Hicken, Kroenig, McMann, Paxton, Reif, Skaaning and Staton2014), we conceptualize democracy in terms of five core components: electoral, liberal, participatory, deliberative and egalitarian. These components form the basis of the democracy scores assigned by the Varieties of Democracy (V-Dem) project (e.g., Lindberg et al., Reference Lindberg, Coppedge, Gerring, Teorell, Pemstein, Tzelgov, Wang, Glynn, Altman, Bernhard, Fish, Hicken, Kroenig, McMann, Paxton, Reif, Skaaning and Staton2014; Boese et al., Reference Boese, Edgell, Hellmeier, Maerz and Lindberg2021). The electoral component captures the idea that leaders’ responsiveness is achieved through a system of competition and accountability, ensured by regular free and fair elections. Liberal refers to the protection of individual and minority rights against a tyranny of the majority. Participatory means that citizens’ active political participation in all political processes is encouraged − for example, through engagement in political parties and civil society organizations and direct democracy. Deliberative means that decision-making should be based on respectful and reasonable dialogue in pursuit of the public good. Finally, democracy should be egalitarian and strive to distribute resources such as education and health equitably.
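For readers who want to trace these components empirically, the sketch below shows one way to read the corresponding high-level indices from the publicly available V-Dem Country-Year dataset. The file name is a placeholder, and the column names (e.g., v2x_libdem) are taken from our reading of the V-Dem codebook and should be verified against the release actually used; the code is an illustrative sketch, not part of the original analysis.

```python
# Illustrative sketch (not from the article): reading the five high-level
# V-Dem democracy indices for one country. The CSV file name is a placeholder
# and the column names are assumptions based on the V-Dem codebook; verify
# both against the dataset release you download.
import pandas as pd

VDEM_COMPONENTS = {
    "electoral": "v2x_polyarchy",      # electoral democracy index (assumed column name)
    "liberal": "v2x_libdem",           # liberal democracy index (assumed column name)
    "participatory": "v2x_partipdem",  # participatory democracy index (assumed column name)
    "deliberative": "v2x_delibdem",    # deliberative democracy index (assumed column name)
    "egalitarian": "v2x_egaldem",      # egalitarian democracy index (assumed column name)
}


def component_trajectory(csv_path: str, country: str) -> pd.DataFrame:
    """Return the yearly scores of the five democracy components for one country."""
    cols = ["country_name", "year", *VDEM_COMPONENTS.values()]
    df = pd.read_csv(csv_path, usecols=cols)
    subset = df[df["country_name"] == country].sort_values("year")
    # Rename the V-Dem variable names to the conceptual labels used in the text.
    return subset.rename(columns={v: k for k, v in VDEM_COMPONENTS.items()})


if __name__ == "__main__":
    # Hypothetical usage; "V-Dem-CY-Core.csv" is a placeholder path.
    print(component_trajectory("V-Dem-CY-Core.csv", "Hungary").tail())
```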

Frequently, countries fail to ensure several of these core components of democracy, leading to incomplete democracies, hybrid systems and autocratic types of governance. Complete breakdowns or reversals of democracy are not as common as they used to be (Boese et al., Reference Boese, Edgell, Hellmeier, Maerz and Lindberg2021). Most symptoms of reversal are subtle, and backsliding processes take time (Haggard and Kaufman, Reference Haggard and Kaufman2021). In some cases, a transient period of democratic backsliding is reversed. Instances in which democracies are exposed to social, political or economic forces that could catalyse backsliding, but manage to overcome these forces and avoid a full and lasting backslide to autocratic governance can be understood as near misses. A near miss is defined as a ‘case in which a country 1) experiences a deterioration in the quality of initially well-functioning democratic institutions, without fully sliding into authoritarianism, but then, 2) within a time frame of a few years, at least partially recovers its high-quality democracy’ (Ginsburg and Huq, Reference Ginsburg and Huq2018, p. 17).

Democracy is in a tough spot globally. At the time of writing in 2024, the world is almost evenly divided between democratic (91) and autocratic states (88), with 71% of people living in autocracies, up from 48% in 2013 (Nord et al., Reference Nord, Lundstedt, Altman, Angiolillo, Borella, Fernandes, Gastaldi, God, Natsika and Lindberg2024). Citizens in 60 countries, making up around 45% of the world’s population, are being asked to cast their votes in elections in 2024. The majority of these elections (52%) are being held in countries in which democracy is declining (Nord et al., Reference Nord, Lundstedt, Altman, Angiolillo, Borella, Fernandes, Gastaldi, God, Natsika and Lindberg2024). Although it may seem unlikely that established democracies will experience substantial backsliding, countries such as the US and UK have recently shown early signs of democratic erosion. In the US, in particular, the last few years have seen a steady deterioration of norms and practices crucial for maintaining democracy. Although the country has experienced tumultuous periods before (e.g., the Watergate scandal), four problematic developments now coincide for the first time: political polarization, conflicts over in-group membership, high levels of social and economic inequality and excessive use of executive power (Mettler and Liebermann, Reference Mettler and Liebermann2020).

The effects of political polarization are particularly salient, as cooperation between the two parties in US Congress has become increasingly difficult, with members of Congress willing to break with established norms (e.g., denying a sitting president the hearings required to fill a vacant Supreme Court seat; Kar and Mazzone, Reference Kar and Mazzone2016). During his presidency and even more so as the Republican candidate for the 2024 Presidential election, Donald Trump also repeatedly attacked the judiciary and the rule of law (Freedom House, 2019). Beyond that, elements of the Republican party challenged the legitimacy of the 2020 presidential election, with various attempts to overturn the result and keep Trump in office (Helderman, Reference Helderman2022), culminating in a violent insurrection on 6 January 2021 (Haslam et al., Reference Haslam, Reicher, Selvanathan, Gaffney, Steffens, Packer, Van Bavel, Ntontis, Neville, Vestergren, Jurstakova and Platow2023).

Political elites in the UK have also shown disregard for democratic norms. A case in point was the unlawful prorogation of parliament (i.e., ending of the parliamentary session) in September 2019. This move was largely seen as an attempt by then Prime Minister Boris Johnson to avoid parliamentary scrutiny of his government’s Brexit plans and to prevent parliament from thwarting a hard Brexit. Johnson argued that the goal was to give his government time to prepare for the next parliamentary session (Hadfield, Reference Hadfield2019). However, the Supreme Court decided that the decision was ‘unlawful because it had the effect of frustrating or preventing the ability of Parliament to carry out its constitutional functions without reasonable justification’.Footnote 1

Modelling democratic near misses

Although established democracies like the US and UK appear resilient on the surface, these recent developments give cause for concern. This is because the processes at the heart of backsliding − which starts gradually with slow incremental change before suddenly switching to change that is difficult to reverse − remain poorly understood (Bermeo, Reference Bermeo2016; Waldner and Lust, Reference Waldner and Lust2018; Wiesner et al., Reference Wiesner, Birdi, Eliassi-Rad, Farrell, Garcia, Lewandowsky, Palacios, Ross, Sornette and Thébault2019; Haggard and Kaufman, Reference Haggard and Kaufman2021; Wunsch and Blanchard, Reference Wunsch and Blanchard2022).

We therefore turn to the field of human factors, which has a long history of studying accidents in complex sociotechnical systems such as nuclear power plants or oil rigs, to provide a conceptual lens through which to study democratic near misses. The term ‘near miss’ is widely used here to distinguish accidents, which result in injury or damage, from incidents without such detrimental outcomes (Jones et al., Reference Jones, Kirchsteiger and Bjerke1999). A near miss thus refers to any event that could have caused substantial damage but was prevented ‘by only a hair’s breadth’ (Reason, Reference Reason2016, p. 118). To cite Jones et al. (Reference Jones, Kirchsteiger and Bjerke1999), a near miss is ‘an unintended incident which, under different circumstances, could have become an accident’ (p. 63). In addition to highlighting risk factors, analyses of near misses can draw attention to the safety layers that contribute to preventing an adverse event (Gnoni et al., Reference Gnoni, Tornese, Guglielmi, Pellicci, Campo and De Merich2022). Ginsburg and Huq (Reference Ginsburg and Huq2018) have also previously discussed near misses in the context of democracy, arguing that they can help to identify the economic, political and social conditions that ‘can repel a threat to participatory governance once such a threat has arisen’ (p. 17).

We argue that sociotechnical systems share similarities with democratic systems and can therefore help to understand the non-linear process of erosion underlying democratic backsliding. In particular, we identify the drift-to-danger model developed by Rasmussen (Reference Rasmussen1997) as a valuable framework to study democratic backsliding.

The drift-to-danger model

Rasmussen (Reference Rasmussen1997) argued that any system is shaped by objectives and constraints to which individuals must adhere in order for the system to work effectively. Nevertheless, various degrees of freedom remain. Individuals interpret this leeway and develop strategies to balance the effort they invest and the demands of the system. If the operating conditions change, these strategies will be modified. If, for example, a factory increases its employees’ workload without hiring more staff, employees might neglect safety protocols to meet the new demands. According to Rasmussen, this will likely result in ‘systematic migration toward the boundary of functionally acceptable performance’ (p. 189). If this boundary is irreversibly crossed, an error or an accident may occur. Where exactly the boundary lies is inherently difficult to identify; accidents are often the only source of information on its position (Cook and Rasmussen, Reference Cook and Rasmussen2005).

In most systems, boundary transgressions are anticipated and addressed by adding several safety layers − also known as defences-in-depth − to the system’s design (for an overview, see Marsden, Reference Marsden2022). These safety layers, which are ideally independent, guard against each other’s breakdown, absorb violations, and thus maintain system stability even when failures such as human error or a malfunctioning alert system occur (Reason, Reference Reason2016). Many accidents discussed in the human factors literature, such as the partial meltdown of a nuclear reactor at the Three Mile Island power plant, can be attributed to multiple failures in complex systems, in which both human operators and technology contributed to the accident (Perrow, Reference Perrow1984).

Rasmussen (Reference Rasmussen1997) argued that while these multiple safety layers initially help the system to maintain its operations, the absence of a feedback signal − that is, visible negative effects of a transgression − prevents the necessary changes in behaviour. As the safety layers wear down over time, the system becomes unable to manage the strain, and the gradual build-up of transgressions eventually results in accidents. Additionally, the number of safety layers increases the overall complexity of the system, which in turn increases the level of risk (Marsden, Reference Marsden2022). Rasmussen’s model thus describes a system whose gradual erosion is not directly visible, but becomes apparent only when the system breaks down under pressure. Rasmussen identified the absence of an overarching monitoring or coordination layer with sufficient understanding of the entire organization to identify deviations and respond accordingly as a major flaw of the defences-in-depth approach (Rasmussen, Reference Rasmussen1997).

It should be noted here that not all transgressions go unnoticed. Operators may choose to ignore them if deviating from rules and norms has become accepted practice in the organization. The explosion of the Challenger space shuttle in 1986 is a case in point (Perrow, Reference Perrow1996). NASA engineers had repeatedly deviated from their goal of zero failures prior to the explosion, as previous deviations had not resulted in an accident. Crucially, damage to critical components (i.e., the O-ring seals in the booster rockets) had been discovered in tests and flights preceding the accident (Rogers et al., Reference Rogers, Armstrong, Acheson, Covert, Feynman, Hotz, Kutyna, Ride, Rummel, Sutter, Walker, Wheelon and Yeager1986).

In summary, according to the drift-to-danger model, complex sociotechnical systems are designed to be fault tolerant. This makes them resistant to human or technical error. However, while safety layers can tolerate small faults, they fail catastrophically once the compounding of multiple small faults reaches a threshold. A near miss happens when those faults can be contained and reversed before the threshold is crossed. We think of democracy as a similar system of largely independent safety layers − checks and balances as well as legal and informal norms − designed to protect the system against disruptions. Rasmussen’s criticism about the absence of a coordination layer that monitors safety layers and identifies deviations from rules and procedures (Rasmussen, Reference Rasmussen1997) also applies to democratic systems. Although a substantial number of democratic institutions (e.g., courts, the media, parliament, government and civil society) implement a defences-in-depth approach, the lack of an overarching coordination layer creates systemic vulnerabilities. Furthermore, democracies are much more complex and dynamic than technical systems. Modelling studies show that the dynamic demands of political (e.g., voters, parties, politicians, lobbyists) and economic (e.g., budget constraints, inflation) factors introduce additional risks by pushing the system to operate at the limits of acceptable strain (Eliassi-Rad et al., Reference Eliassi-Rad, Farrell, Garcia, Lewandowsky, Palacios, Ross, Sornette, Thébault and Wiesner2020; Morrison and Wears, Reference Morrison and Wears2022; Wiesner et al., Reference Wiesner, Bien and Wilson2023). Like technical systems, democracies can absorb small deviations from the ideal operational practice. If violations are normalized, however, their effects can accumulate, eventually leading to non-linear and irreversible system changes.
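To make the non-linear dynamic described above concrete, the toy simulation below (our illustrative sketch, not a model taken from Rasmussen or from the near-miss literature) represents democracy as a stack of safety layers that erode under repeated norm violations and slowly recover during quiet periods. All parameter values are arbitrary assumptions; the only point is to show how long stretches of apparent stability can end either in a near miss or in a rapid cascade once the outer layers have failed.

```python
# Toy simulation of the drift-to-danger dynamic (illustrative only; all
# parameter values are arbitrary assumptions, not estimates from the article).
import random


def simulate_drift(n_layers=5, n_steps=300, base_p=0.15, drift=0.002,
                   erosion=0.25, repair=0.03, seed=None):
    """Simulate safety-layer integrity under repeated elite norm violations.

    Each layer starts fully intact (integrity 1.0). Violations hit the outermost
    layer that is still standing; a layer whose integrity reaches 0 has failed
    permanently. Risk factors make violations gradually more likely ('drift'),
    while countermeasures slowly repair the exposed layer during quiet steps.
    """
    rng = random.Random(seed)
    layers = [1.0] * n_layers
    for step in range(1, n_steps + 1):
        exposed = next((i for i, x in enumerate(layers) if x > 0.0), None)
        if exposed is None:                      # every safety layer has failed
            return {"outcome": "backsliding", "step": step}
        p_violation = min(1.0, base_p + drift * step)
        if rng.random() < p_violation:           # an elite norm violation occurs
            layers[exposed] = max(0.0, layers[exposed] - erosion)
        else:                                    # quiet period: partial recovery
            layers[exposed] = min(1.0, layers[exposed] + repair)
    intact = sum(x > 0.0 for x in layers)
    return {"outcome": "near miss", "intact_layers": intact}


if __name__ == "__main__":
    # Unchecked drift versus a run with stronger countermeasures (higher repair,
    # slower growth in violation probability).
    print(simulate_drift(seed=1))
    print(simulate_drift(repair=0.12, drift=0.0005, seed=1))
```

Under these assumptions, the first run typically ends in backsliding after a long quiet phase, whereas the second typically ends as a near miss; it is the qualitative pattern, not any particular number, that mirrors the drift-to-danger account.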

In the following, we analyse six cases in which such catastrophic failures have been successfully averted. We examine the lessons that can be drawn from considering these cases through the lens of the drift-to-danger model and the behavioural sciences generally.

Historical analysis of democratic near misses

Our review of the literatureFootnote 2 identified six cases of democratic near misses, presented in Table 1: Finland (1930), the UK (1930s), Spain (1981), Colombia (2010), Sri Lanka (2015) and South Korea (2017). The cases demonstrate that the erosion of democracy often begins with political elites pushing the boundaries of their power, and that − consistent with the drift-to-danger model − democratic erosion begins gradually (Rasmussen, Reference Rasmussen1997). Like frogs in a pan of slowly heating water, those who protect democracy often fail to see the risks to the system until it is almost too late. The sudden and unexpected collapse of democracy in Chile in 1973 serves as a clear illustration of this process. Consequently, Chile, while not being a near miss itself, is highlighted as a special case in Table 1.

Table 1. Selected historical cases of democratic near misses

Figure 1 illustrates the drift-to-danger model as applied to democratic backsliding. As political elites repeatedly violate norms, democracy slowly drifts towards autocracy. Public or behavioural interventions can reduce the prevalence or severity of such norm violations, thereby slowing the drift. The safety layers designed to slow or prevent democratic backsliding are also subject to protective and erosive forces. Risk factors such as misinformation, populism and polarization can undermine them; behavioural science interventions can strengthen them. The number of safety layers and the point at which a layer fails are difficult to predict. If at least one safety layer holds, full backsliding can be prevented, leading to a near miss. However, if all layers fail, the threshold to autocracy will be reached, endangering core democratic principles (e.g., freedom of speech, protection of minority rights). Thus, any norm violation is inherently problematic, as it remains unpredictable when and if a violation will push democracy over the edge.

Figure 1. Illustration of the drift-to-danger model applied to democratic backsliding. The solid black line represents a gradual drift toward autocracy. Elite norm violations are a principal driver of this drift and can be opposed by behavioural countermeasures. The threshold to autocracy (solid red horizontal line) is protected by a number of safety layers (thin red lines) that can be undermined by risk factors and strengthened by behavioural science interventions. If at least one safety layer holds, making it possible to reverse the drift, a near miss occurs.

Elite norm violations in near misses

The near misses presented in Table 1 reveal complex non-linear patterns of interacting factors; however, elite violations of democratic norms emerge as a core driver of democratic backsliding in all cases. These norm violations do not necessarily breach constitutional boundaries, indicating that constitutional and other legal provisions alone may be insufficient. Gaps and ambiguities in any constitution, no matter how well designed, leave room for interpretation and exploitation (Levitsky and Ziblatt, Reference Levitsky and Ziblatt2018). In a stable democracy, societal and political norms fill these gaps, ensuring the smooth operation of the system by governing elite and party behaviours and their interaction with the public. Political norms act as crucial, but often unspoken, safety layers that can arrest a drift to danger. However, they can become risk factors once eroded (e.g., if mainstream parties renege on the agreement not to form coalitions with extremist parties), changed (e.g., if violations become ‘normal’) or ignored (e.g., even if some political actors and institutions assume them to be still operative).

Given this critical role of elite norm violations in democratic near misses, we identify two areas in which behavioural science insights can safeguard democracy. The first involves direct interaction with political elites: Is the public willing to tolerate norm violations or does it punish such violations? Can politicians’ behaviour be shaped by pro-democratic interventions? The second involves the broader societal risk factors that may facilitate elite norm violations by eroding safety layers. Can those risk factors be mitigated by behavioural interventions? We examine both areas in turn.

Democratic norms, elite norm violations and behavioural science

Two particularly important democratic norms are mutual toleration and institutional forbearance (Levitsky and Ziblatt, Reference Levitsky and Ziblatt2018). The norm of mutual toleration states that each party accepts the other’s right to compete for power and govern, as long as they adhere to the democratic process. Rivals are not seen as existential threats and politicians are collectively willing to agree to disagree. However, the openness of democracies can be exploited by bad-faith actors such as extremist political organizations. Indeed, Hitler described the strategy behind the rise of National Socialism as using the democratic process to destroy democracy (Weber, Reference Weber2022).

Institutional forbearance means exercising restraint in situations where actions are legal but against the spirit of the constitution. In the US, there is a 200-year-old tradition whereby the sitting president nominates a Supreme Court replacement even in a presidential election year, symbolising cooperation between president and Senate (Kar and Mazzone, Reference Kar and Mazzone2016). The Republican-led Senate broke this norm when it refused to hold hearings for President Obama’s nominee.

Elite attacks against these norms pose a major threat to democratic stability. They can be especially damaging in segments of the public where support for democratic norms and emancipatory values is already low (Kromphardt and Salamone, Reference Kromphardt and Salamone2021).

First, elite norm violations can be amplified by the fact that norms are not static, but change over time, sometimes rapidly. Such changes can imperil democracy without any obvious breaches of rules or laws. For example, Bursztyn et al. found that the widespread social norm against overt expression of racism and xenophobia unravelled quickly after Donald Trump’s election. Study participants evidently interpreted his victory as a sign of widespread, hitherto hidden, anti-immigrant sentiment and became more willing to express such views (Bursztyn et al., Reference Bursztyn, Egorov and Fiorin2020). Thus, elite norm violations can systematically change the normative power of social norms, effectively giving ‘mainstream’ endorsement to behaviours previously considered unacceptable. Over time, such violations can undermine the existing norm, making the violation the new normal.

Second, elite norm violation can be enabled by a public that is unwilling to punish transgressions. For example, voters in Colombia did not oppose Uribe’s measures to expand presidential powers (Posada-Carbó, Reference Posada-Carbó2011). Similar developments can be witnessed in the US at present. Political polarization is such that behaviours previously considered unacceptable have become normalized (Mettler and Liebermann, Reference Mettler and Liebermann2020). During the 2016 US presidential campaign, empowered by a supportive base, Donald Trump invited a foreign adversary, Russia, to find and release emails from Hillary Clinton’s private server (Parker and Sanger, Reference Parker and Sanger2016). He later instructed Attorney General Jeff Sessions to shut down an investigation into his campaign’s ties to Russia. When Sessions refused, Trump fired him (Baker et al., Reference Baker, Benner and Shear2018). Republicans in Congress, with few exceptions, stood by Trump, even after his 2023 indictment for mishandling confidential documents and attempts to overturn the election. Two-thirds of Republican voters supported his renewed candidacy and were ready to vote for him regardless of whether he was convicted (Montanaro, Reference Montanaro2023).

Accepting the results of fair and free elections is, of course, a crucial norm in a democracy. Evidence suggests that rhetoric undermining this principle by claiming widespread electoral fraud in the US reinforced such beliefs (Clayton et al., Reference Clayton, Davis, Nyhan, Porter, Ryan and Wood2021). Additionally, emotional responses to norm violations by out-group elites (e.g., anger among Democrats over Republican actions or vice versa) tend to decrease over time, suggesting a desensitization effect of repeated norm violations (Clayton et al., Reference Clayton, Davis, Nyhan, Porter, Ryan and Wood2021).

Overall, research provides conflicting evidence about people’s willingness to punish norm-violating elites. On the one hand, only a small fraction of US citizens put democratic principles above their partisan identification when voting (Graham and Svolik, Reference Graham and Svolik2020). Thus, violations often go unpunished. Similarly, there is evidence that misinformation spread by politicians rarely affects individuals’ feelings towards them (Swire et al., Reference Swire, Berinsky, Lewandowsky and Ecker2017; Swire-Thompson et al., Reference Swire-Thompson, Ecker, Lewandowsky and Berinsky2020). Swire-Thompson et al. (Reference Swire-Thompson, Ecker, Lewandowsky and Berinsky2020) concluded that: ‘Liking a politician has the unfortunate side effect of blinding us to their falsehoods’ (p. 31). In fact, aggrieved groups can see politicians’ lies as a ‘symbolic challenge’ to an illegitimate establishment. Repeatedly spreading false information may also pay off for elites: Attempts to correct the falsehoods may become less effective due to people becoming habituated to the lies (T. Koch, Reference Koch2017) and political opponents becoming desensitized (Clayton et al., Reference Clayton, Davis, Nyhan, Porter, Ryan and Wood2021). Norm violations can also set examples of seemingly acceptable behaviour that partisans may then adopt (Bicchieri et al., Reference Bicchieri, Dimant, Gächter and Nosenzo2022).

On the other hand, there is some evidence that politicians are sensitive to the potential costs of having statements publicly corrected (e.g., by fact checkers). An experimental study found that the threat of reputational damage reduced the likelihood that US lawmakers would make inaccurate statements (Nyhan and Reifler, Reference Nyhan and Reifler2015). Specifically, Nyhan and Reifler randomly assigned state legislators to a treatment or control condition ahead of state elections. Legislators in the treatment condition received a letter reminding them that their public statements were subject to fact checking and that false statements carried a reputational cost. During the campaign, legislators in the treatment condition were found to be generally more accurate than their counterparts in the control condition. These findings could not be replicated in a more recent study (Ma et al., Reference Ma, Bergan, Ahn, Carnahan, Gimby, McGraw and Virtue2023), however, perhaps because of the numbing effects of the post-truth world ushered in with the election of Donald Trump (Lewandowsky et al., Reference Lewandowsky, Ecker and Cook2017).

Tsipursky and colleagues tested an intervention for politicians aimed at raising the benefits of committing to the truth and punishing the spread of misinformation. Politicians were invited to take a Pro-Truth Pledge consisting of three components (share, honour and encourage the truth), each designed to reduce misinformation sharing (Tsipursky et al., Reference Tsipursky, Votta and Mulick2018a, Reference Tsipursky, Votta and Roose2018b). It involved, for instance, sharing sources to allow others to verify information, defending others attacked for sharing factual information, and asking peers to stop using unreliable sources. The pledge seems to have had a beneficial effect, increasing signers’ sharing of truthful information on Facebook four weeks after taking it (Tsipursky et al., Reference Tsipursky, Votta and Roose2018b). These encouraging results are consistent with the finding that both voters and donors prefer candidates with pro-democratic positions (Carey et al., Reference Carey, Clayton, Helmke, Nyhan, Sanders and Stokes2022).

Risk factors enabling elite norm violation

Elite norm violations do not take place in a vacuum. They can be enabled or amplified by risk factors that erode democratic safety layers (Figure 1). Table 2 presents a selection of risk factors relevant from a behavioural sciences perspective that emerged from our analysis of near misses and the drift-to-danger model, categorized according to the five V-Dem core components of democracy. Before turning to the V-Dem core components, we consider a more domain-general insight about risk perception and behaviour.

Table 2. Factors that undermine democracy by the five V-Dem components

A neglected but potentially important risk factor is a lack of personal experience with the implications of autocratic rule. Personal experience with a risk − be it a macroeconomic shock such as the Great Depression (Malmendier and Nagel, Reference Malmendier and Nagel2011), a period of hyper-inflation (Malmendier and Nagel, Reference Malmendier and Nagel2016), a catastrophic natural hazard such as an earthquake (Wachinger et al., Reference Wachinger, Renn, Begg and Kuhlicke2013) or a global pandemic such as COVID-19 (Dryhurst et al., Reference Dryhurst, Schneider, Kerr, Freeman, Recchia, van der Bles, Spiegelhalter and van der Linden2020) − has been found to influence people’s perception of risk more generally. For instance, a recent analysis of more than 15,000 people in Germany found that those who had contracted coronavirus consistently rated the likelihood of infection higher than those without such experience. Media coverage also influenced risk judgements, but to a lesser extent (Schulte-Mecklenbeck et al., Reference Schulte-Mecklenbeck, Wagner and Hertwig2024). Similarly, in an international survey of 24 countries, personal experience of global warming predicted the willingness to endorse specific mitigation actions (Broomell et al., Reference Broomell, Budescu and Por2015).

People tend to learn about risks either from personal experience or from description (Hertwig and Wulff, Reference Hertwig and Wulff2022). Ample evidence from psychology and economics indicates that people’s propensity to take risks in the future depends on lessons taught by past experiences. For example, one of the probabilistic outcomes of unprotected sex is contracting a sexually transmitted infection (STI). When base rates of STIs are low, as is typically the case, not contracting an STI is the likelier outcome of having unprotected sex. At the current rates of disease in Europe, a person would need to have sex with at least 15 (randomly selected) people to reach a 50% probability of encountering a partner with syphilis, chlamydia or gonorrhea (in 2016; see Ciranka and Hertwig, Reference Ciranka and Hertwig2023). Therefore, most people (especially adolescents) who have unprotected sex do not contract STIs, with one likely consequence being that many do not learn to protect themselves.
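The arithmetic behind the ‘at least 15 randomly selected partners’ figure follows from the standard independence assumption: if each partner carries one of the three infections with combined probability p, the chance of encountering at least one infected partner among n is 1 − (1 − p)^n. The snippet below back-calculates rather than cites the prevalence: the value of p is chosen so that the published figure of 15 is reproduced, and it is an illustrative assumption, not the 2016 European surveillance estimate used by Ciranka and Hertwig (Reference Ciranka and Hertwig2023).

```python
# Back-of-the-envelope check of the "at least 15 partners for a 50% chance"
# figure. The combined prevalence below is an illustrative assumption chosen
# to reproduce that figure; it is not the 2016 European surveillance estimate.
import math


def partners_needed(prevalence: float, target: float = 0.5) -> int:
    """Smallest n such that P(at least one infected partner) = 1 - (1 - p)**n >= target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - prevalence))


if __name__ == "__main__":
    assumed_prevalence = 0.047  # roughly 4.7% combined prevalence (assumption)
    n = partners_needed(assumed_prevalence)
    print(f"With p = {assumed_prevalence:.1%}, one needs about {n} partners "
          f"for a 50% chance of at least one infected partner.")
```

The same logic illustrates the broader point of this paragraph: when p is small, long runs of ‘safe’ experiences are the statistically expected outcome, which is precisely why experience alone is a poor teacher about rare risks.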

In contrast, people who do experience rare events with negative outcomes such as contracting an STI are more risk averse. This ‘hot-stove effect’ (Denrell, Reference Denrell2007) gives rise to a powerful behavioural bias that prevents them from repeating the behaviour associated with the adverse outcome − a cat that has sat on a hot stove lid once is unlikely to do so again.

These behavioural regularities have implications for the efficacy of warnings about risks in general and democratic decline in particular. Democracies may warn their citizens about the potential consequences of behaviours such as elite norm violations, but these warnings compete with the everyday experience of a still-functioning democracy (Hertwig and Wulff, Reference Hertwig and Wulff2022) and may thus go unheeded. Indeed, safe experiences can undermine the effectiveness of warnings in various domains (see Barron et al., Reference Barron, Leider and Stack2008). This dynamic may also help explain why early warnings about the risks of climate change were relatively ineffective (see Hertwig and Wulff, Reference Hertwig and Wulff2022; E. U. Weber, Reference Weber2006; E. U. Weber and Stern, Reference Weber and Stern2011).

This is especially problematic when the probability of a catastrophic event is low but increases over time. Hertwig and Wulff (Reference Hertwig and Wulff2022) used the example of Mount Vesuvius to illustrate this dynamic − the volcano described as ‘Europe’s ticking time bomb’ (Barnes, Reference Barnes2011, p. 140). Around 600,000 people live in the Red Zone that would be at highest risk in the event of an eruption. Yet neither expert warnings (e.g., Mastrolorenzo et al., Reference Mastrolorenzo, Petrone, Pappalardo and Sheridan2006) nor financial incentives (e.g., Barberi et al., Reference Barberi, Davis, Isaia, Nave and Ricci2008) have persuaded them to leave the danger zone. Hertwig and Wulff (Reference Hertwig and Wulff2022) argued that this can be attributed to the residents’ long-lasting ‘all-clear experience’ (p. 641): The last violent eruption occurred in 1944. People who have never experienced an eruption behave as if they underweight the probability of one occurring.

Experience is a powerful teacher of risks, causing people to both overweight risk (once experienced) and underweight it (after a sequence of safe experiences). Simulations − e.g., of earthquakes, investment risks and old age − can provide tangible demonstrations of the impact of potential risks without exposing individuals to actual harm (Hertwig and Wulff, Reference Hertwig and Wulff2022). Available simulations like the Swiss Seismological Service’s Earthquake GamesFootnote 3 or role-playing simulations on transitions to democracy (Jiménez, Reference Jiménez2015) or civil–military relations during mass uprisings (Harkness and DeVore, Reference Harkness and DeVore2021) can provide the blueprint for interventions that simulate life and risks in an autocracy. Citizens who ‘experience’ the risk of democratic decline may be better calibrated to address its threats and prospective losses. Interventions could take the form of online games, virtual reality simulations or interactive museum exhibits. For example, the House of Terror in BudapestFootnote 4 and the Museum of Occupations and Freedom Fights in VilniusFootnote 5 illustrate the brutal realities of living under an oppressive regime. There is increasing evidence that such simulations are more effective than description-based interventions. A study on COVID-19 vaccination found that people exposed to an interactive risk–ratio simulation were more likely to get vaccinated and tended to have a better understanding of the benefit-to-harm ratio (Wegwarth et al., Reference Wegwarth, Mansmann, Zepp, Lühmann, Hertwig and Scherer2023) than people presented with the same information in a conventional text-based format. Because it is difficult to target elites directly with interventions, using insights from behavioural science to make citizens more sensitive to the risks of democratic backsliding seems a promising approach to bolstering democratic resilience.

The liberal, participatory and deliberative components of the V-Dem taxonomy appear especially vulnerable to erosion because − relative to the electoral and egalitarian components − they rely more on norm commitment than on legislation or regulation. We next show how the behavioural sciences can inform measures to counter the risks identified in Table 2, thus reinforcing safety layers against norm violations by elites. The allocation of risks to the V-Dem taxonomy is not clear cut, as some risks such as misinformation can affect more than one aspect of democracy, for instance the liberal and participatory components. Yet, for the sake of analytical clarity, we categorize each risk under the aspect of democracy it most directly impacts. This approach allows for a more precise identification of how specific threats undermine democratic functions and facilitates targeted responses.

For example, misinformation primarily threatens the participatory aspect of democracy by distorting public opinion and voter behaviour, which undermines the legitimacy of electoral processes and the responsiveness of political representatives. While it also has implications for liberal democracy by potentially eroding trust in institutions and the rule of law, its most direct and immediate impact is on the quality and inclusiveness of public participation.

Right-wing extremism under the banner of populism

Addressing right-wing extremism cloaked in populism from a behavioural perspective is easier said than done: Its proponents appeal to emotion − in particular anger and outrage (Gerbaudo et al., Reference Gerbaudo, De Falco, Giorgi, Keeling, Murolo and Nunziata2023) − and use rhetorical strategies that are difficult to counter, while sidestepping policy debate. According to Kayam (Reference Kayam and Akande2023, p. 277), three of Trump’s main strategies are ‘make it simple, make it negative, and make it “Twitty”’, which means using ad populum and ad hominem appeals. This makes conventional argumentation difficult, if not impossible. Less conventional approaches include using satire to shine a light on the shortcomings of right-wing populist policies (e.g., nationalist solutions to global problems). Using a large-scale survey methodology, Boukes and Hameleers (Reference Boukes and Hameleers2020) examined how the Dutch satirical show, Zondag met Lubach, influenced people’s willingness to vote for a populist party after the show targeted its lack of identifiable policy positions. The results revealed that the show reduced support for both the party and its leader, and that the decline was particularly strong among citizens inclined to vote for populist parties. Satire has been shown to work in other contexts as well, such as fact checking (Boukes and Hameleers, Reference Boukes and Hameleers2023): Satirical corrections reduce belief in misinformation, but also lead to greater polarization than plain fact-based corrections (Boukes and Hameleers, Reference Boukes and Hameleers2023). Humour and satire thus constitute an effective tool to counter populism; however, they should not be deployed without great care.

According to Rovira Kaltwasser (Reference Rovira Kaltwasser, Kaltwasser, Taggart, Espejo and Ostiguy2017), fighting right-wing populism by depicting its proponents as villains and its opponents as heroes is ineffective. Such framing fosters polarization and can create a populism vs anti-populism divide that may unwittingly align precisely with the division that right-wing extremists seek to create. Instead, reminding people of the value of deliberation and group norms may help close the social divide (e.g., Cialdini and Goldstein, Reference Cialdini and Goldstein2004; Mansbridge and Macedo, Reference Mansbridge and Macedo2019; Kendall-Taylor and Nietsche, Reference Kendall-Taylor and Nietsche2020; Pantazi et al., Reference Pantazi, Papaioannou and van Prooijen2022).

Another issue of concern for the behavioural sciences is the post-truth communication frequently employed by right-wing populists. Post-truth phenomena such as misinformation and conspiracy theories exploit existing societal chasms as well as individual beliefs about the government and political elites (Waisbord, Reference Waisbord2018; Uscinski et al., Reference Uscinski, Enders, Diekman, Funchion, Klofstad, Kuebler, Murthi, Premaratne, Seelig, Verdear and Wuchty2022). We discuss interventions to counter mis- and disinformation in the next section.

Once right-wing populists are in government, interventions become even more difficult and can backfire, as discussed by Schlipphak and Treib (Reference Schlipphak and Treib2017) using the cases of Austria in the early 2000s and Hungary under Viktor Orbán. In both cases, EU interventions (e.g., sanctions) did not reduce public support for the government; on the contrary, support increased over time. Schlipphak and Treib (Reference Schlipphak and Treib2017) argued that this effect can be attributed to successful blame deflection. The politicians framed the EU’s actions as an illegitimate intervention from ‘outside’, creating an ‘us’ vs ‘them’ juxtaposition, and thus de-legitimizing the interventions. A better approach would be for the EU to build a coalition with domestic actors, intervening only when oppressed domestic groups ask for help. Instead of targeting an entire country, sanctions should focus on actual offenders, such as political elites and high-ranking officials. Institutionally, an independent supervisory body could be established to conduct ‘open, independent and impartial’ (Schlipphak and Treib, Reference Schlipphak and Treib2017, p. 362) assessments of the state of democracy. We propose that behavioural scientists could support this body by developing guidelines and designing evidence-based measures for cases in which legal interventions are insufficient − for example, when norms are threatened.

Misinformation and conspiracy theories

The proliferation of false or misleading information and conspiracy theories through media and digital platforms − especially social media − is a global phenomenon that can have detrimental effects on public welfare (e.g., health), responses to global crises (e.g., pandemics, climate change) and the stability of democracies (Lewandowsky, Smillie, et al., Reference Lewandowsky, Smillie, Garcia, Hertwig, Weatherall, Egidy, Robertson, O’Connor, Kozyreva, Lorenz-Spreen, Blaschke and Leiser2020b; Lorenz-Spreen et al., Reference Lorenz-Spreen, Oswald, Lewandowsky and Hertwig2022). It is influenced by media conglomerates and online platforms, but also by individual and collective behaviours (Lazer et al., Reference Lazer, Baum, Benkler, Berinsky, Greenhill, Metzger, Nyhan, Pennycook, Rothschild, Sunstein, Thorson, Watts and Zittrain2018; Lewandowsky et al., Reference Lewandowsky, Ecker and Cook2017). A range of behavioural sciences-based interventions have been proposed to target behaviours in the digital world. Recent reports by Ecker et al. (Reference Ecker, Lewandowsky, Cook, Schmid, Fazio, Brashier, Kendeou, Vraga and Amazeen2022), Kozyreva et al. (Reference Kozyreva, Lorenz-Spreen, Herzog, Ecker, Lewandowsky, Hertwig, Ali, Bak-Coleman, Barzilai, Basol, Berinsky, Betsch, Cook, Fazio, Geers, Guess, Huang, Larreguy, Maertens and Wineburg2024) and van der Linden et al. (Reference van der Linden, Albarracín, Fazio, Freelon, Roozenbeek, Swire-Thompson and van Bavel2023) have examined the evidence for these interventions. They can be divided into individual-level interventions such as inoculation (which seeks to build people’s competence at discerning manipulative information), media-literacy tips, warnings and fact-check labels, debunking, and accuracy and social-norm nudges. ‘Friction’ can be introduced to slow information processing and encourage more careful analysis. Users can be taught to use lateral reading strategies, that is, to leave the initial source and open new tabs to search for more information about the person or organization behind a website or social media post and the claims made. There is considerable evidence that those techniques work even in the wild. For example, Roozenbeek et al. (Reference Roozenbeek, van der Linden, Goldberg, Rathje and Lewandowsky2022) showed that YouTube users benefited from brief information videos that boosted their ability to distinguish manipulative information from high-quality information.

Although helpful, such individual-focused interventions are insufficient to address the scale of the misinformation problem. Systemic interventions are also needed for online content (e.g., regulatory legislation of platforms), algorithms (e.g., automated tools for content moderation) and business models (e.g., supporting reliable news media).

Issues surrounding content moderation, including the balance between safeguarding freedom of expression and minimizing risks to public health, have polarized debate, particularly in the US (see the ongoing legal dispute about the First Amendment and its impact on social media companies; Zakrzewski, Reference Zakrzewski2023). In perhaps the first behavioural science study of people’s preferences around content moderation, Kozyreva et al. (Reference Kozyreva, Lorenz-Spreen, Hertwig, Lewandowsky and Herzog2021) found that, under specific circumstances, a majority of US respondents would remove misinformation-based social media posts on election denial, anti-vaccination, Holocaust denial and climate change. Respondents were more likely to remove posts that contained potentially dangerous misinformation or if the information had been circulated multiple times by the person. They were more reluctant to suspend accounts than to remove posts. In general, however, the US public does not categorically oppose content moderation of harmful content.

Importantly, the cognitive and behavioural sciences have already contributed to EU regulations (Kozyreva et al., Reference Kozyreva, Smillie and Lewandowsky2023) by, for instance, designing and testing interventions, informing the design of regulations and revealing and documenting people’s preferences (e.g., for content moderation).

Voter disenchantment

Voter apathy or lack of engagement with elections presents a problem in many liberal democracies. Communication campaigns are often seen as a relatively simple solution; however, the evidence for their effectiveness is limited (Haenschen, Reference Haenschen2023). Behavioural science can contribute by helping to build a culture where participation and active choice are valued. Even subtle changes in wording can be enough to increase people’s motivation to vote. For example, framing voting as a facet of personal identity rather than a behaviour − using phrases such as ‘being a voter’ rather than ‘voting’ − increases people’s likelihood of voting (Bryan et al., Reference Bryan, Walton, Rogers and Dweck2011).

Other interventions discussed to increase voter turnout include creating a pre-commitment device in the form of a registry for people who commit to vote, with small penalties being imposed for failing to vote (Pedersen et al., Reference Pedersen, Thaysen and Albertsen2023). In contexts with mandatory voting or voter registration (e.g., Australia, Belgium), highlighting the negative monetary consequences of non-voting can be an effective nudge (Kölle et al., Reference Kölle, Lane, Nosenzo and Starmer2017). However, these interventions are difficult to implement, often only suitable in certain contexts, and raise ethical questions about the validity of consent if citizens have to opt out from interventions such as the pre-commitment system.

Prompting people to consciously consider when, where and how they will cast their vote can also help to increase voter turnout (Gollwitzer, Reference Gollwitzer1999; Gollwitzer and Sheeran, Reference Gollwitzer and Sheeran2006). Nickerson and Rogers (Reference Nickerson and Rogers2010) reported a substantial increase in voter turnout of 9.1 percentage points in single-eligible-voter households. For households with two or more voters, however, the intervention reduced turnout by 1.5 percentage points, probably because these households would discuss voting and make a plan anyway. Anderson et al. (Reference Anderson, Loewen and McGregor2018) found that, in addition to making a voting plan, individuals benefit from relevant, clear information material about the election. Furthermore, the salience of norms that represent a group’s collective values influences voting intentions, regardless of how close or significant a group (e.g., friends, family) is to the individual (French Bourgeois and de la Sablonnière, Reference French Bourgeois and de la Sablonnière2023).

Polarization

Affective polarization is strongly correlated with democratic backsliding. Even within democracies, it can reduce accountability, freedom, deliberation and rights (Svolik, Reference Svolik2019; Orhan, Reference Orhan2022) and increase the chance of elite norm violations. Interventions to mitigate polarization can target information processing, beliefs or social relations. Information processing interventions seek to change individual reasoning patterns that guide the interpretation of information. Examples include addressing cognitive rigidity, which is associated with intergroup hostility and ideological extremism (Zmigrod, Reference Zmigrod2020), and intra-individual conflict, which can lead to paradoxical (Bar-Tal et al., Reference Bar-Tal, Hameiri, Halperin and Gawronski2021) or counterfactual thinking (Epstude and Roese, Reference Epstude and Roese2008).

Findings on the distorting effects of in- and out-group perceptions suggest that polarization can be mitigated by addressing specific beliefs (Mackie, Reference Mackie1986). In politics, polarization results in partisan animosity, defined as ‘negative thoughts, feelings or behaviours towards a political outgroup’ (Hartman et al., Reference Hartman, Blakey, Womick, Bail, Finkel, Han, Sarrouf, Schroeder, Sheeran, Van Bavel, Willer and Gray2022, p. 1194). In a large-scale study, Voelkel et al. (Reference Voelkel, Chu, Stagnaro, Mernyk, Redekopp, Pink, Druckman, Rand and Willer2023) tested 25 interventions designed to reduce polarization. Only six showed lasting results. These interventions took various approaches: highlighting that most Democrats and Republicans reject polarization; showing that positive social connection across party lines is possible despite political disagreement; making national identity salient; correcting misperceptions about outpartisans’ support for undemocratic actions and their tendency to dehumanize the other party; and creating sympathetic personal narratives.

Polarization interventions targeting social relations are informed by evidence on intergroup contact (Pettigrew, Reference Pettigrew1998; Pettigrew and Tropp, Reference Pettigrew and Tropp2006). They use contact with out-group members to humanize outpartisans and create a more realistic image of their thinking and behaviour. Interventions include improving people’s dialogue skills, enabling a constructive debate despite political differences and facilitating positive contact between partisans − for example, by highlighting what both groups have in common (Hartman et al., Reference Hartman, Blakey, Womick, Bail, Finkel, Han, Sarrouf, Schroeder, Sheeran, Van Bavel, Willer and Gray2022).

Recent work by Voelkel et al. (Reference Voelkel, Chu, Stagnaro, Mernyk, Redekopp, Pink, Druckman, Rand and Willer2023) and Broockman et al. (Reference Broockman, Kalla and Westwood2022) cautions that while depolarization interventions reliably reduce affective polarization, they do not appear to be successful in reducing anti-democratic attitudes, such as support for partisan violence. It appears that once a society becomes so divided that political identity overtakes social identity, members of the other political camp may be perceived as a threat to the nation, thus legitimizing all means possible to defend one’s own interests.

Limitations and expansion

We restricted our analysis to factors falling within the realm of the behavioural sciences (Table 2) that offer scope for countermeasures. However, numerous other systemic factors may also facilitate democratic backsliding, such as economic inequality (Siripurapu, Reference Siripurapu2022) and the design of the online information environment (Lewandowsky et al., Reference Lewandowsky, Smillie, Garcia, Hertwig, Weatherall, Egidy, Robertson, O’Connor, Kozyreva, Lorenz-Spreen, Blaschke and Leiser2020b). In the future, climate change may also impact the stability of democratic systems, as authoritarian policies to address the climate crisis become more likely in the most affected areas (Mittiga, Reference Mittiga2022). Although the importance of such systemic factors must not be underestimated, they do not negate the role of the behavioural sciences.

Conclusion: behavioural science against democratic backsliding

Near misses in sociotechnical systems are adverse events that could have caused damage to people and/or property but were prevented by means of safety layers (Jones et al., Reference Jones, Kirchsteiger and Bjerke1999). In democracies, near misses are understood as situations in which political systems either managed to withstand a drift towards autocracy (Figure 1) or briefly became autocratic before returning to democratic governance (Ginsburg and Huq, Reference Ginsburg and Huq2018). The present analysis of democratic near misses identified factors that have successfully prevented democratic decline in the past (as is common in the field of safety science; see Gnoni et al., Reference Gnoni, Tornese, Guglielmi, Pellicci, Campo and De Merich2022) and can therefore inform future interventions to increase democratic stability.

Inspired by Rasmussen (Reference Rasmussen1997), we adapted the drift-to-danger model to democratic near misses. Within this framework, backsliding is enabled by the confluence of various factors: Populism, misinformation and polarization collude in eroding the safety layers that would otherwise prevent political elites from violating the norms essential to keep a democracy functioning. Elite norm violations − and the public response to them − are at the heart of all historical near misses we analysed. Indeed, this is the first crucial insight from our analysis: Democratic backsliding is closely tied to elite norm violations, but the role of the public in condoning or opposing those violations is far more variable.

The second insight concerns the non-linearity of the drift underlying democratic backsliding. Some violations can be absorbed, but democracy's breaking point may at any moment be just one safety layer away. This non-linearity underscores the importance of protecting all democratic norms and calling out all violations, because the downstream effects cannot be predicted. The drift-to-danger model helps to understand how gradual declines in democracy can suddenly turn catastrophic and irreversible. While the model is not testable in itself, it suggests hypotheses − for example, that the failure of a single safeguard does not critically alter the overall trajectory of backsliding, or that the exact tipping point is difficult to predict.
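The model's core logic can also be conveyed with a minimal toy simulation (a purely illustrative sketch under assumed parameters, not a formal test of the drift-to-danger model): intact safety layers absorb repeated norm violations, occasional pushback restores them, and the qualitative outcome flips abruptly once the last layer fails.

```python
import random

def simulate_drift(n_layers=5, erosion_prob=0.15, repair_prob=0.05,
                   n_steps=100, seed=None):
    """Toy sketch of the drift-to-danger intuition (all parameters hypothetical).

    Each step, a norm violation may breach one intact safety layer, while
    public or institutional pushback may occasionally restore one. A run
    ends in 'autocracy' only if every layer fails; if at least one layer
    holds throughout, the episode counts as a 'near miss' (or 'stable' if
    no layer was ever breached).
    """
    rng = random.Random(seed)
    intact = n_layers
    fewest_intact = n_layers  # closest approach to the autocracy threshold
    for _ in range(n_steps):
        if intact > 0 and rng.random() < erosion_prob:
            intact -= 1          # an elite norm violation erodes a safety layer
        elif intact < n_layers and rng.random() < repair_prob:
            intact += 1          # resistance restores a previously breached layer
        fewest_intact = min(fewest_intact, intact)
        if intact == 0:
            return "autocracy", fewest_intact
    return ("near miss" if fewest_intact < n_layers else "stable"), fewest_intact

if __name__ == "__main__":
    outcomes = [simulate_drift(seed=s)[0] for s in range(1000)]
    for label in ("stable", "near miss", "autocracy"):
        print(label, outcomes.count(label))
```

Varying the assumed erosion and repair probabilities illustrates the threshold-like character of the dynamic: small parameter changes can shift the distribution of runs from mostly near misses to mostly autocratic transitions.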

Although our analysis of near misses was limited to past cases, we did briefly explore its implications for the current situation in the UK and US. The US system of checks and balances has served as a model for many other democracies. Yet its safety layers seem to be eroding in several areas. Society is highly polarized at all levels, from political leaders to citizens. Elite norm violations have become more frequent, culminating in Donald Trump disputing the legitimacy of the 2020 presidential election.

It is, however, important to emphasize that cases such as the January 6 insurrection in the US and the prorogation of parliament in the UK represent individual episodes within a sequence of events that can ultimately contribute to democratic decline. They do not, on their own, constitute a near miss (from the perspective of the drift-to-danger model, no single event produces a near miss). Yet while the outcomes of anti-democratic actions are unpredictable, examining the intentions of the actors involved can reveal their willingness to undermine democracy in the absence of checks and balances. The intentions of political elites matter, as Levitsky and Ziblatt (Reference Levitsky and Ziblatt2023) have shown for what they call semi-loyal democrats. Members of this group do not actively harm democracy but fail to defend it in times of polarization and crisis. By turning a blind eye to the autocratic acts of ideological allies, they enable anti-democratic extremists. History provides several examples of this mechanism. In 1934, violent rioters tried to occupy the French parliament, leading to the resignation of the centrist prime minister. Many conservative politicians did not condemn the insurrection; some even praised the rioters as ‘heroes and patriots’. This lack of condemnation allowed the insurrectionists’ ideas to enter mainstream conservative thought, including a preference for Hitler over the socialist prime minister − although French conservatives had historically been anti-German (Levitsky and Ziblatt, Reference Levitsky and Ziblatt2023). Therefore, even in the absence of clear and predictable tipping points, investigating established backsliding patterns, especially in political elites’ intentions and behaviour, can help to understand and prevent democratic breakdown. As Svolik et al. (Reference Svolik, Avramovska, Lutz and Milačić2023) put it: ‘To diagnose the vulnerabilities of contemporary democracies, we must therefore ask: When faced with a choice between democracy and partisan loyalty, policy priorities, or ideological dogmas, who will put democracy first?’ (p. 6).

Our framework suggests various avenues for future research. One is to explore how to counteract complacency toward elite norm violations − for example, by testing whether interventions such as autocracy simulations can raise awareness of the risk of democratic decline. Another is to examine the effects of nostalgic feelings for autocratic regimes such as the German Democratic Republic, especially in times of political or economic crisis. In such situations, nostalgia could bias people in favour of the autocratic regime, making them more critical of democracy (see, for instance, Neundorf et al., Reference Neundorf, Gerschewski and Olar2020). Further research is needed to investigate how nostalgia affects democratic backsliding and to identify mitigating strategies.

To date, research into how the behavioural sciences can strengthen democracy and prevent backsliding is scarce. Druckman's (Reference Druckman2024) recent study of democratic backsliding from a psychological perspective also stresses the need to look beyond the structural factors frequently discussed in political science. Yet while Druckman (Reference Druckman2024) also identifies elites as important actors in the backsliding process − along with social movements, interest groups and campaign organizations − his framework addresses neither the process of backsliding itself nor potential interventions to stop it. Future research could therefore take a more process-oriented perspective, like the drift-to-danger model, and emphasize the role of non-elite actors in facilitating norm violations, a topic only briefly explored in this article.

Behavioural interventions are just one of many tools needed to stop democratic backsliding. Most of them are aimed at the public and seek to increase awareness of the risks of democratic decline and to boost resilience to manipulation and false information (see also Herzog and Hertwig, Reference Herzog and Hertwig2025). When it comes to elites, however, such interventions seem largely ineffective at influencing those willing to push the boundaries of acceptable behaviour. Yet, as documented by our historical analysis, some political elites do stand up for democracy. For example, the UK Supreme Court blocked Boris Johnson's attempt to prorogue parliament, and Georgia's Republican Secretary of State, Brad Raffensperger, resisted Donald Trump's pressure ‘to find 11,780 votes’ (Shear and Saul, Reference Shear and Saul2021). These examples highlight that elite resistance and pushback can interrupt democratic backsliding, at least for the moment.

Funding statement

SL, KH, and CA are supported by an Advanced Grant from the European Research Council (101020961 PRODEMINFO) to SL. SL and RH received support from the Volkswagen Foundation (large grant ‘Reclaiming individual autonomy and democratic discourse online: How to rebalance human and algorithmic decision making’) and from the European Union’s Horizon 2020 grant 101094752 (SoMe4Dem). SL also received funding from UK Research and Innovation through the EU Horizon replacement funding grant number 10049415 and was supported by a Fellowship from the Academy for International Affairs NRW during part of this project. The authors thank Susannah Goss very much for editing the paper.

Competing interests

The authors declare no competing interests.

Appendix

Cases were selected for this analysis through a literature search on Google Scholar conducted between 5 and 14 July 2023. Table A1 presents the number of papers returned for each search query.

Table A1. Results for Google Scholar search queries

Relevant articles published after this period were subsequently added. The present analysis was limited to cases that are strongly documented as near misses in the literature (e.g., Colombia, Sri Lanka) or that can historically be understood as near misses (e.g., the UK in the 1930s).

Although various violations of democratic norms in several Western countries have been reported in the media (e.g., the prorogation of parliament in the UK; the attempt by former President Donald Trump to stop the peaceful transfer of power on 6 January 2021), these cases can be understood as instances of democratic backsliding rather than near misses and are therefore not discussed in our historical analysis.
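For transparency, the searches can be re-run by entering the query strings from footnote 2 into Google Scholar. The short sketch below (using only Python's standard library; Google Scholar offers no official programmatic interface, so hit counts still need to be read off manually) simply reconstructs the corresponding search URLs.

```python
from urllib.parse import urlencode

# Query strings used in the literature search (see footnote 2).
QUERIES = [
    '"near misses" AND "democratic backsliding"',
    '"near misses" AND "backsliding"',
    '"democratic near misses"',
]

# Google Scholar has no official API; these URLs merely reproduce the manual
# searches whose result counts are reported in Table A1.
for query in QUERIES:
    print("https://scholar.google.com/scholar?" + urlencode({"q": query}))
```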

Footnotes

2 We searched Google Scholar for relevant journal articles using the keywords: ‘near misses’ AND ‘democratic backsliding’, ‘near misses’ AND ‘backsliding’, and ‘democratic near misses’. More details on case selection are presented in the Appendix.

References

Anderson, C. D., Loewen, P. J. and McGregor, R. M. (2018), Implementation intentions, information, and voter turnout: an experimental study, Political Psychology, 39(5): 10891103.CrossRefGoogle Scholar
Ardèvol‐Abreu, A., Gil de Zúñiga, H. and Gámez, E. (2020), The influence of conspiracy beliefs on conventional and unconventional forms of political participation: the mediating role of political efficacy, British Journal of Social Psychology, 59(2): 549569.CrossRefGoogle ScholarPubMed
Aytaç, S. E., Çarkoğlu, A. and Elçi, E. (2021), Partisanship, elite messages, and support for populism in power, European Political Science Review, 13(1): 2339.CrossRefGoogle Scholar
Baker, P., Benner, K. and Shear, M. D. (2018, November 7 ), Jeff Sessions is forced out as attorney general as Trump installs loyalist. The New York Times. https://www.nytimes.com/2018/11/07/us/politics/sessions-resigns.htmlGoogle Scholar
Barberi, F., Davis, M. S., Isaia, R., Nave, R. and Ricci, T. (2008), Volcanic risk perception in the Vesuvius population, Journal of Volcanology and Geothermal Research, 172(3–4): 244258.CrossRefGoogle Scholar
Barnes, K. (2011), Volcanology: Europe’s ticking time bomb, Nature, 473(7346): 140141.CrossRefGoogle ScholarPubMed
Barron, G., Leider, S. and Stack, J. (2008), The effect of safe experience on a warnings’ impact: sex, drugs, and rock-n-roll, Organizational Behavior and Human Decision Processes, 106(2): 125142.CrossRefGoogle Scholar
Bar-Tal, D., Hameiri, B. and Halperin, E. (2021), ‘Paradoxical thinking as a paradigm of attitude change in the context of intractable conflict’, in Gawronski, B. (ed), Advances in Experimental Social Psychology, volume 63, Cambridge, MA: Academic Press, 129187.Google Scholar
Bermeo, N. (2016), On democratic backsliding, Journal of Democracy, 27(1): 519.CrossRefGoogle Scholar
Bicchieri, C., Dimant, E., Gächter, S. and Nosenzo, D. (2022), Social proximity and the erosion of norm compliance, Games and Economic Behavior, 132: 5972.CrossRefGoogle Scholar
Boese, V. A., Edgell, A. B., Hellmeier, S., Maerz, S. F. and Lindberg, S. I. (2021), How democracies prevail: democratic resilience as a two-stage process, Democratization, 28(5): 885907.CrossRefGoogle Scholar
Boukes, M. and Hameleers, M. (2020), Shattering populists’ rhetoric with satire at elections times: the effect of humorously holding populists accountable for their lack of solutions, Journal of Communication, 70(4): 574597.CrossRefGoogle Scholar
Boukes, M. and Hameleers, M. (2023), Fighting lies with facts or humor: comparing the effectiveness of satirical and regular fact-checks in response to misinformation and disinformation, Communication Monographs, 90(1): 6991.CrossRefGoogle Scholar
Broockman, D. E., Kalla, J. L. and Westwood, S. J. (2022), Does Affective Polarization Undermine Democratic Norms or Accountability? Maybe Not, American Journal of Political Science, 67(3): 808828.CrossRefGoogle Scholar
Broomell, S. B., Budescu, D. V. and Por, H-H. (2015), Personal experience with climate change predicts intentions to act, Global Environmental Change, 32: 6773.CrossRefGoogle Scholar
Brown, É. (2018), Propaganda, misinformation, and the epistemic value of democracy, Critical Review, 30(3–4): 194218.CrossRefGoogle Scholar
Bryan, C. J., Walton, G. M., Rogers, T. and Dweck, C. S. (2011), Motivating voter turnout by invoking the self, Proceedings of the National Academy of Sciences of the United States of America, 108(31): 1265312656.CrossRefGoogle Scholar
Bursztyn, L., Egorov, G. and Fiorin, S. (2020), From extreme to mainstream: the erosion of social norms, American Economic Review, 110(11): 35223548.CrossRefGoogle Scholar
Carey, J., Clayton, K., Helmke, G., Nyhan, B., Sanders, M. and Stokes, S. (2022), Who will defend democracy? Evaluating tradeoffs in candidate support among partisan donors and voters, Journal of Elections, Public Opinion and Parties, 32(1): 230245.CrossRefGoogle Scholar
Cialdini, R. B. and Goldstein, N. J. (2004), Social influence: compliance and conformity, Annual Review of Psychology, 55(1): 591621.CrossRefGoogle ScholarPubMed
Ciranka, S. and Hertwig, R. (2023), Environmental statistics and experience shape risk-taking across adolescence, Trends in Cognitive Sciences, 27(12): 11231134.CrossRefGoogle Scholar
Clayton, K., Davis, N. T., Nyhan, B., Porter, E., Ryan, T. J. and Wood, T. J. (2021), Elite rhetoric can undermine democratic norms, Proceedings of the National Academy of Sciences of the United States of America, 118(23): 16.Google ScholarPubMed
Cook, R. and Rasmussen, J. (2005), “Going solid”: a model of system dynamics and consequences for patient safety, Quality and Safety in Health Care, 14(2): 130134.CrossRefGoogle Scholar
Cullen, S. M. (1993), Political violence: the case of the British Union of Fascists, Journal of Contemporary History, 28(2): 245267.CrossRefGoogle Scholar
Denrell, J. (2007), Adaptive learning and risk taking, Psychological Review, 114(1): 177187.CrossRefGoogle ScholarPubMed
Douglas, K. M. and Sutton, R. M. (2008), The hidden impact of conspiracy theories: perceived and actual influence of theories surrounding the death of Princess Diana, Journal of Social Psychology, 148(2): 210222.CrossRefGoogle ScholarPubMed
Druckman, J. N. (2024), How to study democratic backsliding, Political Psychology, 45(S1): 342.CrossRefGoogle Scholar
Dryhurst, S., Schneider, C. R., Kerr, J., Freeman, A. L. J., Recchia, G., van der Bles, A. M., Spiegelhalter, D. and van der Linden, S. (2020), Risk perceptions of COVID-19 around the world, Journal of Risk Research, 23(7–8): 9941006.CrossRefGoogle Scholar
Ecker, U. K. H., Lewandowsky, S., Cook, J., Schmid, P., Fazio, L. K., Brashier, N., Kendeou, P., Vraga, E. K. and Amazeen, M. A. (2022), The psychological drivers of misinformation belief and its resistance to correction, Nature Reviews Psychology, 1(1): 1329.CrossRefGoogle Scholar
Eliassi-Rad, T., Farrell, H., Garcia, D., Lewandowsky, S., Palacios, P., Ross, D., Sornette, D., Thébault, K. and Wiesner, K. (2020), What science can do for democracy: a complexity science approach, Humanities and Social Sciences Communications, 7(1): 811.CrossRefGoogle Scholar
Epstude, K. and Roese, N. J. (2008), The functional theory of counterfactual thinking, Personality and Social Psychology Review, 12(2): 168192.CrossRefGoogle ScholarPubMed
Ewing, K. and Gearty, C. A. (2001), The Rise and Fall of Fascism, Oxford: Oxford University PressGoogle Scholar
Freedom House. (2019), Freedom in the world 2019: democracy in retreat. In Freedom House. https://freedomhouse.org/sites/default/files/Feb2019_FH_FITW_2019_Report_ForWeb-compressed.pdf.Google Scholar
French Bourgeois, L. and de la Sablonnière, R. (2023), Realigning individual behavior with societal values: the role of planning in injunctive‐norm interventions aimed at increasing voter turnout, Analyses of Social Issues and Public Policy, 23(1): 155173.CrossRefGoogle Scholar
Frenken, M. and Imhoff, R. (2023), Don’t trust anybody: conspiracy mentality and the detection of facial trustworthiness cues, Applied Cognitive Psychology, 37(2): 256265.CrossRefGoogle Scholar
Gallego, A. (2010), Understanding unequal turnout: education and voting in comparative perspective, Electoral Studies, 29(2): 239248.CrossRefGoogle Scholar
Gerbaudo, P., De Falco, C. C., Giorgi, G., Keeling, S., Murolo, A. and Nunziata, F. (2023), Angry posts mobilize: emotional communication and online mobilization in the Facebook pages of western European right-wing populist leaders, Social Media + Society, 9(1).CrossRefGoogle Scholar
Ginsburg, T. and Huq, A. (2018), Democracy’s near misses, Journal of Democracy, 29(4): 1630.CrossRefGoogle Scholar
Gnoni, M. G., Tornese, F., Guglielmi, A., Pellicci, M., Campo, G. and De Merich, D. (2022), Near miss management systems in the industrial sector: a literature review, Safety Science, 150: .CrossRefGoogle Scholar
Goertzel, T. (1994), Belief in conspiracy theories, Political Psychology, 15(4): 731742.CrossRefGoogle Scholar
Goldberg, P. A. (1975), The politics of the Allende overthrow in Chile, Political Science Quarterly, 90(1): 93116. https://www.jstor.org/stable/2148700CrossRefGoogle Scholar
Gollwitzer, P. M. (1999), Implementation intentions: strong effects of simple plans, American Psychologist, 54(7): 493503.CrossRefGoogle Scholar
Gollwitzer, P. M. and Sheeran, P. (2006), Implementation Intentions and goal achievement: a meta-analysis of effects and processes, Advances in Experimental Social Psychology, 38(06): 69119.CrossRefGoogle Scholar
Graham, M. H. and Svolik, M. W. (2020), Democracy in America? Partisanship, polarization, and the robustness of support for democracy in the United States, American Political Science Review, 114(2): 392409.CrossRefGoogle Scholar
Hadfield, A. (2019, August 28 ), Boris Johnson suspends parliament: what does it mean for Brexit and why are MPs so angry? The Conversation. https://theconversation.com/boris-johnson-suspends-parliament-what-does-it-mean-for-brexit-and-why-are-mps-so-angry-122574Google Scholar
Haenschen, K. (2023), The conditional effects of microtargeted Facebook advertisements on voter turnout, Political Behavior, 45(4): 16611681.CrossRefGoogle Scholar
Haggard, S. and Kaufman, R. (2021), The anatomy of democratic backsliding, Journal of Democracy, 32(4): 2741.CrossRefGoogle Scholar
Harkness, K. A. and DeVore, M. R. (2021), Teaching the military and revolutions: simulating civil–military relations during mass uprisings, PS: Political Science and Politics, 54(2): 315320.Google Scholar
Hartman, R., Blakey, W., Womick, J., Bail, C., Finkel, E. J., Han, H., Sarrouf, J., Schroeder, J., Sheeran, P., Van Bavel, J. J., Willer, R. and Gray, K. (2022), Interventions to reduce partisan animosity, Nature Human Behaviour, 6(9): 11941205.CrossRefGoogle ScholarPubMed
Haslam, S. A., Reicher, S. D., Selvanathan, H. P., Gaffney, A. M., Steffens, N. K., Packer, D., Van Bavel, J. J., Ntontis, E., Neville, F., Vestergren, S., Jurstakova, K. and Platow, M. J. (2023), Examining the role of Donald Trump and his supporters in the 2021 assault on the U.S. Capitol: a dual-agency model of identity leadership and engaged followership, The Leadership Quarterly, 34(2): .CrossRefGoogle Scholar
Helderman, R. S. (2022, February 9 ), All the ways Trump tried to overturn the election—and how it could happen again. The Washington Post. https://www.washingtonpost.com/politics/interactive/2022/election-overturn-plans/Google Scholar
Herrera, M. and Morales, M. (2023), Public opinion, democracy, and the armed forces: Chile before the 1973 military coup, Social and Education History, 12(2): 160192.CrossRefGoogle Scholar
Hertwig, R. and Wulff, D. U. (2022), A description–experience framework of the psychology of risk, Perspectives on Psychological Science, 17(3): 631651.CrossRefGoogle ScholarPubMed
Herzog, S. M. and Hertwig, R. (2025). Boosting: Empowering citizens with behavioral science. Annual Review of Psychology, 76.Google Scholar
Imhoff, R., Dieterle, L. and Lamberty, P. (2021), Resolving the puzzle of conspiracy worldview and political activism: belief in secret plots decreases normative but increases nonnormative political engagement, Social Psychological and Personality Science, 12(1): 7179.CrossRefGoogle Scholar
Jiménez, L. F. (2015), The dictatorship game: simulating a transition to democracy, PS: Political Science and Politics, 48(02): 353357.Google Scholar
Jones, S., Kirchsteiger, C. and Bjerke, W. (1999), The importance of near miss reporting to further improve safety performance, Journal of Loss Prevention in the Process, 12(1): 5967.CrossRefGoogle Scholar
Kaltwasser, C. R. (2012), The ambivalence of populism: threat and corrective for democracy, Democratization, 19(2): 184208.CrossRefGoogle Scholar
Kar, R. B. and Mazzone, J. (2016), The Garland Affair: what history and the constitution really say about President Obama’s powers to appoint a replacement for Justice Scalia, New York University Law Review Online, 91: 53114. https://nyulawreview.org/online-features/the-garland-affair-what-history-and-the-constitution-really-say-about-president-obamas-powers-to-appoint-a-replacement-for-justice-scalia/ [2 November 2024].Google Scholar
Kayam, O. (2023), ‘Trump’s Rhetorical Way to Presidency’, in Akande, A. (ed), U.S. Democracy in Danger, Cham: Springer Nature Switzerland, 277–292.CrossRefGoogle Scholar
Kendall-Taylor, A. and Nietsche, C. (2020), Combating populism: a toolkit for liberal democratic actors. https://www.cnas.org/publications/reports/combating-populismGoogle Scholar
Koch, C. M., Meléndez, C. and Rovira Kaltwasser, C. (2023), Mainstream voters, non-voters and populist voters: what sets them apart?, Political Studies, 71(3): 893913.CrossRefGoogle Scholar
Koch, T. (2017), Again and again (and again): a repetition-frequency-model of persuasive communication, Studies in Communication and Media, 6(3): 218239.CrossRefGoogle Scholar
Kölle, F., Lane, T., Nosenzo, D. and Starmer, C. (2017), Nudging the electorate: what works and why? http://hdl.handle.net/10419/200439www.econstor.euGoogle Scholar
Kozyreva, A., Lorenz-Spreen, P., Hertwig, R., Lewandowsky, S. and Herzog, S. M. (2021), Public attitudes towards algorithmic personalization and use of personal data online: evidence from Germany, Great Britain, and the United States, Humanities and Social Sciences Communications, 8(1): .CrossRefGoogle Scholar
Kozyreva, A., Lorenz-Spreen, P., Herzog, S. M., Ecker, U. K. H., Lewandowsky, S., Hertwig, R., Ali, A., Bak-Coleman, J., Barzilai, S., Basol, M., Berinsky, A. J., Betsch, C., Cook, J., Fazio, L. K., Geers, M., Guess, A. M., Huang, H., Larreguy, H., Maertens, R. and Wineburg, S. (2024), Toolbox of individual-level interventions against online misinformation, Nature Human Behaviour, 8: 10441052.CrossRefGoogle ScholarPubMed
Kozyreva, A., Smillie, L. and Lewandowsky, S. (2023), Incorporating psychological science into policy making, European Psychologist, 28(3): 206224.CrossRefGoogle ScholarPubMed
Kromphardt, C. D. and Salamone, M. F. (2021), “Unpresidented!” or: what happens when the president attacks the federal judiciary on Twitter, Journal of Information Technology and Politics, 18(1): 84100.CrossRefGoogle Scholar
Laebens, M. G. and Lührmann, A. (2021), What halts democratic erosion? The changing role of accountability, Democratization, 28(5): 908928.CrossRefGoogle Scholar
Lazer, D., Baum, M., Benkler, J., Berinsky, A., Greenhill, K., Metzger, M., Nyhan, B., Pennycook, G., Rothschild, D., Sunstein, C., Thorson, E., Watts, D. and Zittrain, J. (2018), The science of fake news, Science, 359(6380): 10941096.CrossRefGoogle ScholarPubMed
Lee, S. and Jones-Jang, S. M. (2024), Cynical nonpartisans: the role of misinformation in political cynicism during the 2020 U.S. presidential election, New Media & Society, 26(7): 42554276.CrossRefGoogle Scholar
Levitsky, S. and Ziblatt, D. (2018), How Democracies Die: What History Reveals About Our Future, New York: Penguin Random HouseGoogle Scholar
Levitsky, S. and Ziblatt, D. (2023, September 8 ), Democracy’s Assassins Always Have Accomplices. The New York Times. https://www.nytimes.com/2023/09/08/opinion/trump-republicans-spain-brazil.htmlGoogle Scholar
Lewandowsky, S., Cook, J., Ecker, U. K. H., Albarracín, D., Amazeen, M. A., Kendeou, P., Lombardi, D., Newman, E. J., Pennycook, G., Porter, E., Rand, D. G., Rapp, D. N., Reifler, J., Roozenbeek, J., Schmid, P., Seifert, C. M., Sinatra, G. M., Swire-Thompson, B., van der Linden, S. and Zaragoza, M. S. (2020a), The Debunking Handbook 2020.Google Scholar
Lewandowsky, S., Ecker, U. K. H. and Cook, J. (2017), Beyond misinformation: understanding and coping with the “post-truth” era, Journal of Applied Research in Memory and Cognition, 6(4): 353369.CrossRefGoogle Scholar
Lewandowsky, S., Smillie, L., Garcia, D., Hertwig, R., Weatherall, J., Egidy, S., Robertson, R. E., O’Connor, C., Kozyreva, A., Lorenz-Spreen, P., Blaschke, Y. and Leiser, M. (2020b), Technology and democracy: understanding the influence of online technologies on political behaviour and decision-making, Luxembourg: Publications Office of the European Union. 10.2760/709177Google Scholar
Lindberg, S. I., Coppedge, M., Gerring, J., Teorell, J., Pemstein, D., Tzelgov, E., Wang, Y. T., Glynn, A., Altman, D., Bernhard, M., Fish, S., Hicken, A., Kroenig, M., McMann, K., Paxton, P., Reif, M., Skaaning, S. E. and Staton, J. (2014), V-Dem: a new way to measure democracy, Journal of Democracy, 25(3): 159169.CrossRefGoogle Scholar
Lorenz-Spreen, P., Oswald, L., Lewandowsky, S. and Hertwig, R. (2022), A systematic review of worldwide causal and correlational evidence on digital media and democracy, Nature Human Behaviour, 7(1): 74101.CrossRefGoogle ScholarPubMed
Lührmann, A., Marquardt, K. L. and Mechkova, V. (2020), Constraining governments: new indices of vertical, horizontal, and diagonal accountability, American Political Science Review, 114(3): 811820.CrossRefGoogle Scholar
Lührmann, A., Tannenberg, M. and Lindberg, S. I. (2018), Regimes of the world (RoW): opening new avenues for the comparative study of political regimes, Politics and Governance, 6(1): 6077.CrossRefGoogle Scholar
Ma, S., Bergan, D., Ahn, S., Carnahan, D., Gimby, N., McGraw, J. and Virtue, I. (2023), Fact-checking as a deterrent? A conceptual replication of the influence of fact-checking on the sharing of misinformation by political elites, Human Communication Research, 49(3): 321338.CrossRefGoogle Scholar
Mackie, D. M. (1986), Social identification effects in group polarization, Journal of Personality and Social Psychology, 50(4): 720728.CrossRefGoogle Scholar
Malmendier, U. and Nagel, S. (2011), Depression babies: do macroeconomic experiences affect risk taking?*, The Quarterly Journal of Economics, 126(1): 373416.CrossRefGoogle Scholar
Malmendier, U. and Nagel, S. (2016), Learning from inflation experiences, The Quarterly Journal of Economics, 131(1): 5387.CrossRefGoogle Scholar
Mansbridge, J. and Macedo, S. (2019), Populism and democratic theory, Annual Review of Law and Social Science, 15(1): 5977.CrossRefGoogle Scholar
Marsden, E. (2022), The Defence in Depth Principle: A Layered Approach to Safety Barriers, Risk Engineering. https://risk-engineering.org/concept/defence-in-depth [22 May 2024].Google Scholar
Mastrolorenzo, G., Petrone, P., Pappalardo, L. and Sheridan, M. F. (2006), The Avellino 3780-yr-B.P. catastrophe as a worst-case scenario for a future eruption at Vesuvius, Proceedings of the National Academy of Sciences, 103(12): 43664370.CrossRefGoogle ScholarPubMed
Maxwell, K. (1991), Spain’s transition to democracy: a model for Eastern Europe?, Proceedings of the Academy of Political Science, 38(1): 3549.CrossRefGoogle Scholar
Mettler, S. and Liebermann, R. C. (2020), Four Threats: The Recurring Crises of American Democracy, New York: St. Martin’s GriffinGoogle Scholar
Mittiga, R. (2022), Political legitimacy, authoritarianism, and climate change, American Political Science Review, 116(3): 9981011.CrossRefGoogle Scholar
Montanaro, D. (2023, April 5 ), Most Republicans would vote for Trump even if he’s convicted of a crime, poll finds. NPR. https://www.npr.org/2023/04/25/1171660997/poll-republicans-trump-president-convicted-crimeGoogle Scholar
Morrison, J. B. and Wears, R. L. (2022), Modeling Rasmussen’s dynamic modeling problem: drift towards a boundary of safety, Cognition, Technology and Work, 24(1): 127145.CrossRefGoogle Scholar
Mudde, C. (2017), ‘Populism: an ideational approach’, in Kaltwasser, C. R., Taggart, P., Espejo, P. O. and Ostiguy, P. (eds), The Oxford Handbook of Populism, Oxford: Oxford University Press.Google Scholar
Mudde, C. and Rovira Kaltwasser, C. (2018), Studying populism in comparative perspective: reflections on the contemporary and future research agenda, Comparative Political Studies, 51(13): 16671693.CrossRefGoogle Scholar
Navia, P. and Osorio, R. (2019), Attitudes toward democracy and authoritarianism before, during and after military rule. The case of Chile, 1972–2013, Contemporary Politics, 25(2): 190212.CrossRefGoogle Scholar
Neundorf, A., Gerschewski, J. and Olar, R.-G. (2020), How do inclusionary and exclusionary autocracies affect ordinary people?, Comparative Political Studies, 53(12): 18901925.CrossRefGoogle Scholar
Nickerson, D. W. and Rogers, T. (2010), Do you have a voting plan? Implementation intentions, voter turnout, and organic plan making, Psychological Science, 21(2): 194199.CrossRefGoogle Scholar
Nord, M., Lundstedt, M., Altman, D., Angiolillo, F., Borella, C., Fernandes, T., Gastaldi, L., God, A. G., Natsika, N. and Lindberg, S. I. (2024), Democracy Report 2024: Democracy Winning and Losing at the Ballot. https://www.v-dem.net/documents/43/v-dem_dr2024_lowres.pdfCrossRefGoogle Scholar
Nyhan, B. and Reifler, J. (2015), The effect of fact-checking on elites: a field experiment on U.S. state legislators, American Journal of Political Science, 59(3): 628640.CrossRefGoogle Scholar
Orhan, Y. E. (2022), The relationship between affective polarization and democratic backsliding: comparative evidence, Democratization, 29(4): 714735.CrossRefGoogle Scholar
Pantazi, M., Hale, S. and Klein, O. (2021), Social and cognitive aspects of the vulnerability to political misinformation, Political Psychology, 42(S1): 267304.CrossRefGoogle Scholar
Pantazi, M., Papaioannou, K. and van Prooijen, J. W. (2022), Power to the people: the hidden link between support for direct democracy and belief in conspiracy theories, Political Psychology, 43(3): 529548.CrossRefGoogle Scholar
Parker, A. and Sanger, D. E. (2016, July 27 ), Donald Trump calls on Russia to find Hillary Clinton’s missing emails. The New York Times. https://www.nytimes.com/2016/07/28/us/politics/donald-trump-russia-clinton-emails.htmlGoogle Scholar
Pedersen, V. M. L., Thaysen, J. D. and Albertsen, A. (2023), Nudging voters and encouraging pre-commitment: beyond mandatory turnout, Res Publica, 30(2): .Google Scholar
Perrow, C. (1984), Normal Accidents: Living with High-Risk Technologies, Princeton University Press, Princeton.Google Scholar
Perrow, C. (1996), The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA, Chicago: University of Chicago PressGoogle Scholar
Pettigrew, T. F. (1998), Intergroup contact theory, Annual Review of Psychology, 49(1): 6585.CrossRefGoogle ScholarPubMed
Pettigrew, T. F. and Tropp, L. R. (2006), A meta-analytic test of intergroup contact theory, Journal of Personality and Social Psychology, 90(5): 751783.CrossRefGoogle ScholarPubMed
Polyák, G. (2019), ‘Media in Hungary: three pillars of an illiberal democracy’, in Połońska, E. and Beckett, C. (eds), Public Service Broadcasting and Media Systems in Troubled European Democracies, Cham: Springer International Publishing, 279–303.CrossRefGoogle Scholar
Posada-Carbó, E. (2011), Latin America: Colombia after uribe, Journal of Democracy, 22(1): 137151. https://muse.jhu.edu/article/412899CrossRefGoogle Scholar
Rasmussen, J. (1997), Risk management in a dynamic society: a modelling problem, Safety Science, 27(2–3): 183213.CrossRefGoogle Scholar
Reason, J. (2016), Managing the Risks of Organizational Accidents, London: Routledge.CrossRefGoogle Scholar
Rogers, W. P., Armstrong, N. A., Acheson, D. C., Covert, E. E., Feynman, R. P., Hotz, R. B., Kutyna, D. J., Ride, S. K., Rummel, R. W., Sutter, J. F., Walker, A. B. C., Wheelon, A. D. and Yeager, C. E. (1986), Report to the President by the Presidential Commission on the Space Shuttle Challenger Accident.Google Scholar
Roozenbeek, J., van der Linden, S., Goldberg, B., Rathje, S. and Lewandowsky, S. (2022), Psychological inoculation improves resilience against misinformation on social media, Science Advances, 8(34).CrossRefGoogle ScholarPubMed
Rovira Kaltwasser, C. (2017), ‘Populism and the Question of How to Respond to It’, in Kaltwasser, C. R., Taggart, P., Espejo, P. O. and Ostiguy, P. (eds), The Oxford Handbook of Populismvolume 1, Oxford: Oxford University Press, 489508.CrossRefGoogle Scholar
Schaub, M. (2021), Acute financial hardship and voter turnout: theory and evidence from the sequence of bank working days, American Political Science Review, 115(4): 12581274.CrossRefGoogle Scholar
Schlipphak, B. and Treib, O. (2017), Playing the blame game on Brussels: the domestic political effects of EU interventions against democratic backsliding, Journal of European Public Policy, 24(3): 352365.CrossRefGoogle Scholar
Schulte-Mecklenbeck, M., Wagner, G. G. and Hertwig, R. (2024), How personal experiences shaped risk judgments during COVID-19, Journal of Risk Research, 27(3): 438457.CrossRefGoogle Scholar
Shear, M. D. and Saul, S. (2021, January 3 ), Trump, in taped call, pressured Georgia official to ‘find’ votes to overturn election. The New York Times. https://www.nytimes.com/2021/01/03/us/politics/trump-raffensperger-call-georgia.htmlGoogle Scholar
Siripurapu, A. (2022), The U.S. Inequality Debate, Council on Foreign Relations, https://www.cfr.org/backgrounder/us-inequality-debateGoogle Scholar
Somer, M., McCoy, J. L. and Luke, R. E. (2021), Pernicious polarization, autocratization and opposition strategies, Democratization, 28(5): 929948.CrossRefGoogle Scholar
Steenland, K. (1974), The coup in Chile, Latin American Perspectives, 1(2): 929. https://www.jstor.org/stable/2633976CrossRefGoogle Scholar
Svolik, M. W. (2019), Polarization versus Democracy, Journal of Democracy, 30(3): 2032.CrossRefGoogle Scholar
Svolik, M. W., Avramovska, E., Lutz, J. and Milačić, F. (2023), In Europe, democracy erodes from the right, Journal of Democracy, 34(1): 5–20.CrossRefGoogle Scholar
Swami, V. (2012), Social psychological origins of conspiracy theories: the case of the Jewish conspiracy theory in Malaysia, Frontiers in Psychology, .CrossRefGoogle ScholarPubMed
Swire, B., Berinsky, A. J., Lewandowsky, S. and Ecker, U. K. H. (2017), Processing political misinformation: comprehending the Trump phenomenon, Royal Society Open Science, 4(3): .CrossRefGoogle ScholarPubMed
Swire-Thompson, B., Ecker, U. K. H., Lewandowsky, S. and Berinsky, A. J. (2020), They might be a liar but they’re my liar: source evaluation and the prevalence of misinformation, Political Psychology, 41(1): 2134.CrossRefGoogle Scholar
Szelényi, Z. (2022), How Viktor Orbán built his illiberal state. The New Republic. https://newrepublic.com/article/165953/viktor-orban-built-illiberal-stateGoogle Scholar
Tsipursky, G., Votta, F. and Mulick, J. A. (2018a), A psychological approach to promoting truth in politics: the Pro-Truth Pledge, Journal of Social and Political Psychology, 6(2): 271290.CrossRefGoogle Scholar
Tsipursky, G., Votta, F. and Roose, K. M. (2018b), Fighting fake news and post-truth politics with behavioral science: the Pro-Truth Pledge, Behavior and Social Issues, 27(1): 4770.CrossRefGoogle Scholar
Uscinski, J., Enders, A., Diekman, A., Funchion, J., Klofstad, C., Kuebler, S., Murthi, M., Premaratne, K., Seelig, M., Verdear, D. and Wuchty, S. (2022), The psychological and political correlates of conspiracy theory beliefs, Scientific Reports, 12(1): 112.CrossRefGoogle ScholarPubMed
van der Linden, S., Albarracín, D., Fazio, L., Freelon, D., Roozenbeek, J., Swire-Thompson, B. and van Bavel, J. (2023), Using psychological science to understand and fight health misinformation, APA Consensus Statement, November, https://www.apa.org/pubs/reports/health-misinformationGoogle Scholar
Varol, O. O. (2015), Stealth authoritarianism, Iowa Law Review, 100(4): 16731742. https://ilr.law.uiowa.edu/sites/ilr.law.uiowa.edu/files/2023-02/ILR-100-4-Varol.pdfGoogle Scholar
Voelkel, J. G., Chu, J., Stagnaro, M. N., Mernyk, J. S., Redekopp, C., Pink, S. L., Druckman, J. N., Rand, D. G. and Willer, R. (2023), Interventions reducing affective polarization do not necessarily improve anti-democratic attitudes, Nature Human Behaviour, 7(1): 5564.CrossRefGoogle Scholar
Wachinger, G., Renn, O., Begg, C. and Kuhlicke, C. (2013), The risk perception paradox—implications for governance and communication of natural hazards, Risk Analysis, 33(6): 10491065.CrossRefGoogle ScholarPubMed
Waisbord, S. (2018), The elective affinity between post-truth communication and populist politics, Communication Research and Practice, 4(1): 1734.CrossRefGoogle Scholar
Waldner, D. and Lust, E. (2018), Unwelcome change: coming to terms with democratic backsliding, Annual Review of Political Science, 21: 93113.CrossRefGoogle Scholar
Weber, E. U. (2006), Experience-based and description-based perceptions of long-term risk: why global warming does not scare us (yet), Climatic Change, 77(1–2): 103120.CrossRefGoogle Scholar
Weber, E. U. and Stern, P. C. (2011), Public understanding of climate change in the United States, American Psychologist, 66(4): 315328.CrossRefGoogle ScholarPubMed
Weber, T. (2022), Als Die Demokratie Starb: Die Machtergreifung der Nationalsozialisten—Geschichte Und Gegenwart, Freiburg: HerderCrossRefGoogle Scholar
Wegwarth, O., Mansmann, U., Zepp, F., Lühmann, D., Hertwig, R. and Scherer, M. (2023), Vaccination intention following receipt of vaccine information through interactive simulation vs text among covid-19 vaccine–hesitant adults during the omicron wave in Germany, JAMA Network Open, 6(2): .CrossRefGoogle ScholarPubMed
Wiesner, K., Bien, S. and Wilson, M. C. (2023), The hidden dimension in democracy (V-Dem Working Paper). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4453098Google Scholar
Wiesner, K., Birdi, A., Eliassi-Rad, T., Farrell, H., Garcia, D., Lewandowsky, S., Palacios, P., Ross, D., Sornette, D. and Thébault, K. (2019), Stability of democracies: a complex systems perspective, European Journal of Physics, 40(1): .CrossRefGoogle Scholar
Wood, M. J., Douglas, K. M. and Sutton, R. M. (2012), Dead and alive: beliefs in contradictory conspiracy theories, Social Psychological and Personality Science, 3(6): 767773.CrossRefGoogle Scholar
Wunsch, N. and Blanchard, P. (2022), Patterns of democratic backsliding in third-wave democracies: a sequence analysis perspective, Democratization, 30(2): 278301.CrossRefGoogle Scholar
Zakrzewski, C. (2023, July 4 ), Judge blocks U.S. officials from tech contacts in First Amendment case. The Washington Post. https://www.washingtonpost.com/technology/2023/07/04/biden-social-lawsuit-missouri-louisiana/Google Scholar
Zmigrod, L. (2020), The role of cognitive rigidity in political ideologies: theory, evidence, and future directions, Current Opinion in Behavioral Sciences, 34: 3439.CrossRefGoogle Scholar
Table 1. Selected historical cases of democratic near misses

Figure 1. Illustration of the drift-to-danger model applied to democratic backsliding. The solid black line represents a gradual drift toward autocracy. Elite norm violations are a principal driver of this drift and can be opposed by behavioural countermeasures. The threshold to autocracy (solid red horizontal line) is protected by a number of safety layers (thin red lines) that can be undermined by risk factors and strengthened by behavioural science interventions. If at least one safety layer holds, making it possible to reverse the drift, a near miss occurs.

Table 2. Factors that undermine democracy by the five V-Dem components