
Optimism and pessimism toward science: A new way to look at the public’s evaluations of science and technology discoveries and recommendations

Published online by Cambridge University Press:  27 June 2023

Ki Eun Kang*
Affiliation:
Texas A&M University College Station, The Bush School of Government and Public Service, ISTPP, College Station, TX, USA
Arnold Vedlitz
Affiliation:
Texas A&M University College Station, The Bush School of Government and Public Service, ISTPP, College Station, TX, USA
Carol L. Goldsmith
Affiliation:
Texas A&M University College Station, The Bush School of Government and Public Service, ISTPP, College Station, TX, USA
Ian Seavey
Affiliation:
Texas A&M University College Station, The Bush School of Government and Public Service, ISTPP, College Station, TX, USA
*
Corresponding author: Ki Eun Kang; Email: kieunk@tamu.edu

Abstract

While there have always been those in the American public who mistrust science and scientists’ views of the world, they have tended to be a minority of the larger public. Recent COVID-19-related events indicate that could be changing for some key groups. What might explain the present state of mistrust of science within an important component of the American public? In this study, we delve deeply into this question and examine what citizens today believe about science and technology and why, focusing on core theories of trust, risk concern, and political values and on the important role of science optimism and pessimism orientations. Using national public survey data, we examine the correlates of science optimism and pessimism and test the efficacy of these constructs as drivers of biotechnology policy. We find that science optimism and pessimism are empirically useful constructs and that they are important predictors of biotechnology policy choices.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives licence (http://creativecommons.org/licenses/by-nc-nd/4.0), which permits non-commercial re-use, distribution, and reproduction in any medium, provided that no alterations are made and the original article is properly cited. The written permission of Cambridge University Press must be obtained prior to any commercial use and/or adaptation of the article.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of the Association for Politics and the Life Sciences

Scientific discoveries can both identify public problems and suggest governmental solutions. When potential actions, policies, or regulations grow out of scientific research, citizens and decision makers must evaluate the accuracy and relevance of both the science and the proposed policy or regulatory solutions. Most often, neither citizens nor decision makers are experts in the scientific disciplines that have identified the problem and possible solutions. A question arises, then, about how, given this lack of expertise, decision makers and the public are to make informed decisions about the supposed problem and its possible solutions. Who is to be trusted and believed, and how are citizens and decision makers to sort out these complexities and make sensible choices?

U.S. citizen reactions to the COVID-19 crisis and to scientific observations and recommendations about medical and safety realities exposed deep fissures within the American public about science and competing sources of understanding about the world around us. More and more, it seems that Americans’ long-standing love affair with science and technology is becoming a love/hate relationship. As Miller (Reference Miller2004) has pointed out, the U.S. public has consistently expressed strong belief in the contributions of scientific research to life quality and economic affluence. However, many citizens who once marveled at technical accomplishments like the Erie and Panama Canals and the Hoover Dam, had their hopes raised by the science that created the polio vaccine and fought the perils of influenza, and had their breath taken away by the science and engineering feats of sending people to the moon and launching communications satellites are now skeptical of what scientists and engineers tell us about real-world relationships and about the impact of science and engineering on our society, environment, and economy. This growing sense of doubt and suspicion has many sources, and many Americans are no longer sure that scientific and technological advancements and discoveries are doing more good than harm to the life they want for themselves and their families.

People make decisions all the time even though they may not be experts in a particular topic or situation. They do this by seeking out guidance from those individuals or institutions they trust to know more about the issue and represent their core attitudes, values, beliefs, and goals. These influencers or cue givers may come from many sources, including scientists, governments, political parties, news outlets, social media, religious leaders, universities, friends, and coworkers (Bolsen et al., Reference Bolsen, Druckman and Cook2014; Darmofal, Reference Darmofal2005). Therefore, citizens may form opinions without much direct information and instead take cues from one or more of these external sources, some of whom may also have formed their opinions based on cue givers rather than direct knowledge. How external cues and other forces frame citizen attitudes on a complex scientific topic has been discussed in a useful case study of climate change (Stoutenborough et al., Reference Stoutenborough, Bromley-Trujillo and Vedlitz2014).

This study directly examines the characteristics that influence individual citizen orientations toward trust in science and technology, scientists in general, and gene drive technology policy specifically in this current polarizing environment. Citizens’ attitudes toward science generally can be conceptualized in different ways (Miller, Reference Miller2004; Miller et al., Reference Miller, Pardo and Niwa1997; Nisbet et al., Reference Nisbet, Scheufele, Shanahan, Moy, Brossard and Lewenstein2002). In our evaluation of the public’s support for science, scientists, and gene drive policy, we include a discussion and analysis of a dimension that we think is important: citizens’ optimistic and pessimistic perceptions of science. We then examine the relationship between optimistic and pessimistic views of science and public opinion on gene drive technology policy support and opposition.

This study examines public opinion on science, scientists, and gene drive policy. We do this by analyzing data collected through a representative national public opinion survey that focused on two primary research questions:

  1. What are the predictors of the public’s views on science optimism and pessimism and on trust in scientists?

  2. How do the public’s optimistic and pessimistic views on science and their trust in scientists affect their views on specific gene drive policies?

Our examination of these questions is guided by several relevant theories that we highlight in the literature discussion. These theories include trust, risk concern, and value predispositions. We used these theories for guidance in the construction of our survey instrument and in our models of analysis.

Trust in science, scientists, and policy support

Public trust in science and scientists constitutes an important element in the utility of scientific knowledge as a driver of much of our nation’s economic, health, and social well-being. Trust is an attitude that taps into a multifaceted construct—individuals’ perceptions of a person’s, institution’s, or government’s competence, values, integrity, openness, and fairness (Allum, Reference Allum2007; Cvetkovich & Nakayachi, Reference Cvetkovich and Nakayachi2007; Kitt et al., Reference Kitt, Axsen, Long and Rhodes2021; Poortinga & Pidgeon, Reference Poortinga and Pidgeon2003; Siegrist, Reference Siegrist2000)—and as such, it can change with circumstances (Gauchat, Reference Gauchat2012). Individuals use rational calculations to make certain choices about trust, and their judgments of trust may derive from previous experience (Coleman, Reference Coleman1994; Hardin, Reference Hardin2002; Kee & Knox, Reference Kee and Knox1970). Numerous studies on public opinion toward science suggest that multiple factors influence a citizen’s choice to trust in science. These factors include perceptions of risks, values, and beliefs (Bauer et al., Reference Bauer, Allum and Miller2007; Kellstedt et al., Reference Kellstedt, Zahran and Vedlitz2008; Stoutenborough & Vedlitz, Reference Stoutenborough and Vedlitz2015; Sturgis & Allum, Reference Sturgis and Allum2004); political orientation (Hochschild et al., Reference Hochschild, Crabill and Sen2012; McCright et al., Reference McCright, Dentzman, Charters and Dietz2013; McCright & Dunlap, Reference McCright and Dunlap2011); religious orientation (Brossard et al., Reference Brossard, Scheufele, Kim and Lewenstein2009; Sturgis & Allum, Reference Sturgis and Allum2004); and sociodemographic factors (Chao & Wei, Reference Chao and Wei2009; Nisbet et al., Reference Nisbet, Scheufele, Shanahan, Moy, Brossard and Lewenstein2002).

Trust is essential in the context of science (Hendriks et al., Reference Hendriks, Kienhues, Bromme and Blöbaum2016), and trust in scientists is becoming a more frequent cue that the public uses to make decisions about problems and their solutions (Johnson & Dieckmann, Reference Johnson and Dieckmann2020; Kahan et al., Reference Kahan, Peters, Wittlin, Slovic, Ouellette, Braman and Mandel2012). Trust in science and in scientists as identifiers of problems and solutions is even more important in a crisis, when stress, uncertainty, and risk may be especially high (Betsch, Reference Betsch2020; Siegrist & Zingg, Reference Siegrist and Zingg2014). The COVID-19 pandemic has illuminated how stressful and uncertain situations may lead to a growing trend to reject many conclusions that scientists are telling us are factual and necessary. Public trust in science and scientists during high-stress crises like the pandemic could influence individual behaviors and compliance in various ways, both positive and negative (Algan et al., Reference Algan, Cohen, Davoine, Foucault and Stantcheva2021; Hutmacher et al., Reference Hutmacher, Reichardt and Appel2022).

For example, it seems that members of the public understood that a lockdown in the beginning of the COVID-19 pandemic was necessary (Bol et al., Reference Bol, Giani, Blais and Loewen2021) and that trust mattered in following safety guidelines and supporting the policies behind them (Robinson et al., Reference Robinson, Ripberger, Gupta, Ross, Fox, Jenkins-Smith and Silva2021). In previous studies, people with a higher level of trust in government, health agencies, or science-based institutions were more likely to adopt behaviors recommended by experts during crises such as the SARS and H1N1 outbreaks (Prati et al., Reference Prati, Pietrantoni and Zani2011; Siegrist & Zingg, Reference Siegrist and Zingg2014). Research on past outbreaks points to continued trust in science’s capacity to benefit society, accompanied by a drop in trust in scientists themselves (Eichengreen et al., Reference Eichengreen, Aksoy and Saka2021). Recent polls on COVID-19 science-related attitudes and behaviors show that public trust is substantially divided along political lines.

Studies that look at trust in scientists ask people to consider the roles that scientists take on as knowledge generators, informers, and advisors and what motivates their work. Around 70% of respondents think that genetic engineering scientists are motivated by improving the public good, but respondents ascribe less trust to climate scientists (Hochschild et al., Reference Hochschild, Crabill and Sen2012). Algan et al. (Reference Algan, Cohen, Davoine, Foucault and Stantcheva2021) attribute the change in trust in scientists during the COVID-19 pandemic to statements by President Donald Trump and other key leaders in the American government. Trust might also change with people’s growing belief in the influence of self-interest, government, and industry on the work of scientists (Algan et al., Reference Algan, Cohen, Davoine, Foucault and Stantcheva2021; Bottini et al., Reference Bottini, Rosato, Gloria, Adanti, Corradino, Bergamaschi and Magrini2011).

Our study, conducted during the COVID-19 pandemic that began in the United States in January 2020, provides another source of information on public trust in science and scientists and on attitudes toward gene drive policies. We drill deeper into the likely correlates of support for science, scientists, and gene drive policies, especially through the inclusion of science and technology optimism and pessimism dimensions not considered in most earlier studies and reports. This is an important gap in the literature because the studies that did examine science and technology optimism and pessimism found them to be consistent predictors of views about support for science and technology (Hochschild et al., Reference Hochschild, Crabill and Sen2012; Hochschild & Sen, Reference Hochschild and Sen2015; Horrow et al., Reference Horrow, Pacyna, Sutton, Sperry, Breitkopf and Sharp2019; Nisbet & Markowitz, Reference Nisbet and Markowitz2014).

Optimism and pessimism about science and technology

Previous studies show that a large number of citizens consistently express a relatively optimistic view of science (National Science Board, 2018), but they also show a significant and consistent minority of citizens who say they are more pessimistic about the role of science and scientists in national discussions about problems and solutions. These earlier studies indicate that members of the public think that science research produces more benefits than harms (Miller et al., Reference Miller, Pardo and Niwa1997). Miller et al. (Reference Miller, Pardo and Niwa1997) found that U.S. citizens who possess more positive views of science and technology are also more optimistic about science’s impact on our well-being, but those who are less supportive and more pessimistic are also present (Hochschild et al., Reference Hochschild, Crabill and Sen2012). Americans especially worry that emerging science developments may be taking technology too far (Funk, Reference Funk2016); in particular, the public is deeply worried about the potential harms of automation technologies.

In this study, we conceptualize science optimism and pessimism through a psychological view of gain and loss. We define science optimism as a promotion orientation that focuses on growth, gain, and generating more, even at the risk of possible losses and errors (Hazlett et al., Reference Hazlett, Molden and Sackett2011; Higgins, Reference Higgins1997; Liberman et al., Reference Liberman, Molden, Idson and Higgins2001). Our science optimism measure thus captures motivations for attaining growth. Science pessimism reflects a prevention orientation, which is associated with protecting against and avoiding possible losses even if possible gains are missed (Hazlett et al., Reference Hazlett, Molden and Sackett2011; Higgins, Reference Higgins1997; Liberman et al., Reference Liberman, Molden, Idson and Higgins2001). Therefore, we view people’s general optimistic and pessimistic evaluative dimensions about the role of science and technology as an additional way to conceptualize the positive or negative consequences of science and technology developments.

Influences on optimism and pessimism about science and policy support

Concern/risk

Optimism and pessimism about science and technology connect to the broader psychological phenomenon of risk perception, which measures an individual’s level of risk aversion (Hochschild et al., Reference Hochschild, Crabill and Sen2012; Slovic, Reference Slovic1987) and aligns along two key dimensions—risks and benefits (Frewer et al., Reference Frewer, Howard and Shepherd1997, Reference Frewer, Howard and Shepherd1998). The use of science and technology to identify and solve problems involves risks and benefits across multiple dimensions of public concern, such as the environment, the economy, government spending, and education. The risk and benefit calculations associated with people’s relative concern for these dimensions are important for citizens and their cue givers. As discussed earlier, people’s concern may influence their optimism and pessimism about science and technology.

Value predispositions

Many studies highlight the importance of public knowledge about science and technological issues and the gap that often exists between public understanding of a scientific finding and scientists’ views of that same finding (Jasanoff et al., Reference Jasanoff, Markle, Peterson and Pinch1995; Ziman, Reference Ziman1991). The implicit assumption of these researchers is that a lack of information and understanding among members of the public contributes to public skepticism toward science and technology (Durant, Reference Durant1999). This gap has been referred to as the knowledge deficit. However, this so-called deficit has faced scrutiny from many researchers because of inconsistent empirical findings, institutional bias, and questionable measures of scientific understanding (Bauer et al., Reference Bauer, Allum and Miller2007; Kellstedt et al., Reference Kellstedt, Zahran and Vedlitz2008; Stoutenborough & Vedlitz, Reference Stoutenborough and Vedlitz2015; Sturgis & Allum, Reference Sturgis and Allum2004). Historically, even Americans with a low level of science literacy would say that they are optimistic about science and technology (Nisbet, Reference Nisbet2005). However, some studies point out that assessing knowledge is challenging and that it is difficult to fully capture its meaning (Allum et al., Reference Allum, Sturgis, Tabourazi and Brunton-Smith2008).

Prior studies also examine the role of the media in people’s attitudes toward science and scientists. The media often communicates newly identified risks emerging from scientific discoveries, raising the profile of science information on a particular problem (Gauchat, Reference Gauchat2012). How the media portrays these discoveries, and the attention people pay to what the media says, may exert some influence on people’s attitudes toward these breakthroughs (Nisbet, Reference Nisbet2005). The COVID-19 pandemic is one recent example. However, the polls we cited earlier indicate that, for some citizens, political perspectives are more influential in decision-making than scientific evidence and prescriptions or media warnings. These differences in political perspectives have resulted in low COVID-19 vaccination rates among Republicans (Albrecht, Reference Albrecht2022; Callaghan et al., Reference Callaghan, Moghtaderi, Lueck, Hotez, Strych, Dor, Fowler and Motta2021; Lin et al., Reference Lin, Tu and Beitsch2021). Clearly, the benefits of vaccination outweigh the risks, a circumstance that should induce more people to proceed with vaccination (Frewer et al., Reference Frewer, Howard and Shepherd1997).

In this study, therefore, rather than addressing knowledge or the media, we highlight the general predisposition attitudes, beliefs, and values of citizens in order to capture a deeper understanding of their perceptions of both science optimism and pessimism and trust in scientists. Research has shown that members of the public are influenced by their predisposition attitudes, beliefs, and values in their evaluations of science and scientists (Brossard & Nisbet, Reference Brossard and Nisbet2007; Ho et al., Reference Ho, Scheufele and Corley2010; Hochschild & Sen, Reference Hochschild and Sen2015; Nowotny et al., Reference Nowotny, Scott and Gibbons2013). Important predisposition factors in their evaluations of science and scientists include social connections, culture, religious commitments, morality, individual risk assessments, sources of information, deference to science authority, and political ideology (Brossard & Nisbet, Reference Brossard and Nisbet2007; Ho et al., Reference Ho, Scheufele and Corley2010; Hochschild & Sen, Reference Hochschild and Sen2015; Nowotny et al., Reference Nowotny, Scott and Gibbons2013; Xiao, Reference Xiao2013).

Previous studies show that individuals’ views of science are related to their attitudes toward the development of science and technology. Unlike general science, biotechnology innovations are not widely discussed in the news and social media, nor are they often addressed by political leaders, so people have a limited understanding of and interest in biotechnology (Scheufele & Lewenstein, Reference Scheufele and Lewenstein2005; Sparks et al., Reference Sparks, Shepherd and Frewer1994). Under these circumstances, people’s attitudes toward biotechnology policies might also be driven by their belief and trust in science.

Political orientation is consistently found to be a strong, divisive factor on issues and policies such as those concerning the environment (Hannibal et al., Reference Hannibal, Liu and Vedlitz2016; Liu et al., Reference Liu, Vedlitz and Shi2014; Wood & Vedlitz, Reference Wood and Vedlitz2007), biotechnologies (Legge & Durant, Reference Legge and Durant2010), and nuclear energy (Akin et al., Reference Akin, Cacciatore, Yeo, Brossard, Scheufele and Xenos2021). As we noted earlier, political party identification and political ideology are often correlated with trust in science, scientists, and biotechnology (Li & Qian, Reference Li and Qian2022). Overall, research has shown that liberals and Democrats are more likely to believe the scientific consensus on the causes of some problems, like climate change, while conservatives and Republicans are more skeptical (McCright & Dunlap, Reference McCright and Dunlap2011). Republicans also tend to show less optimistic views on genomic science than Democrats (Hochschild et al., Reference Hochschild, Crabill and Sen2012). Conservatives are more opposed to science and technology generally and exhibit less trust in scientists generally, and especially in scientists who are more engaged in identifying the environmental and public health impacts of economic production (McCright et al., Reference McCright, Dentzman, Charters and Dietz2013; Mooney, Reference Mooney2012). Steadfast conservatives, compared with those who change political views, see science more favorably and scientists less favorably, suggesting that these views develop through “a conservative scientific repertoire that is learned over time and that helps orient political conservatives in scientific debates that have political repercussions” (Mann & Schleifer, Reference Mann and Schleifer2020, p. 305).

Between 1974 and 2010, the divide between conservatives and liberals reversed: conservatives went from having the highest trust in science to the lowest (Gauchat, Reference Gauchat2012). While conservatives were becoming more skeptical, Democrats were becoming more trusting of science (Lee, Reference Lee2021). This fissure within the citizenry about the validity of science-based identification of problems and solutions deepened with the polarization of climate change science (Liu et al., Reference Liu, Vedlitz, Stoutenborough and Robinson2015; Wood & Vedlitz, Reference Wood and Vedlitz2007), and during the COVID-19 pandemic these gaps seem even more pronounced (Algan et al., Reference Algan, Cohen, Davoine, Foucault and Stantcheva2021) and require further investigation.

According to a survey conducted by the Pew Research Center (Funk et al., Reference Funk, Kennedy and Johnson2020), trust in scientists increased mainly among Democrats during the COVID-19 pandemic. Researchers compared data between 2019 and 2020 to see whether COVID-19 affected people’s evaluations of science and found that support for science (those who said they had a great deal of confidence) among Democrats rose from 43% in 2019 to 52% in 2020. Republicans were much less supportive in both years, with only 27% showing great confidence in 2019 and again in 2020 (Funk et al., Reference Funk, Kennedy and Johnson2020).

In general, studies have shown that religious orientation significantly influences public opinions about science and technology. Many empirical findings indicate that a conflict often exists between religion and science and that highly religious people are more concerned about science and technology, especially nanotechnology and stem cell research (Brossard et al., Reference Brossard, Scheufele, Kim and Lewenstein2009; Gauchat, Reference Gauchat2008, Reference Gauchat2012; Nisbet, Reference Nisbet2005; Sturgis & Allum, Reference Sturgis and Allum2004). However, recent studies have questioned whether such a dispute prevails between religion and science and found that religious individuals do support science. Most Americans have a positive view on science (Funk, Reference Funk2020), and religious people are most likely part of that group (Evans, Reference Evans2012). Some researchers have argued that religion and science no longer clash and that they can both coexist as allies because some scientists also possess religious beliefs (Ecklund, Reference Ecklund2012). Additionally, different institutions, such as the media, may influence the relationship between religious beliefs and science and technology (Evans, Reference Evans2012). Therefore, it is difficult to say that religion itself is a driver of negative public views of science.

As discussed earlier, these different views on science and scientists are related to personal predispositions toward science and to trust and religious and political values. In the case of gene drive technology, those attitudes may be heavily influenced by value predispositions such as beliefs, political ideology, and religious affiliation.

Citizens’ individual-level background variables

Researchers have pointed out the importance of including sociodemographic variables in public opinion studies of science, scientists, and biotechnology. Race, gender, age, education, and income have all been associated at one time or another with acceptance or rejection of science problems and their solutions. Male respondents have higher scientific literacy, more confidence in science, and greater trust in scientists than females (Chao & Wei, Reference Chao and Wei2009; McCright et al., Reference McCright, Dentzman, Charters and Dietz2013; Nisbet et al., Reference Nisbet, Scheufele, Shanahan, Moy, Brossard and Lewenstein2002; Xiao, Reference Xiao2013). Hochschild et al. (Reference Hochschild, Crabill and Sen2012) found that African Americans tend to express less optimistic opinions about genomics than Whites. Previous studies indicate that more educated citizens tend to have higher trust in science and technology (Achterberg et al., Reference Achterberg, De Koster and Waal2015; Vaccarezza, Reference Vaccarezza2007; Xiao, Reference Xiao2013). Some have found that respondents with lower family incomes tend to think that science makes life change too fast (National Science Board, 2018). Sociodemographic variables also predict support for policy (Smith & Leiserowitz, Reference Smith and Leiserowitz2014). Those who are older, male, and have higher incomes indicate more support for agricultural biotechnology (Brossard & Nisbet, Reference Brossard and Nisbet2007). People with more education view biotechnology more favorably (Besley & Shanahan, Reference Besley and Shanahan2005). The Pew Research Center reported that older adults tend to have positive perceptions of science, specifically with regard to vaccines (Funk et al., Reference Funk, Rainie and Page2015). 
We next build and test our comprehensive models of the likely relationships between our analytic variables and pessimistic or optimistic orientations about science, trust in scientists, and biotechnology policy support among the U.S. public.

Methods

To determine the public’s views about science and technology, scientists, and biotechnology policy, our research team designed a comprehensive national public opinion survey. This survey was administered from January 14 to 26, 2021, by the professional polling provider Ipsos Public Affairs using its KnowledgePanel. The survey was sent to 2,681 panel members aged 18 and older residing in the United States through probability-based sampling designed to be representative of the U.S. population. A total of 1,314 people completed the survey (a completion rate of 49%); of these, 94 respondents were excluded from the analysis because of missing data. Our analysis is therefore based on a representative national sample of 1,220 qualified respondents.

A section of the survey instrument was designed specifically to focus in depth on citizen understanding and evaluations of science and technology, scientists, and federal policies about gene drive technology. We discuss four models in this article. The first two models seek to explain the factors associated with the public’s pessimistic and optimistic views of science and technology generally. The third model examines factors associated with the public’s trust in scientists specifically. Our fourth model tests associations between the public’s pessimistic and optimistic views toward science, trust in scientists, and support for or opposition to gene drive policies. Our conceptual model linking different factors and science optimism and pessimism, and trust in scientists with innovative biotechnology policy, is diagrammed in Figure 1.

Figure 1. Conceptual model explaining public attitudes toward science optimism and pessimism, trust in scientists, and policy support.

Dependent variables

We include four models in this study. For our first two models, we focus on whether the respondent has a pessimistic or optimistic view of science and technology. We designed a battery of six items in the survey instrument that sought to capture respondents’ science optimism/pessimism orientations. To develop our question battery, we reviewed multiple studies and polls that asked respondents about their overall views on science and technology or specific types of technologies (Gauchat, Reference Gauchat2011; Mehta, Reference Mehta2002; Miller et al., Reference Miller, Pardo and Niwa1997; Muñoz et al., Reference Muñoz, Moreno and Luján2012; National Science Board, 2018; Priest, Reference Priest2006; Vaccarezza, Reference Vaccarezza2007; Xiao, Reference Xiao2013). These studies include items that tap into various aspects of expectation of harms or benefits such as overall negative or positive outcomes or specific negative or positive outcomes. Many of these studies include items focused on expectations about the economic effects of science and technology, in particular opportunities for the next generation or job loss. Several studies also look at the accuracy of scientific knowledge and its role relative to other types of knowledge, such as faith, as well as social changes like isolation that could result from scientific and technological developments.

Respondents were asked to indicate how much they agreed or disagreed, using a 5-point Likert scale, with six statements about science and technology. Among those six statements, three were positive in orientation: (1) “Because of science and technology there will be more opportunities for the next generation”; (2) “Technological progress is essential to U.S. economic competitiveness”; and (3) “Science and technology provide the best and most reliable knowledge about the world.” Three were negative in orientation: (1) “Science and technology have caused more harm than good to humans”; (2) “We give too much value to scientific and technological knowledge compared to other forms of knowledge”; and (3) “Science and technology are creating an artificial and inhuman lifestyle.”

We analyze the optimism and pessimism indices as separate dependent variables. From each set of three questions, we constructed an index variable: one for optimism and one for pessimism. Cronbach’s alpha was acceptable or good for both scales (science optimism = 0.76; science pessimism = 0.79). The univariate distributions of the optimism and pessimism scores across our sample of respondents are presented in Figure 2. Scores on each index range from 3 to 15. Both indices show good dispersion and central tendency, and while the two are somewhat reciprocal in nature, science optimism and science pessimism are clearly not identical.
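To make the index construction concrete, the following sketch computes Cronbach’s alpha for a three-item additive scale. The formula is the standard one; the Likert responses below are invented for illustration and are not drawn from our survey data.

```python
def variance(xs):
    """Population variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """Cronbach's alpha; `items` holds one list of scores per scale item,
    aligned by respondent. Alpha rises as items covary more strongly."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(it) for it in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Hypothetical 5-point Likert responses (rows = the three optimism items,
# columns = six respondents)
optimism_items = [
    [5, 4, 4, 2, 5, 3],
    [5, 4, 5, 2, 4, 3],
    [4, 5, 4, 1, 5, 2],
]
alpha = cronbach_alpha(optimism_items)
index_scores = [sum(col) for col in zip(*optimism_items)]  # additive index, 3-15
```

The additive index in the last line is what produces the 3–15 score range reported above for each scale.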

Figure 2. Univariate distributions of the science optimism and pessimism scores.

While the optimism and pessimism responses run in opposite directions, they are clearly not perfect reciprocals of one another, indicating that optimism and pessimism about science each have some unique attributes. This is why we treat them as two separate dependent variables in our regression models. We additionally compute the Akaike information criterion (AIC) and Schwarz’s Bayesian information criterion (BIC) to compare our separate optimism and pessimism models against a single combined optimism/pessimism measure. On both criteria, separate and distinct optimistic and pessimistic orientations fit the data better than one overall optimism/pessimism score. We therefore present two distinct ordinary least squares (OLS) regression models, one for each scale, seeking to identify the analytic variables that drive optimistic and pessimistic orientations.
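For readers unfamiliar with the criteria, AIC and BIC for a Gaussian (OLS) model can be computed from the residual sum of squares. The function below is a standard textbook sketch; the `rss`, `n`, and `k` values are illustrative placeholders, not our actual model statistics.

```python
import math

def aic_bic(rss, n, k):
    """AIC and BIC for an OLS fit with residual sum of squares `rss`,
    n observations, and k estimated parameters. Lower values = better fit."""
    loglik = -0.5 * n * (math.log(2 * math.pi * rss / n) + 1)
    return 2 * k - 2 * loglik, k * math.log(n) - 2 * loglik

# Illustrative comparison: a model with more parameters must reduce the
# residual sum of squares enough to offset its penalty before AIC/BIC
# prefer it over a leaner model.
aic_small, bic_small = aic_bic(rss=120.0, n=500, k=10)
aic_large, bic_large = aic_bic(rss=119.5, n=500, k=15)
```

Here the tiny improvement in fit does not justify five extra parameters, so both criteria favor the smaller model; this is the logic behind preferring the separate optimism and pessimism models over one combined specification when the criteria say so.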

To test our third model, we use trust in scientists as our dependent variable. In our survey, we sought to determine how much each respondent distrusted or trusted scientists by assessing their agreement or disagreement with five statements that scientists (1) “create knowledge that is unbiased and accurate”; (2) “create knowledge that is useful”; (3) “advise government officials on policy”; (4) “inform the public on important issues”; and (5) “work with affected communities to design research that is relevant to those groups.” Each statement had five response options: strongly distrust = 1; distrust = 2; neither distrust nor trust = 3; trust = 4; strongly trust = 5. We combined individuals’ responses to these five statements to create a trust in scientists score ranging from 5 to 25, with 5 indicating strong distrust and higher numbers indicating stronger trust in scientists. The trust in scientists score is reliable (Cronbach’s alpha = 0.92).

Our final dependent variable, related to gene drive policy, asked whether respondents oppose or support certain federal policy proposals (see footnote 1). Four federal policy proposals were listed: (1) “Create a government agency to oversee and regulate all aspects of genetic engineering, including gene drive”; (2) “Establish programs to inform people about gene drive technology”; (3) “Establish programs to involve people in deciding which types of gene drive technology should be developed, if any”; and (4) “Work with international groups such as the United Nations to establish guidelines for gene drive technology.” Each statement had five response options: strongly oppose = 1; oppose = 2; neither oppose nor support = 3; support = 4; strongly support = 5. We combined responses to these four statements to produce a gene drive policy support score, the dependent variable for Model 4 (Cronbach’s alpha = .82).

Independent and control analytic variables

For all four of our models, we select as the basic independent and control variables the key constructs noted in the literature section—public concern variables like environmental concern and economic and national concern; value predispositions like values and attitudes, political ideology, and religiosity; and demographic variables like age, race/ethnicity, gender, education, income, and knowledge. The descriptive statistics for all the variables are provided in Table 1.

Table 1. Descriptive statistics of variables.

In this public survey, respondents were first asked how concerned they are about different public issues in the United States; this serves as our principal risk/concern variable. These concerns spanned eight categories: economic growth, government spending, immigration, pollution, energy supply, climate change, national security, and environment. Responses in each category were coded as not concerned at all = 1; not so concerned = 2; somewhat concerned = 3; very concerned = 4. We ran a principal component analysis on the eight items. As Table 2 shows, the eight concern questions loaded on two factors. The first factor represents environmental concern and includes pollution, energy supply, climate change, and the environment. The second contains the remaining items (economic growth, government spending, immigration, and national security) and indicates economic and national security concern. Based on the factor analysis results, two factor scores were calculated to measure each public concern variable; a higher score means the respondent is more concerned about that set of issues.
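The idea of extracting a component and scoring respondents on it can be sketched with a power iteration on the sample covariance matrix. This is a simplified, pure-Python stand-in for the full two-factor solution we actually estimated, and the concern ratings below are invented for illustration.

```python
def first_principal_component(data):
    """Leading principal component of `data` (rows = respondents,
    columns = items) via power iteration on the covariance matrix.
    Returns (loadings, per-respondent component scores)."""
    n, p = len(data), len(data[0])
    means = [sum(col) / n for col in zip(*data)]
    centered = [[x - m for x, m in zip(row, means)] for row in data]
    cov = [[sum(centered[r][i] * centered[r][j] for r in range(n)) / (n - 1)
            for j in range(p)] for i in range(p)]
    v = [1.0] * p
    for _ in range(200):  # power iteration converges to the top eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(c * vi for c, vi in zip(row, v)) for row in centered]
    return v, scores

# Invented 1-4 concern ratings for two items (e.g., pollution and climate
# change) that tend to move together across six respondents
loadings, scores = first_principal_component(
    [[1, 2], [2, 2], [3, 4], [4, 4], [2, 3], [4, 3]])
```

Because the two items covary positively, both load with the same sign on the leading component, which is the pattern behind grouping the four environmental items onto one factor.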

Table 2. Factor loading for public concern variables.

To capture value predispositions, the first independent variable we include is respondents’ opinion on the role of personal values and attitudes in their decision-making in contrast with a belief in the dominance of scientific thought. We asked respondents whether they agree that values and attitudes are as important as scientific knowledge. Their answers to this question were coded as strongly disagree = 1; disagree = 2; neither disagree nor agree = 3; agree = 4; strongly agree = 5. Additionally, we examine the role of political ideology, religion, and knowledge as important factors in respondents’ evaluations of science, scientists, and gene drive technology policy. We asked respondents to rate themselves on a political ideology scale from extremely liberal = 1 to extremely conservative = 7. We also asked respondents how often they attend religious services and coded our religiosity variable as never = 1; once a year or less = 2; a few times a year = 3; once or twice a month = 4; once a week = 5; more than once a week = 6.

In this article, we include, as noted earlier, a number of individual-level background variables: age, race/ethnicity, education, gender, household income, and knowledge. Age is measured in years. For race/ethnicity, we distinguish non-Hispanic White respondents from all others (non-Hispanic White = 1; other = 0). Education is measured on 14 levels indicating the highest degree received, from no formal education (coded 1) to a professional, medical, or doctoral degree (coded 14). Gender was coded male = 0 and female = 1. Household income has 21 categories, ranging from less than $5,000 to $250,000 or more.

The literature shows that objective and perceived knowledge is associated with people’s attitudes (Stoutenborough et al., 2013; Stoutenborough & Vedlitz, 2015; Zhang et al., 2020). The objective knowledge variable was calculated as the percentage of correct answers to six true/false statements: (1) “Genes are a basic driver of heredity” (True); (2) “Genetic engineering of crops has not yet been used to manage agricultural pests” (False); (3) “People receive all of their genes from only one of their parents” (False); (4) “A keystone species is one that is critical to a particular ecosystem” (True); (5) “U.S. regulations have always required labels to identify any food that contains genetically modified ingredients” (False); (6) “The goal of mass release of sterile insects is to reduce the number of offspring produced by the wild population” (True). The perceived knowledge variable was measured by four questions assessing respondents’ perceptions of their knowledge about (1) genetic engineering of agricultural crops; (2) genetic engineering of insects; (3) integrated pest management practices in agriculture; and (4) ecological risk assessments used in agricultural pest management. Answers to each question were coded as not informed at all = 1; not so informed = 2; somewhat informed = 3; very informed = 4. We generated an additive index from these ratings.
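Scoring the two knowledge measures is mechanical; the answer key below follows the six items in the order listed above, while the function names and example responses are our own illustration rather than part of the survey instrument.

```python
# Answer key for the six true/false objective-knowledge items, in order
ANSWER_KEY = [True, False, False, True, False, True]

def objective_knowledge(answers):
    """Percentage of the six statements a respondent answered correctly."""
    correct = sum(a == k for a, k in zip(answers, ANSWER_KEY))
    return 100.0 * correct / len(ANSWER_KEY)

# Coding for the four self-rated (perceived) knowledge items
PERCEIVED_CODES = {
    "not informed at all": 1, "not so informed": 2,
    "somewhat informed": 3, "very informed": 4,
}

def perceived_knowledge(ratings):
    """Additive index over the four self-rated items (range 4-16)."""
    return sum(PERCEIVED_CODES[r] for r in ratings)
```

A respondent who misses only the third item would score 500/6 ≈ 83.3% on objective knowledge, while answering “somewhat informed” on all four perceived-knowledge items would yield an index of 12.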

Findings

We find from our public survey that about 55% of the public are optimistic, and nearly 14% are pessimistic about science and technology in general. Similarly, nearly 52% of respondents trust scientists, and only about 14% distrust scientists. Sixty-three percent of respondents are supportive of federal gene drive programs, and 12% are opposed to overall federal policy about gene drive technology. However, approximately one-third of respondents reported neutral answers on science and technology (31%), trust in scientists (34%), and overall gene drive policies (26%). Our descriptive findings from the survey are consistent with the literature and recent survey polls showing that more people have optimistic views about science and scientists in general than have pessimistic views.

To answer our first research question, on the correlates of science optimism and pessimism, we ran OLS regression models. In Table 3, we examine the factors associated with optimistic and pessimistic views of science and technology and with trust in scientists. Our first model explains 25% of the variance in our measure of public optimism toward science and technology, and the second explains 31% of the variance in our measure of pessimism. Model 3 explains 31% of the variance in our measure of public attitudes toward scientists. We tested for and found no serious violations of regression assumptions in any of our three models. All VIF scores are well below conventional cutoffs (the highest being 1.91), indicating no multicollinearity problems. To explore additional patterns of public views toward science and technology, we calculated correlation coefficients among all the variables and detected no highly correlated predictors (see the Appendix). We also include state-level fixed effects. All p-values are based on two-tailed tests unless otherwise noted.
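The multicollinearity check can be illustrated directly from the definition of the variance inflation factor; the auxiliary R² value below is chosen only to show what a VIF near 1.91, like our reported maximum, implies, and is not a figure from our models.

```python
def vif(r_squared):
    """Variance inflation factor from the R-squared of regressing one
    predictor on all the others; values near 1 signal little
    multicollinearity, and 5 or 10 are common rule-of-thumb cutoffs."""
    return 1.0 / (1.0 - r_squared)

# An auxiliary R-squared of about 0.48 corresponds to a VIF near 1.9,
# far below the usual cutoffs.
max_vif = vif(0.476)
```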

Table 3. Regression results: Science optimism, pessimism, and trust in scientists.

*** Significant at the .001 level; ** significant at the .01 level; * significant at the .05 level (all based on two-tailed tests).

Results from the public views about science and technology model (Model 1 and Model 2) show that environmental concern, economic and national security concern, political ideology, and knowledge are important factors explaining both pessimistic and optimistic views of science and technology but, as expected, in different directions. Additionally, we find that the values and attitudes variable is an important factor that drives only science pessimism.

Respondents who are more concerned about environmental issues tend to exhibit more optimism, while those who are less concerned about environmental issues exhibit more pessimism. The relationship is slightly stronger for the optimism scale. Holding all other variables constant, an increase of 1 in respondents’ environmental concern score increases the science optimism score by 0.38 (p < .001) and decreases the science pessimism score by 0.27 (p < .001). Respondents who are more concerned about economic and national security issues express greater pessimism toward science and technology, while those who are less concerned about those issues tend to exhibit science and technology optimism. Every increase of 1 point in the economic and national security concern score is associated with a decrease of 0.23 in the science optimism score (p < .001) and an increase of 0.48 in the science pessimism score (p < .001), holding the other variables constant. This association is stronger for pessimism. We also find that people who emphasize values and attitudes as equally important as scientific knowledge have more pessimistic views toward science. Holding other variables constant, the science pessimism score is predicted to increase 0.48 when the values and attitudes variable goes up by 1 (p < .001).

The result for political ideology shows that conservative respondents are more pessimistic and liberal respondents more optimistic about science and technology. Every 1-point increase on the ideology scale (i.e., in the more conservative direction) is associated with a decrease of about 0.22 in the science optimism score (p < .001) and an increase of 0.26 in the science pessimism score (p < .001), holding the other variables constant. Religiosity was not a significant factor for either science optimism or pessimism.

Our sociodemographic results are mostly consistent with the existing literature on public attitudes toward science and technology. Male respondents and those with more education, higher household income, and greater objective knowledge are more likely to hold positive opinions about science and technology. Among the analytic variables in the science and technology models (Models 1 and 2), greater environmental concern had the most powerful effect on optimism, while greater emphasis on values and attitudes had the most powerful effect on pessimism.

In Model 3, we use OLS regression to test the effect of the same independent variables on trust in scientists. The results indicate that environmental concern, economic and national security concern, and political ideology are significantly associated with trust in scientists. The directions and significance of results in the trust in scientists model (Model 3) are consistent with the science optimism model (Model 1). The two concern variables have the strongest relationships with trust in scientists of all the independent variables. Environmental concern is positively associated with trust in scientists, while economic and national security concern is negatively associated with it. Respondents’ trust in scientists is predicted to increase by 1 point (p < .001) when the environmental concern score goes up by 1 and to decrease by 1.04 (p < .001) when the economic and national security concern score goes up by 1. Our results also reveal strong evidence that trust in scientists decreases with conservative political ideology: every 1-point increase on the ideology scale (i.e., in the more conservative direction) is associated with a 0.51 decrease in the trust in scientists score (p < .001). Respondents with higher household income and more education tend to have a higher level of trust in scientists.

To evaluate our second research question, the effect of science optimism and pessimism and trust in scientists on science policy support or opposition, we ran an OLS regression for Model 4 (Table 4). In Model 4, we test the effect of the same independent variables, plus science optimism and pessimism and trust in scientists, on public opinion toward gene drive policy. Controlling for other key variables, science optimism and trust in scientists remain significant and important predictors of gene drive policy support and opposition. People who are more optimistic about science and technology and who trust scientists are more likely to support the various gene drive policies. A 1-unit increase in science optimism predicts a 0.33 increase (p < .001) in the gene drive policy support score, and a 1-unit increase in the trust in scientists score predicts a 0.15 increase (p < .001), holding the other variables constant. Furthermore, respondents with greater environmental concern tend to support gene drive policy: every 1-point increase on the environmental concern scale is associated with an increase of about 0.81 (p < .001) in the policy support score. Respondents with greater economic and national security concern, by contrast, tend to be less supportive: every 1-unit increase in that concern score predicts a 0.48 decrease (p < .001) in support. Political ideology and the values and attitudes variable are also important factors in gene drive policy support. Every 1-point increase on the ideology scale (i.e., in the more conservative direction) is associated with a decrease of about 0.18 (p < .05) in the policy support score, and when the values and attitudes variable rises by 1 unit, the policy support score rises by 0.42 (p < .001), holding the other variables constant.

Table 4. Regression results: Gene drive policy support.

*** Significant at the .001 level; ** significant at the .01 level; * significant at the .05 level (all based on two-tailed tests).

Discussion and conclusion

Clearly, Americans possess mixed opinions about emerging science and technological developments (Funk, 2020; MacDonald et al., 2020). Optimists outweighed pessimists in earlier studies, but both were present. The strongly divided reactions to wearing masks, social distancing, testing, and vaccine acceptance during the COVID-19 pandemic suggest a much larger and more significant negative orientation toward science, scientists, and biotechnology today. Our survey, conducted during the pandemic, shows that the public continues to feel generally optimistic about science and technology, trusting of scientists, and supportive of gene drive policies, but significant negative and pessimistic sentiments persist among certain groups.

We empirically tested the utility of the science optimism and pessimism constructs and found them to be coherent and useful analytic variables. Results of our models find environmental concern, economic and national security concern, and political ideology strongly associated with science optimism and pessimism dispositions and with trust in scientists. Our study also indicates that respondents who hold an optimistic view of science hold a similar view of scientists. Environmental concern is an important factor in forming people’s attitudes toward science and scientists: respondents who are more worried about environmental issues tend to be more optimistic about science and technology and more trusting of scientists. Our findings confirm those of Steel et al. (2004), who found that those with stronger environmental attitudes support the emerging roles of scientists (advocating for policies, integrating research results, and decision-making) in the environmental policy process.

Contrary to traditional expectations, we observed that economic and national security concern is associated with greater science pessimism and lower trust in scientists. Little extant literature directly discusses public concern about economic and national security issues and its connection to trust in science and scientists. Economic growth is generally helpful for economic welfare. However, people with negative expectations about economic growth may believe that growth is generated by new and existing technologies that eventually destroy individual livelihoods and relationships (Acemoglu, 2009). Individuals with stronger economic concerns might therefore hold more negative views of science, technology, and scientists. Additionally, the COVID-19 pandemic may have intensified such negative thoughts among people worried about national and economic issues, because COVID-19 caused massive fear and economic shock across countries, communities, and individuals. This, in turn, could lead some people to conclude that science and scientists are unreliable and unable to help.

Previous research used value predisposition variables to test public attitudes toward science (Brossard & Nisbet, 2007; Ho et al., 2010). In addition to religious affiliation and ideology, we directly asked the public about the importance of values and attitudes relative to scientific knowledge. Our findings illustrate that members of the public who emphasize values and attitudes tend to hold more pessimistic views of science and technology. People who value balancing the needs of humans and nature may worry about the effects of science and technology on the environment and may draw on their existing scientific knowledge to assess the harms of science (Hochschild & Sen, 2015).

We found further compelling evidence in our survey, taken during the COVID-19 pandemic, of political polarization around science and of a consistent, relatively strict adherence to the party line on problems identified by science and their related solutions, as seen in earlier studies (Malka et al., 2009). These sharp and continuing political divisions may create greater challenges for funding and using science and technology to identify and solve our pressing public problems. Our findings on the predominance of political ideology further emphasize earlier research showing that individuals who are more conservative tend to be less optimistic about science and technology and less likely to trust scientists. The politics of science is particularly significant because it divides the citizenry into hostile camps. In addition, science becomes an action arena through which divisions are amplified and echoed throughout the public, often bringing potential policy solutions and government action to a standstill.

Our study directly measured the public’s optimism and pessimism toward science. Our findings illustrate that science optimism and pessimism are related to gene drive technology policy: respondents’ general science and technology orientation has a strong influence on policy support. As one might expect, respondents who are more optimistic about science are more supportive of gene drive policy, while respondents who expressed more pessimistic views tend to oppose policies that would advance gene drives. If, as Nisbet and Markowitz (2014) found, optimistic and pessimistic assessments are stable, and, according to the theory of deference to scientific authority, formed early in life (Brossard & Nisbet, 2007), then near-term avenues for altering trust in science are limited to the other independent variables, chiefly ideology and concern. Long-term influences could perhaps occur through changes in primary and secondary science education and in depictions in the media.

This study identifies several important points that lead to a greater understanding of public views on science, scientists, and biotechnology policy. We designed and used questionnaire items to capture precisely both the optimistic and pessimistic dimensions of opinion about science and technology under conditions of stress related to the pandemic. We also used a measure of trust in scientists that captures opinion about the multiple dimensions of their role as knowledge generators, advisors, and informers. We provide additional information and understanding of the role of political polarization in shaping public views of science, scientists, and biotechnology policy. The empirical validation of the concepts of science optimism and pessimism as significant predictors of public support for gene drive policy should add a dimension to future research investigating the public’s complex relationship with science and policy. Our results are not new observations, but the evidence of these sentiments during the COVID-19 experience indicates a possible expanding polarization that could threaten national trust in science processes and the scientists who use them for discoveries so useful to our continued health, safety, and security.

Acknowledgements

This work is supported by Agriculture and Food Research Initiative Grant No. 2018-67023-27676/Accession No. 1015199 from the USDA National Institute of Food and Agriculture. Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the view of the U.S. Department of Agriculture.

Appendix

Table A1. Pearson correlations.

Note: 1 = Science optimism; 2 = science pessimism; 3 = trust in scientists; 4 = environmental concern; 5 = economic and national security concern; 6 = values and attitudes; 7 = political ideology; 8 = religiosity; 9 = age; 10 = White, non-Hispanic; 11 = education; 12 = gender (female); 13 = household income; 14 = perceived knowledge; 15 = objective knowledge.

Footnotes

* Significant at the .05 level.

1 Prior to the gene drive policy questions, a 4-minute video briefly describing gene drives was provided to the representative sample of Americans included in this survey.

References

Acemoglu, D. (2009). Introduction to modern economic growth. Princeton University Press.Google Scholar
Achterberg, P., De Koster, W., & Waal, J. (2015). A science confidence gap: Education, trust in scientific methods, and trust in scientific institutions in the United States, 2014. Public Understanding of Science, 26(6), 117. https://doi.org/10.1177/0963662515617367Google Scholar
Akin, H., Cacciatore, M. A., Yeo, S. K., Brossard, D., Scheufele, D. A., & Xenos, M. A. (2021). Publics’ support for novel and established science issues linked to perceived knowledge and deference to science. International Journal of Public Opinion Research, 33(2), 422431. https://doi.org/10.1093/ijpor/edaa010CrossRefGoogle Scholar
Albrecht, D. (2022). Vaccination, politics and COVID-19 impacts. BMC Public Health, 22(1), Article 96. https://doi.org/10.1186/s12889-021-12432-xCrossRefGoogle ScholarPubMed
Algan, Y., Cohen, D., Davoine, E., Foucault, M., & Stantcheva, S. (2021). Trust in scientists in times of pandemic: Panel evidence from 12 countries. Proceedings of the National Academy of Sciences, 118(40), e2108576118. https://doi.org/10.1073/pnas.2108576118CrossRefGoogle ScholarPubMed
Allum, N. (2007). An empirical test of competing theories of hazard-related trust: The case of GM food. Risk Analysis, 27(4), 935946. https://doi.org/10.1111/j.1539-6924.2007.00933.xCrossRefGoogle ScholarPubMed
Allum, N., Sturgis, P., Tabourazi, D., & Brunton-Smith, I. (2008). Science knowledge and attitudes across cultures: A meta-analysis. Public Understanding of Science, 17(1), 3554. https://doi.org/10.1177/0963662506070159CrossRefGoogle Scholar
Bauer, M. W., Allum, N., & Miller, S. (2007). What can we learn from 25 years of PUS survey research? Liberating and expanding the agenda. Public Understanding of Science, 16(1), 7995. https://doi.org/10.1177/0963662506071287CrossRefGoogle Scholar
Besley, J. C., & Shanahan, J. (2005). Media attention and exposure in relation to support for agricultural biotechnology. Science Communication, 26(4), 347367. https://doi.org/10.1177/1075547005275443CrossRefGoogle Scholar
Betsch, C. (2020). How behavioural science data helps mitigate the COVID-19 crisis. Nature Human Behaviour, 4(5), 438. https://doi.org/10.1038/s41562-020-0866-1CrossRefGoogle ScholarPubMed
Bol, D., Giani, M., Blais, A., & Loewen, P. J. (2021). The effect of COVID-19 lockdowns on political support: Some good news for democracy? European Journal of Political Research, 60(2), 497505. https://doi.org/10.1111/1475-6765.12401CrossRefGoogle Scholar
Bolsen, T., Druckman, J. N., & Cook, F. L. (2014). The influence of partisan motivated reasoning on public opinion. Political Behavior, 36(2), 235262. https://doi.org/10.1007/s11109-013-9238-0CrossRefGoogle Scholar
Bottini, M., Rosato, N., Gloria, F., Adanti, S., Corradino, N., Bergamaschi, A., & Magrini, A. (2011). Public optimism towards nanomedicine. International Journal of Nanomedicine, 6, 34733485. https://doi.org/10.2147/IJN.S26340CrossRefGoogle ScholarPubMed
Brossard, D., & Nisbet, M. C. (2007). Deference to scientific authority among a low information public: Understanding U.S. opinion on agricultural biotechnology. International Journal of Public Opinion Research, 19(1), 2452. https://doi.org/10.1093/ijpor/edl003CrossRefGoogle Scholar
Brossard, D., Scheufele, D. A., Kim, E., & Lewenstein, B. V. (2009). Religiosity as a perceptual filter: Examining processes of opinion formation about nanotechnology. Public Understanding of Science, 18(5), 546558. https://doi.org/10.1177/0963662507087304CrossRefGoogle Scholar
Callaghan, T., Moghtaderi, A., Lueck, J. A., Hotez, P., Strych, U., Dor, A., Fowler, E. F., & Motta, M. (2021). Correlates and disparities of intention to vaccinate against COVID-19. Social Science & Medicine, 272, 113638. https://doi.org/10.1016/j.socscimed.2020.113638CrossRefGoogle ScholarPubMed
Chao, Z., & Wei, H. (2009). Study of the gender difference in scientific literacy of Chinese public. Science, Technology and Society, 14(2), 385406. https://doi.org/10.1177/097172180901400209CrossRefGoogle Scholar
Coleman, J. S. (1994). Foundations of social theory. Harvard University Press.Google Scholar
Cvetkovich, G., & Nakayachi, K. (2007). Trust in a high‐concern risk controversy: A comparison of three concepts. Journal of Risk Research, 10(2), 223237. https://doi.org/10.1080/13669870601122519CrossRefGoogle Scholar
Darmofal, D. (2005). Elite cues and citizen disagreement with expert opinion. Political Research Quarterly, 58(3), 381–395. https://doi.org/10.1177/106591290505800302
Durant, J. (1999). Participatory technology assessment and the democratic model of the public understanding of science. Science and Public Policy, 26(5), 313–319. https://doi.org/10.3152/147154399781782329
Ecklund, E. H. (2012). Science vs. religion: What scientists really think (Reprint ed.). Oxford University Press.
Eichengreen, B., Aksoy, C. G., & Saka, O. (2021). Revenge of the experts: Will COVID-19 renew or diminish public trust in science? Journal of Public Economics, 193, 104343. https://doi.org/10.1016/j.jpubeco.2020.104343
Evans, M. S. (2012). Supporting science: Reasons, restrictions, and the role of religion. Science Communication, 34(3), 334–362. https://doi.org/10.1177/1075547011417890
Frewer, L. J., Howard, C., & Shepherd, R. (1997). Public concerns in the United Kingdom about general and specific applications of genetic engineering: Risk, benefit, and ethics. Science, Technology, & Human Values, 22(1), 98–124. https://doi.org/10.1177/016224399702200105
Frewer, L. J., Howard, C., & Shepherd, R. (1998). Understanding public attitudes to technology. Journal of Risk Research, 1(3), 221–235. https://doi.org/10.1080/136698798377141
Funk, C. (2020, February 12). Key findings about Americans’ confidence in science and their views on scientists’ role in society. Pew Research Center. https://www.pewresearch.org/fact-tank/2020/02/12/key-findings-about-americans-confidence-in-science-and-their-views-on-scientists-role-in-society/
Funk, C., Kennedy, B., & Johnson, C. (2020, May 21). Trust in medical scientists has grown in U.S., but mainly among Democrats. Pew Research Center. https://www.pewresearch.org/science/2020/05/21/trust-in-medical-scientists-has-grown-in-u-s-but-mainly-among-democrats/
Funk, C., Rainie, L., & Page, D. (2015, January 29). Public and scientists’ views on science and society. Pew Research Center. https://www.pewresearch.org/internet/wp-content/uploads/sites/9/2015/01/PI_ScienceandSociety_Report_012915.pdf
Funk, C. (2016). Why Americans are wary of using technology to ‘enhance’ humans. Pew Research Center. https://www.pewresearch.org/short-reads/2016/08/04/why-americans-are-wary-of-using-technology-to-enhance-humans/
Gauchat, G. (2011). The cultural authority of science: Public trust and acceptance of organized science. Public Understanding of Science, 20(6), 751–770. https://doi.org/10.1177/0963662510365246
Gauchat, G. W. (2008). A test of three theories of anti-science attitudes. Sociological Focus, 41(4), 337–357. https://doi.org/10.1080/00380237.2008.10571338
Gauchat, G. W. (2012). Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010. American Sociological Review, 77(2), 167–187. https://doi.org/10.1177/0003122412438225
Hannibal, B., Liu, X., & Vedlitz, A. (2016). Personal characteristics, local environmental conditions, and individual environmental concern: A multilevel analysis. Environmental Sociology, 2(3), 286–297. https://doi.org/10.1080/23251042.2016.1197355
Hardin, R. (2002). Trust and trustworthiness. Russell Sage Foundation.
Hazlett, A., Molden, D., & Sackett, A. (2011). Hoping for the best or preparing for the worst? Regulatory focus and preferences for optimism and pessimism in predicting personal outcomes. Social Cognition, 29(1), 74–96. https://doi.org/10.1521/soco.2011.29.1.74
Hendriks, F., Kienhues, D., & Bromme, R. (2016). Trust in science and the science of trust. In Blöbaum, B. (Ed.), Trust and communication in a digitized world: Models and concepts of trust research (pp. 143–159). Springer International. https://doi.org/10.1007/978-3-319-28059-2_8
Higgins, E. T. (1997). Beyond pleasure and pain. American Psychologist, 52(12), 1280–1300. https://doi.org/10.1037/0003-066X.52.12.1280
Ho, S. S., Scheufele, D. A., & Corley, E. A. (2010). Making sense of policy choices: Understanding the roles of value predispositions, mass media, and cognitive processing in public attitudes toward nanotechnology. Journal of Nanoparticle Research, 12(8), 2703–2715. https://doi.org/10.1007/s11051-010-0038-8
Hochschild, J., Crabill, A., & Sen, M. (2012). Technology optimism or pessimism: How trust in science shapes policy attitudes about genomic science. Brookings Issues in Technology Innovation, 21. https://www.brookings.edu/wp-content/uploads/2016/06/genomic-science.pdf
Hochschild, J., & Sen, M. (2015). Technology optimism or pessimism about genomic science: Variation among experts and scholarly disciplines. Annals of the American Academy of Political and Social Science, 658(1), 236–252. https://doi.org/10.1177/0002716214558205
Horrow, C., Pacyna, J. E., Sutton, E. J., Sperry, B. P., Breitkopf, C. R., & Sharp, R. R. (2019). Assessing optimism and pessimism about genomic medicine: Development of a genomic orientation scale. Clinical Genetics, 95(6), 704–712. https://doi.org/10.1111/cge.13535
Hutmacher, F., Reichardt, R., & Appel, M. (2022). The role of motivated science reception and numeracy in the context of the COVID-19 pandemic. Public Understanding of Science, 31(1), 19–34. https://doi.org/10.1177/09636625211047974
Jasanoff, S., Markle, G., Peterson, J., & Pinch, T. (Eds.) (1995). Handbook of science and technology studies (Revised ed.). Sage Publications. https://doi.org/10.4135/9781412990127
Johnson, B. B., & Dieckmann, N. F. (2020). Americans’ views of scientists’ motivations for scientific work. Public Understanding of Science, 29(1), 2–20. https://doi.org/10.1177/0963662519880319
Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), Article 10. https://doi.org/10.1038/nclimate1547
Kee, H. W., & Knox, R. E. (1970). Conceptual and methodological considerations in the study of trust and suspicion. Journal of Conflict Resolution, 14(3), 357–366. https://doi.org/10.1177/002200277001400307
Kellstedt, P. M., Zahran, S., & Vedlitz, A. (2008). Personal efficacy, the information environment, and attitudes toward global warming and climate change in the United States. Risk Analysis, 28(1), 113–126. https://doi.org/10.1111/j.1539-6924.2008.01010.x
Kitt, S., Axsen, J., Long, Z., & Rhodes, E. (2021). The role of trust in citizen acceptance of climate policy: Comparing perceptions of government competence, integrity and value similarity. Ecological Economics, 183, 106958. https://doi.org/10.1016/j.ecolecon.2021.106958
Lee, J. J. (2021). Party polarization and trust in science: What about Democrats? Socius, 7, 23780231211010100. https://doi.org/10.1177/23780231211010101
Legge, J. S. Jr., & Durant, R. F. (2010). Public opinion, risk assessment, and biotechnology: Lessons from attitudes toward genetically modified foods in the European Union. Review of Policy Research, 27(1), 59–76. https://doi.org/10.1111/j.1541-1338.2009.00427.x
Li, N., & Qian, Y. (2022). Polarization of public trust in scientists between 1978 and 2018: Insights from a cross-decade comparison using interpretable machine learning. Politics and the Life Sciences, 41(1), 45–54. https://doi.org/10.1017/pls.2021.18
Liberman, N., Molden, D. C., Idson, L. C., & Higgins, E. T. (2001). Promotion and prevention focus on alternative hypotheses: Implications for attributional functions. Journal of Personality and Social Psychology, 80(1), 5–18. https://doi.org/10.1037/0022-3514.80.1.5
Lin, C., Tu, P., & Beitsch, L. M. (2021). Confidence and receptivity for COVID-19 vaccines: A rapid systematic review. Vaccines, 9(1), Article 1. https://doi.org/10.3390/vaccines9010016
Liu, X., Vedlitz, A., & Shi, L. (2014). Examining the determinants of public environmental concern: Evidence from national public surveys. Environmental Science & Policy, 39, 77–94. https://doi.org/10.1016/j.envsci.2014.02.006
Liu, X., Vedlitz, A., Stoutenborough, J. W., & Robinson, S. (2015). Scientists’ views and positions on global warming and climate change: A content analysis of congressional testimonies. Climatic Change, 131(4), 487–503. https://doi.org/10.1007/s10584-015-1390-6
MacDonald, E. A., Balanovic, J., Edwards, E. D., Abrahamse, W., Frame, B., Greenaway, A., Kannemeyer, R., Kirk, N., Medvecky, F., Milfont, T. L., Russell, J. C., & Tompkins, D. M. (2020). Public opinion towards gene drive as a pest control approach for biodiversity conservation and the association of underlying worldviews. Environmental Communication, 14(7), 904–918. https://doi.org/10.1080/17524032.2019.1702568
Malka, A., Krosnick, J. A., & Langer, G. (2009). The association of knowledge with concern about global warming: Trusted information sources shape public thinking. Risk Analysis, 29(5), 633–647. https://doi.org/10.1111/j.1539-6924.2009.01220.x
Mann, M., & Schleifer, C. (2020). Love the science, hate the scientists: Conservative identity protects belief in science and undermines trust in scientists. Social Forces, 99(1), 305–332. https://doi.org/10.1093/sf/soz156
McCright, A. M., Dentzman, K., Charters, M., & Dietz, T. (2013). The influence of political ideology on trust in science. Environmental Research Letters, 8(4), 044029. https://doi.org/10.1088/1748-9326/8/4/044029
McCright, A. M., & Dunlap, R. E. (2011). The politicization of climate change and polarization in the American public’s views of global warming, 2001–2010. Sociological Quarterly, 52(2), 155–194. https://doi.org/10.1111/j.1533-8525.2011.01198.x
Mehta, M. D. (2002). Public perceptions of food safety: Assessing the risks posed by genetic modification, irradiation, pesticides, microbiological contamination and high fat/high calorie foods. Pierce Law Review, 1(1–2), 69–84.
Miller, J. D. (2004). Public understanding of, and attitudes toward, scientific research: What we know and what we need to know. Public Understanding of Science, 13(3), 273–294. https://doi.org/10.1177/0963662504044908
Miller, J. D., Pardo, R., & Niwa, F. (1997). Public perceptions of science and technology: A comparative study of the European Union, the United States, Japan and Canada. BBV Foundation.
Mooney, C. (2012). The Republican brain: The science of why they deny science—and reality. John Wiley & Sons.
Muñoz, A., Moreno, C., & Luján, J. L. (2012). Who is willing to pay for science? On the relationship between public perception of science and the attitude to public funding of science. Public Understanding of Science, 21(2), 242–253. https://doi.org/10.1177/0963662510373813
National Science Board. (2018). Science and technology: Public attitudes and understanding. In Science and Engineering Indicators 2018 (Chap. 7). National Science Foundation. https://www.nsf.gov/statistics/2018/nsb20181/assets/404/science-and-technology-public-attitudes-and-understanding.pdf
Nisbet, M. C. (2005). The competition for worldviews: Values, information, and public support for stem cell research. International Journal of Public Opinion Research, 17(1), 90–112. https://doi.org/10.1093/ijpor/edh058
Nisbet, M. C., & Markowitz, E. M. (2014). Understanding public opinion in debates over biomedical research: Looking beyond political partisanship to focus on beliefs about science and society. PLOS ONE, 9(2), e88473. https://doi.org/10.1371/journal.pone.0088473
Nisbet, M. C., Scheufele, D. A., Shanahan, J., Moy, P., Brossard, D., & Lewenstein, B. V. (2002). Knowledge, reservations, or promise? A media effects model for public perceptions of science and technology. Communication Research, 29(5), 584–608. https://doi.org/10.1177/009365002236196
Nowotny, H., Scott, P. B., & Gibbons, M. T. (2013). Re-thinking science: Knowledge and the public in an age of uncertainty. John Wiley & Sons.
Poortinga, W., & Pidgeon, N. F. (2003). Exploring the dimensionality of trust in risk regulation. Risk Analysis, 23(5), 961–972. https://doi.org/10.1111/1539-6924.00373
Prati, G., Pietrantoni, L., & Zani, B. (2011). Compliance with recommendations for pandemic influenza H1N1 2009: The role of trust and personal beliefs. Health Education Research, 26(5), 761–769. https://doi.org/10.1093/her/cyr035
Priest, S. H. (2006). The public opinion climate for gene technologies in Canada and the United States: Competing voices, contrasting frames. Public Understanding of Science, 15(1), 55–71. https://doi.org/10.1177/0963662506052889
Robinson, S. E., Ripberger, J. T., Gupta, K., Ross, J. A., Fox, A. S., Jenkins-Smith, H. C., & Silva, C. L. (2021). The relevance and operations of political trust in the COVID-19 pandemic. Public Administration Review, 81(6), 1110–1119. https://doi.org/10.1111/puar.13333
Scheufele, D. A., & Lewenstein, B. V. (2005). The public and nanotechnology: How citizens make sense of emerging technologies. Journal of Nanoparticle Research, 7(6), 659–667. https://doi.org/10.1007/s11051-005-7526-2
Siegrist, M. (2000). The influence of trust and perceptions of risks and benefits on the acceptance of gene technology. Risk Analysis, 20(2), 195–204. https://doi.org/10.1111/0272-4332.202020
Siegrist, M., & Zingg, A. (2014). The role of public trust during pandemics: Implications for crisis communication. European Psychologist, 19(1), 23–32. https://doi.org/10.1027/1016-9040/a000169
Slovic, P. (1987). Perception of risk. Science, 236(4799), 280–285. https://doi.org/10.1126/science.3563507
Smith, N., & Leiserowitz, A. (2014). The role of emotion in global warming policy support and opposition. Risk Analysis, 34(5), 937–948. https://doi.org/10.1111/risa.12140
Sparks, P., Shepherd, R., & Frewer, L. J. (1994). Gene technology, food production, and public opinion: A UK study. Agriculture and Human Values, 11(1), 19–28. https://doi.org/10.1007/BF01534445
Steel, B., List, P., Lach, D., & Shindler, B. (2004). The role of scientists in the environmental policy process: A case study from the American West. Environmental Science & Policy, 7(1), 1–13. https://doi.org/10.1016/j.envsci.2003.10.004
Stoutenborough, J. W., Bromley-Trujillo, R., & Vedlitz, A. (2014). Public support for climate change policy: Consistency in the influence of values and attitudes over time and across specific policy alternatives. Review of Policy Research, 31(6), 555–583. https://doi.org/10.1111/ropr.12104
Stoutenborough, J. W., Sturgess, S. G., & Vedlitz, A. (2013). Knowledge, risk, and policy support: Public perceptions of nuclear power. Energy Policy, 62, 176–184. https://doi.org/10.1016/j.enpol.2013.06.098
Stoutenborough, J. W., & Vedlitz, A. (2015). Knowledge, information, and views of climate change: An examination of coastal stakeholders along the Gulf of Mexico. Climate, 3(4), Article 4. https://doi.org/10.3390/cli3040983
Sturgis, P., & Allum, N. (2004). Science in society: Re-evaluating the deficit model of public attitudes. Public Understanding of Science, 13(1), 55–74. https://doi.org/10.1177/0963662504042690
Vaccarezza, L. S. (2007). The public perception of science and technology in a periphery society: A critical analysis from a quantitative perspective. Science, Technology and Society, 12(1), 141–163. https://doi.org/10.1177/097172180601200107
Wood, B. D., & Vedlitz, A. (2007). Issue definition, information processing, and the politics of global warming. American Journal of Political Science, 51(3), 552–568. https://doi.org/10.1111/j.1540-5907.2007.00267.x
Xiao, C. (2013). Public attitudes toward science and technology and concern for the environment: Testing a model of indirect feedback effects. Environment and Behavior, 45(1), 113–137. https://doi.org/10.1177/0013916511414875
Zhang, Y., Liu, X., & Vedlitz, A. (2020). How social capital shapes citizen willingness to co-invest in public service: The case of flood control. Public Administration, 98(3), 696–712. https://doi.org/10.1111/padm.12646
Ziman, J. (1991). Public understanding of science. Science, Technology, & Human Values, 16(1), 99–105. https://doi.org/10.1177/016224399101600106
Figure 1. Conceptual model explaining public attitudes toward science optimism and pessimism, trust in scientists, and policy support.
Figure 2. Univariate distributions of science optimism and pessimism.
Table 1. Descriptive statistics of variables.
Table 2. Factor loadings for public concern variables.
Table 3. Regression results: Science optimism, pessimism, and trust in scientists.
Table 4. Regression results: Gene drive policy support.
Table A1. Pearson correlations.