1. Introduction
Journalism scholars and educators have long abandoned the idea that the only skills required for journalism are those of gathering information and presenting it to the public—of observing, interviewing, and storytelling. This simple conception of journalism has been criticized by those keenly aware of failures in science reporting, who have come to emphasize that narrative aggregation is not enough: what matters is how journalists assess the information they gather, and which sources (human or documentary) they consider fit to provide it.
The controversy created by one disgraced researcher’s unfounded claims of a link between the measles, mumps, and rubella (MMR) vaccine and autism is a well-worn example of these difficulties. Despite scientific evidence to the contrary, many journalists in the 2000s reported on the issue as an open question that required equal consideration of “both sides”: those who thought there was a causal link between vaccines and autism, and those who did not. Certain politicians and anti-vaccine activists cited these media articles in their efforts to prompt regulatory changes in health care. Geographic clusters of parents who refused vaccination for their children appeared, and measles outbreaks in the United Kingdom, the United States, and Canada were attributed to the resulting non-vaccination of children (Rao & Andrade, Reference Rao and Andrade2011).
Many blamed journalists for these avoidable, unfortunate results: “The research never justified the media’s ludicrous overinterpretation,” Goldacre (Reference Goldacre2008) wrote. “If they had paid attention, the scare would never have even started” (244). The criticism is not simply that journalists reported on the issue with false balance, treating both sides as equally worthy of consideration. Science journalists ought not “air both sides of a debate when only one is correct” (Blum, Hatch, & Jackson, Reference Blum, Hatch and Jackson2020, 76), but the important point here is what is presumed to have caused such an error in judgment. Despite their best intentions, journalists had failed to consider readily available scientific evidence regarding vaccines and autism. Perhaps they did not know how to assess the available evidence, they were misled by their sources, or they wrongly judged the sufficiency of their investigations; in any case, a deficiency in their understanding of the topic had led them to identify as scientifically legitimate and open for debate a controversy that was neither of these things.Footnote 1
The inclination is to interpret the MMR-vaccine case as showing not the impossibility of good journalism on politicized scientific issues, but rather the dangers of bad journalism: if science journalists had done their job properly, the idea goes, their work would have served to illuminate scientific knowledge instead of obfuscating it—to defuse unnecessary controversy and public confusion concerning scientific issues. A competent science journalist should therefore have skills beyond information-gathering and storytelling that allow them to navigate the complex landscape bridging science and society.
I am interested in this normative perspective on science journalism. In what sense did journalists fail to grasp the relevant science regarding vaccines and autism, as their critics have claimed? More generally, to what epistemic norms ought we hold science journalists today? I call this the Epistemic Challenge for Science Journalism (ECSJ).
In this article, I aim to answer the ECSJ by bringing together insights from different disciplines: (i) practicing journalists and journalism educators; (ii) scholars of science communication, who are interested in the proper dissemination of scientific information; and (iii) epistemologists, who are interested in the philosophical study of knowledge. I describe this work as a project in applied epistemology (cf. Lackey, Reference Lackey2021). On the account I propose, science journalists—like all journalists—are governed by what I call a constitutive norm of confirmation: competent science journalists have “checked” the scientific facts they report. The central notion is thus confirmation, which I understand in a distinctive way, inspired by recent research on norms of editorial fact-checking.Footnote 2
Here is the plan. In Section 2, I detail the MMR-vaccine case alluded to above and outline the ECSJ. In Section 3, I present the dominant answer to the ECSJ in the science-communication and journalism-education literature, which I call the Knowledge-Based Solution, and I argue that it is unconvincing: knowledge of science (as it is conceptualized by these authors) is neither sufficient nor necessary for good science journalism. In Section 4, I propose an alternative, which I call the Confirmation-Based Solution, and I show that it is able to make sense of the MMR-vaccine case and other cases in a satisfying way. In Section 5, I turn my attention to recent debates about journalistic objectivity, and I argue that the Confirmation-Based Solution can respond to important concerns voiced by journalists and their audiences. Finally, Section 6 discusses my proposal in the context of philosophical debates about epistemic norms of assertion, and I argue that it represents a distinctive way of carving up the “epistemic territory” in nonideal contexts. Section 7 concludes.
2. The MMR-Vaccine Case and the ECSJ
The MMR-vaccine trouble began in 1998, when Andrew Wakefield, a physician who has since been banned from practicing medicine in Britain, and colleagues published a study in The Lancet claiming a possible causal link between the MMR vaccine and autism-related symptoms in children. At the time, the scientific and medical communities did not pay much attention to the paper; a professional consensus had already been reached that vaccines were safe, and Wakefield’s study did not provide convincing evidence to the contrary.Footnote 3 Science journalists largely followed suit, at least at first, careful to report on the story in its proper context and with the appropriate caveats (Goldacre, Reference Goldacre2008).
However, there was already growing anxiety among the public about the safety of childhood vaccines and possible environmental causes of autism (Goldacre, Reference Goldacre2008; Lewis & Speers, Reference Lewis and Speers2003; Offit, Reference Offit2007), and political actors soon intervened to cause a media fanfare (Brainard, Reference Brainard2013; Li, Stroud, & Jamieson, Reference Li, Stroud, Jamieson, Jamieson, Kahan and Scheufele2017; Rao & Andrade, Reference Rao and Andrade2011). By 2005, journalists in the United Kingdom and the United States were publishing articles about the “legitimate scientific debate” regarding vaccines (a quote from Schulman, Reference Schulman2005), most of which relied heavily on Wakefield’s claims as well as the testimony of concerned parents.
At the same time, the research that led to the 1998 paper was exposed as fraudulent, largely due to the efforts of an investigative journalist at The Sunday Times Magazine in the UK, Brian Deer, who spent years analyzing Wakefield and his colleagues’ scientific misconduct, including the unethical treatment of children, undisclosed conflicts of interest, and doctored medical histories (Godlee, Reference Godlee2011; Rao & Andrade, Reference Rao and Andrade2011).Footnote 4 Deer’s reporting, which led to The Lancet’s retraction of the paper in 2010, served to dampen the media’s practice of naïve “he-said/she-said” reporting on MMR vaccines. It is worth emphasizing, however, that Wakefield’s claims had never been taken seriously by the scientific community; by the time the paper was retracted, Wakefield’s exposure as fraudulent served only as ammunition for one side of a debate that was taking place within media, politics, and popular culture—not within science. As Brainard (Reference Brainard2013) writes for the Columbia Journalism Review:
Only a small group of [science] researchers ever even entertained the theory about autism. The [media] coverage rarely emphasized this, if noted it at all, and instead propagated misunderstanding about vaccines and autism and gave credence to what was largely a manufactured controversy.
Still today, the MMR-vaccine case is considered one of the most serious examples of flawed science journalism (see, e.g., Lewis & Speers, Reference Lewis and Speers2003; Mnookin, Reference Mnookin2011; Allsop, Reference Allsop2020).Footnote 5 A number of factors conspired to create this situation—including political, financial, and other structures, many of which were beyond any individual journalist’s control. However, in this article, I focus on a subset of concerns voiced by certain commentators, journalism educators, and scholars of science communication: those related to journalists’ epistemic responsibilities when covering scientific stories, and what can go wrong when those responsibilities are not met.Footnote 6 This focus on the epistemic norms of journalism corresponds to how practitioners and scholars across a number of disciplines—journalists (Blum et al., Reference Blum, Hatch and Jackson2020; Rensberger, Reference Rensberger2000), educators (Kovach and Rosenstiel, Reference Kovach and Rosenstiel2021), science communicators (Li et al., Reference Li, Stroud, Jamieson, Jamieson, Kahan and Scheufele2017), and applied epistemologists (Simion, Reference Simion2017)—have thought about the norms of journalism. Although each discipline brings its own perspective to the issue, there is broad agreement that the problem exemplified by the MMR-vaccine case is, at least in part, an epistemic one.
Journalists covering any science story, whether politicized or not, are expected to make a number of informed choices—regarding which story to report, whom to interview, which documentary sources to consult, which facts to feature and in which order, when to publish, and so on. All of these decisions affect whether a story will be considered epistemically defensible—by which I mean, as a first gloss, whether it will be considered reasonable, rigorous, and responsible science journalism. In the case of MMR-vaccine coverage, the criticism goes, the controversy’s severity and persistence were caused in part by science journalists’ poor choices, including their flawed assessment of what could be considered a scientific fact and whether Wakefield was qualified to provide them. These poor choices are at the heart of the criticism of “false balance” in media coverage of the issue.Footnote 7
Of course, the condemnation of these journalists’ choices as “poor” is a defensive attitude, formed in response to those critics of journalism who interpret the MMR-vaccine case as evidence that journalists are ill-suited to report on scientific stories at all. Footnote 8 These critics complain that most journalists do not have a scientific background and are not part of the scientific research community; moreover, they are bound to norms of relevance and newsworthiness in publishing that may disincentivize epistemically responsible pursuits. Especially in the context of polarization and science denialism, there may be no good reason to think that journalists are ever qualified to communicate scientific findings to the public.
I find this cynical attitude toward journalism unappealing. It is best not to prematurely dismiss the potential of competent science journalism—and, given the normative framework with which I am operating, it is also unnecessary. The goal of epistemology of journalism, as I see it, is not to provide a descriptive account of journalistic practices that accommodates all people who have once called themselves journalists. Rather, the goal is to articulate what journalists and people who study journalism think journalism can and should be, and the norms to which journalists hold themselves and their peers today.
Most journalists accept the criticism of past MMR-vaccine coverage but nevertheless insist that their profession is capable of reporting on such topics in principle; they protest that those who reported on the vaccine–autism link as a “legitimate scientific debate” failed to meet the epistemic norms of their profession.Footnote 9 The implication is that there is a divide between competent and incompetent journalism on the topic of vaccines—a divide captured by the journalists’ epistemic relationship to the science at issue.
One fundamental question is thus how to describe the competence that is required of science journalists, if we want to uphold the belief that good coverage of politicized or controversial scientific topics is, in principle, possible. I call this the ECSJ:
ECSJ: Which additional epistemic skills or attributes must a competent journalist possess in order to produce competent science journalism?
I assume that the ECSJ is not a hopeless, senseless, or misguided question—that the epistemological divide between competent and incompetent science journalism can be helpfully described, and that this description can be useful in making sense of cases such as the MMR-vaccine scandal.
The ECSJ is concerned only with the epistemic requirements of science journalism. We can assume, then, that the epistemic requirements of journalism generally—whatever they are, likely related to researching, interviewing, and storytelling—have already been met: the question is what is required to turn a good journalist into a good science journalist. Some people might believe that there is no substantial epistemic difference between journalism and science journalism; their answer to the ECSJ would then be trivial. Others, however, believe that competent generalist journalists are not necessarily fit to be science journalists—and thus, they might coherently criticize a journalist’s coverage of MMR vaccines without condemning their competence as a journalist tout court. These are the kinds of distinctions that can be made using the ECSJ as a guide.
As will become clear, I lean toward a solution to the ECSJ that does not rely on a substantial difference in epistemic requirements between journalism and science journalism. In this sense, I diverge from some of the other answers in the literature.
3. The Knowledge-Based Solution and Its Flaws
The beginning of an answer to the ECSJ can be found in recent proposals, touted by some journalism educators and science-communication scholars, to conceive of journalism as a “knowledge-based profession,” in which journalists use their “specialized knowledge” or “expertise” to report on topics such as science (Donsbach, Reference Donsbach2014; Nisbet & Fahy, Reference Nisbet and Fahy2015, Reference Nisbet, Fahy, Jamieson, Kahan and Scheufele2017; Patterson, Reference Patterson2013).
One of the earliest and strongest articulations of the knowledge-based journalism program comes from Patterson (Reference Patterson2013), a communications scholar who focused on the demands of political reporting in the United States. Patterson argued that journalists had gone too far in depending on the testimony of politicians and other authority figures instead of acquiring and using the specialized knowledge that would allow them to report only the truth on the topics in question. He blamed journalists’ “knowledge deficit” (66) for the practice of “he-said/she-said” reporting, according to which journalists covering a controversial topic published quotes from politicians on “either side” of the controversy. In principle, such a practice is meant to allow journalists to inform the public about the disagreement without “taking a side.” But because it involves deference to the loudest voices in an argument without acknowledgment of which is the most reasonable or credible, it also fosters a professional culture in which “‘power’ rather than truthfulness is the operative standard” (52).
Patterson proposed knowledge-based journalism as a revolution in journalism education. He encouraged journalists to gain knowledge on whatever topic they would like to report on in advance, so that they could not be manipulated by the sources whom they subsequently interviewed. Other journalists and scholars followed suit: Donsbach (Reference Donsbach2014) proposed that journalism become the “new knowledge profession”; Stephens (Reference Stephens2014) promoted a new standard of “wisdom journalism.” Over the following years, many journalism education programs began training their students in specialist topics, such as politics, economics, and science.Footnote 10
Although Patterson wrote only briefly about science journalism, he classified it among the specializations that required targeted subject-matter training. Scholars of science communication have picked up on the idea that science journalists’ “knowledge deficit” could explain their past epistemic failures, including the problematic reporting of various controversies (such as the MMR-vaccine case) as “he-said/she-said” affairs.Footnote 11 Indeed, similar ideas have long circulated in science-communication circles. Already in 2005, Mooney and Nisbet had written about problematic “balanced” reporting on the topic of teaching creationism in American schools:
What is a good editor to do about the very real collision between a scientific consensus and a pseudo-scientific movement that opposes the basis of that consensus? At the very least, newspaper editors should think twice about assigning reporters who are fresh to the evolution issue…. As journalism programs across the country systematically review their curriculums and training methods, the evolution “controversy” provides strong evidence in support of the contention that specialization in journalism education can benefit not only public understanding, but also the integrity of the media.
Drawing from Patterson and others, I call this the Knowledge-Based Solution to the ECSJ:
KBS: In order to produce competent science journalism, journalists need to have knowledge of science.
In other words, knowledge of science is all that a competent journalist requires in order to become a competent science journalist.
The KBS is certainly appealing. It is easy to understand its popularity in media scholarship and journalism education circles; few people would resist the idea that science journalism requires some specialist skillset. However, I do not think it is helpful to conceive of that specialization in terms of knowledge of science. As it stands, the formulation is too vague to be of use: What does it mean for a journalist to “know science”? If this still feels like a substantive question, that is because—so long as we allow “knowledge” to be either theoretical or practical—it simply returns us to the ECSJ: knowledge of what, exactly, is doing the epistemic work in competent science journalism? The knowledge-based journalism literature is often somewhat more precise in its formulations, although not always uniform, and so I propose that the KBS can be faithfully clarified in three different ways:
KBS-(1): Science journalists ought to know science in a practical sense: they ought to be scientists.
KBS-(2): Science journalists ought to know science in a rudimentary sense: they ought to be science-literate.Footnote 12
KBS-(3): Science journalists ought to know science in a specialized sense: they ought to know scientific facts about the subject of their reporting.
In the rest of this section, I will argue against each of these elaborations of the KBS. The norms articulated by KBS-(1), -(2), and -(3) miss their mark: they equip the science journalist with too much of the wrong kind of epistemic good, and not enough of the right kind. Along the way, we will see the outlines of what I take to be a more credible solution to the ECSJ, which is grounded not in knowledge but in a more fine-grained epistemic competence.
Let us begin with the excessive requirements of KBS-(1): to expect that science journalists be scientists is clearly to go too far. After all, science is too vast and specialized a field for anyone to become a competent practitioner in the breadth of subdisciplines that are covered by the typical science journalist. Moreover, journalists are rarely full-fledged members of the communities on which they report: we do not expect sports journalists to be athletes, political commentators to be politicians, or food critics to be chefs. Is there any reason to think that science is so much more impenetrable than these other fields? Some of the best work in science journalism—for example, the work published in Undark, Quanta, and Nautilus magazines—is created by nonscientists.
On the contrary, science journalists’ skills are not in science but in journalism: they are responsible, rigorous communicators of science. As Boyce Rensberger, a veteran science journalist, put it in 2014 (quoted in Blum et al., Reference Blum, Hatch and Jackson2020): “Good science journalism stands apart from science.”
A journalist’s job is typically to report on other disciplines: to open a line of communication from the outside of a discipline to the inside and back out again. In sociology, this is described as knowledge-brokering: science journalists are “translators” who “move back and forth between different social worlds” and distribute “knowledge that has been deassembled and reassembled” (Meyer, Reference Meyer2010, 122–123). Although this requires a variety of skills, the skills are arguably not scientific; they are something else entirely.Footnote 13 Science journalists possess specialized skills that go beyond the scientific domain: they are able to assess scientific claims, contextualize them (perhaps alongside nonscientific claims), and communicate them to the public.
In the philosophy of science, a complementary insight comes from the distinction between contributory and interactional expertise (Collins & Evans, Reference Collins and Evans2015). Roughly, contributory expertise is the ability to contribute to some area, while interactional expertise is the ability to discuss it; applied to science, “the core idea is that someone possessing interactive expertise may communicate with scientists in a discipline without being able to contribute to their research” (Gerken, Reference Gerken2022, 25–26). Science journalists are typically described as professionals who possess interactional expertise but lack contributory expertise—and this is not framed as a criticism but as a reflection of the demands of their discipline: they need to communicate with scientists and about science, not conduct scientific research themselves.Footnote 13
KBS-(1) is therefore an unsatisfying solution to the ECSJ, and this much seems to me to be relatively uncontroversial. Most KBS proponents seem to be arguing instead for either KBS-(2) or KBS-(3), or a combination of the two: science journalists ought to be science-literate and/or know specialized facts about the scientific domain on which they are reporting. However, it is difficult to see how declarative knowledge of (either rudimentary or specialized) scientific facts could capture the important epistemic difference between competent and incompetent science journalists.
First, science is not a body of facts—and a proper understanding of science does not result simply from declarative knowledge of scientific truths. To equate the two is to repeat the mistake committed by proponents of the so-called Deficit Model in science communication, which has been extensively criticized.Footnote 14 According to this model, “scientists and other experts possess crucial knowledge that nonscientists lack, and the purpose of science communication is to ‘fill the knowledge gaps’ in a largely one-way flow of information from expert to layperson” (Reincke et al., Reference Reincke, Bredenoord and van Mill2020, 1). As the idea goes, once the public knows the relevant scientific facts, they will be able to perform appropriate inferences and consequently make informed decisions regarding science policy and other applied matters.
The KBS is slightly different from the Deficit Model: it is not targeted at the kind of knowledge that the public needs to understand science, but rather the kind of knowledge that journalists need in order to communicate science. However, both positions are vulnerable to the same kinds of objections. Gerken (Reference Gerken2022) describes the Deficit Model as “an optimistic enlightenment picture… [that is] largely insensitive to the epistemically non-ideal aspects of human psychology” (176). In short: knowledge of scientific facts is not enough to meet the goals of science communication.
Here, again, the literature on scientific expertise is helpful: it is standardly accepted that someone could know every fact about a certain discipline while nevertheless not being an expert in that discipline—in neither an interactional nor a contributory sense.Footnote 15 Instead of knowing every fact that has been previously published about the MMR vaccine, a science journalist needs to know how those facts hang together, which are the most relevant to the topic at hand, which are robust and why, and what kind of evidence it would take to confirm or refute them; they need to know how a study such as Wakefield’s ought to be conducted, and the context in which its results ought to be understood. This requires a kind of epistemic competence that goes beyond the simple propositional knowledge of the subject at hand. In this way, knowledge of scientific facts does not suffice for science journalism.
Furthermore, science journalists do not simply aggregate scientific facts from journal articles or conduct interviews with scientists; they also provide services beyond a scientific purview. They contextualize, explain, discuss, humanize, and criticize scientific findings—and to do so, they engage with a number of sources, only some of which are properly scientific. A science journalist must consequently navigate the epistemic world outside of science just as well as that inside of science; part of what it means to be a good science reporter is knowing how to transition seamlessly between these worlds. Propositional-knowledge readings of the KBS cannot capture this broader skillset that is required of the science journalist: in order for them to do their job competently, their epistemic relationship to science must be appropriately contextualized.
KBS-(3) seems to imply that a competent science journalist ought to know the facts regarding the topic they intend to report on in advance of reporting on it. However, this gets the order of events wrong: a journalist’s job is to research—interview experts, uncover and analyze documentary sources, and so forth—in order to gain access to the facts at issue. Instead of expecting that science journalists know the “state of the field” regarding any specific subset of science before their research begins, we expect that they know how to properly uncover it in the course of their research. Consequently, their assessment of the sources and evidence they encounter will not be an assessment based on knowledge of scientific facts, but rather one based on reason, rigor, and other such epistemic virtues. The requirements of KBS-(2) and -(3) go too far.Footnote 16
No propositional knowledge about vaccines and autism, as prescribed by KBS-(3), will help a journalist determine which skeptical questions are appropriate when interviewing Wakefield upon the publication of the 1998 Lancet study. The important thing is knowing how to use one’s knowledge in the appropriate way, and how to corroborate the information one has gathered (including information that one does not yet know to be true). Indeed, the KSJ Science Editing Handbook (Blum et al., Reference Blum, Hatch and Jackson2020) opens with the idea that science journalists “must ask tough questions, analyze information, and demand that extraordinary claims be supported by extraordinary evidence” (7); these tasks do not require knowledge of science, but rather appropriate skepticism toward scientific claims.
Finally, a requirement of knowledge of science is neither realistic nor useful. Science is a constantly developing affair; when science journalists report on a story, they are often (though not always) engaging with a domain in which the facts may not yet be entirely settled—and the details of which certainly go beyond the basic facts that are used to evaluate scientific literacy. Scientific domains are made up not only of facts but also of conjectures, hypotheses, inferences, theories, and the like. Knowledge, understood as justified true belief, is therefore not an epistemic achievement that science journalists should expect to always reach when reporting on developing stories. Journalists might never know the truth about the possible side effects of vaccination—after all, scientists might not either. Rather, we expect journalists to report on established information while aiming for the truth. We expect the information they publish to be well-grounded in science and attained by rational means.
I have argued that KBS-(1), -(2), and -(3) do not present viable solutions to the ECSJ. Knowledge of science—cashed out as knowledge of how to be a scientist, of rudimentary scientific facts, or of specialized scientific facts—will not provide a satisfactory solution to whatever problem was ailing the incompetent science journalist. Although some kind of scientific knowledge is certainly part of a competent science journalist’s toolbox, this knowledge is derivative of more fundamental epistemic skills and attributes, and it is typically not of the kind envisioned by the KBS literature.
Based on the considerations that have been outlined in this section, the ECSJ will be answered best by an account that takes into consideration more fine-grained epistemic notions, such as responsibility, rigor, and reasonableness. The remainder of this article is dedicated to articulating one such account, grounded in evidentialist epistemic norms.
4. The Confirmation-Based Solution
I argued in Section 3 that science journalists ought to report claims that are well-grounded in science; this task involves communicating about science, asking appropriate questions of experts, and identifying scientific facts and their justification. In summary, we might say that science journalists need to be able to competently navigate the epistemic world of science, including scientists’ operative notions of facts, justification, and the rest. Science journalists thus need to possess whatever epistemic skills or attributes will allow them to do so reliably—which, as we have established, need not be knowledge of science, but rather something that is both more general, to accommodate the broad scope of journalists’ activities, and more precise, to allow for more fine-grained distinctions.
In my opinion, the best way to answer the ECSJ is to reflect on the epistemic norms of journalism in general, and then to determine what it would take for them to be properly applied in a scientific context. This supports the idea—contra the KBS above—that the epistemic norms of science journalism are no different in kind from those of journalism in general; science journalism is just journalism applied (in part) to the scientific world. (In this sense, I take an appropriate solution to the ECSJ to be concerned with the refinement of journalistic skills, as opposed to the addition of wholly different epistemic goods such as scientific knowledge.) I propose that we answer the ECSJ, then, by turning to practicing journalists.
Kovach and Rosenstiel (Reference Kovach and Rosenstiel2021) have proposed an account of journalism as essentially a “discipline of verification,” in which journalists aim to find not just “the fact” but also the “truth about the fact” (71). Journalistic objectivity, on their account, is grounded in journalists’ checking of the facts before those facts are published. Indeed, empirical research has found that journalists take verification to be the “essential element” of their practice (Shapiro, Brin, Bédard-Brûlé, & Mychajlowycz, Reference Shapiro, Brin, Bédard-Brûlé and Mychajlowycz2013).
In line with these insights, one can argue—as we have in Baker and Fairbank (Reference Baker and Fairbank2022)—that journalism is guided by one fundamental norm: that a journalist should report a piece of information only if that information has been checked. This Confirmation Norm, as we might call it, applies to all kinds of journalism—daily news and longform, print and broadcast, opinion and explanatory. Under it, journalists are bound to check every fact they wish to publish: names, dates, descriptions, quotes from sources and the information included within those quotes, causal explanations, the factual bases of opinions, and more. There must consequently always be two distinct steps to establishing a statement in a journalistic story: first, it is reported, and second, it is “checked.” The first step is exploratory, while the second is more focused: the journalist already has a fact and a set of sources from which it has been reported, and their job is to check that the fact is well supported by the appropriate kind of evidence. I discuss the Confirmation Norm further in Sections 5 and 6.
This norm can be implemented in a variety of ways depending on the resource constraints of the journalist in question. Some (lucky) newsrooms can afford to have an in-house fact-checking department and devote substantial time to the process, while others work on tighter deadlines with fewer staff. The important point is not how checking is implemented, practically speaking, but rather that every piece of information is checked—and this, I take it, is constitutive of the practice.Footnote 17 I thus propose the Confirmation-Based Solution to the ECSJ:
CBS: In order to produce competent science journalism, journalists need to be able to check scientific facts.
In other words, the refinement (in the scientific realm) of a competent journalist’s prior ability to check facts is all that is required for them to become a competent science journalist.
To evaluate this proposal, one first needs to understand what is meant by “checking facts.” Fact-checking is a nuanced term in journalistic circles—it is, as I argue in Section 6, an epistemic concept that extends beyond straightforward epistemological concerns, at least as philosophers have traditionally understood them. Therefore, it is worth examining the practice of fact-checking in more detail before evaluating CBS’s suitability as a solution to the ECSJ.
Editorial fact-checking is unrelated to political or media fact-checking, although the name is similar; it is an independent editorial process in journalism that takes place after a story has been reported and edited and before it is published, and which is usually not visible to the public.Footnote 18 Once a reporter has worked on a story, a fact-checker—either a different journalist or the same original reporter, depending on the structure of the newsroom in question—will go through the story and independently confirm each statement; they call back every person who was interviewed (to ensure the accuracy of the quotes and information attributed to them), go through every document referenced (to ensure that all information has been properly conveyed and attributed), and re-research the factual context of every evaluative claim (to ensure that everything is understood and nothing important overlooked). As they work, the fact-checker notes suggested changes to the piece, ranging from the correction of factual inaccuracies, to the rephrasing of a sentence, to the revision of (sections of) a piece, to the cancellation of its publication. The editorial team makes all necessary modifications to the story before publication—thus ensuring, in principle, that no part of the published work can be objected to “on factual grounds.”Footnote 19
Editorial fact-checking has a long history in professional journalism, and it is considered to be the normative ideal for long-form and investigative journalists (Baker & Fairbank, Reference Baker and Fairbank2022). Therefore, it is a good place to look for an explicit codification of the norm of confirmation that governs journalism in general. As Borel (Reference Borel2018) emphasizes, even if editorial fact-checking is not practiced by every organization, the fact-checking mindset remains: in daily newsrooms, where the nature of breaking news does not allow for extended checking, “stories go through layers of experienced editors, who challenge iffy claims or storylines and, when necessary, send the journalists to do more reporting” (2). Copy editors and lawyers also play the role of fact-checker in small newsrooms, as do specialized research editors.Footnote 20
Editorial fact-checking is not as simple as it might initially sound; journalists are used to thinking about the confirmation of facts in a way that might not line up with standard assumptions. In contemporary journalism, these insights have led to the establishment of nuanced norms of confirmation.Footnote 21
Journalists standardly rely on two kinds of sources: gathered sources (documents, audio and video recordings, online resources, books and articles, etc.) and interviewed sources (people, experts, witnesses, etc.). One of the central notions in play when checking journalistic facts is that of an authoritative source: given a particular statement and a source taken to be evidence for that statement, a journalist needs to determine whether the evidence is sufficient—whether the source in question really does confirm the fact in question, and whether it can be considered “authoritative” in that regard. Authority is understood as fact-relative and gradable: given a certain fact, certain sources are more authoritative for confirming or refuting it, while others are less so or not at all.Footnote 22
The general principle is that every fact in a story should be confirmed with at least one authoritative source or several independent non-authoritative sources. If that is not possible, the fact in question may need to be rephrased or removed, or an explicit acknowledgment of the insufficiency of evidence may need to be added, thereby weakening the strength of the assertion. Journalists also rely heavily on the notion of triangulation, which involves using different sources to corroborate the same fact. As a matter of principle, if several authoritative sources exist to check a fact, the journalist should consult them all and seek to explain any discrepancies among them.
A large part of journalists’ work during the confirmation stage, then, is the assessment of evidence and the search for further evidence. Assessment of evidence, here, has a wide scope: journalists pay attention not only to straightforward accuracy (are the individual facts, as far as we can tell, true?) but also to context and implication (is the audience of this work likely to conclude from it things that are, as far as we can tell, true?) (Blum et al., Reference Blum, Hatch and Jackson2020). Ivan Oransky, founder of Retraction Watch, says that a good fact-checker “digs in” to find what is missing from a story (Borel, Reference Borel2018, 22). They do so by consulting every piece of evidence that ought to be consulted on the topic at hand (they are not restricted to the reporter’s original sources), as well as speaking with relevant experts in order to review the facts and their assessment of those facts. The “expert interview,” in which a journalist calls an expert “on background” to discuss the facts of a story before it is published and make sure no important context is missing, is a foundation of editorial fact-checking.
Further norms govern interviewing practices and the assessment of evidence during fact-checking; I do not discuss these here, although I return to them briefly in Sections 6 and 7. The central point is simply that such norms exist and do guide journalistic practice, and they are sufficient, I think, to capture the benefits of the CBS as an answer to the ECSJ. Editorial fact-checking is not foolproof, but it is something like a “sensitive” method (Becker, Reference Becker2007): if a journalist is minimally competent, fact-checking gives them a way of reliably finding out if and when they have made a factual mistake in their reporting, so that they can rectify it. Confirmation increases the chance that a journalist has succeeded in their epistemic goals, and decreases the chance that they have gone far astray.
The CBS has many advantages over the KBS. First, as I have outlined above, it is representative of contemporary science journalism. Borel (Reference Borel2018) writes that “science journalists—like all journalists—should have formal processes to make sure their stories are accurate,” and that fact-checking is the best way to meet this goal (1). Importantly, fact-checkers usually do not know scientific facts in advance; they only know how to appropriately go about corroborating the information they have been given.Footnote 23
Second, the CBS is able to respond to many of the objections that were raised against the KBS: the ability to check scientific facts is, I argue, what makes the difference between competent and incompetent science journalists. Confirmation does not require knowledge: it requires deference to the appropriate sources, and the ability to recognize them as such. In philosophical terms, we can describe the practice of fact-checking as a journalist’s search for the proper justification of the facts they aim to publish—and so part of what the CBS requires is the ability to identify scientific facts and their justification (cf. Gerken, Reference Gerken2022). However, journalists also need to place those findings in their appropriate contexts, so the types of evidence a journalist consults will not always be solely scientific. This explains the broader skillset that is required of the science journalist, as outlined in Section 3.
Of course, there are limits to the usefulness of confirmation: the fact that one’s body of evidence strongly supports a conclusion does not entail that the conclusion is true. The CBS allows for the possibility that good science journalists will sometimes report false things. I take this to be a virtue of the account: as I outlined in Section 3, given the nature of scientific research, it would be unreasonable to demand that science journalists report only scientific truths. Even the most competent science journalists are restricted by the limits of current scientific research.
Finally, despite its divergence from the KBS, the CBS is able to respond to the worries that motivated such a solution. Recall that Patterson (Reference Patterson2013) and other proponents of knowledge-based journalism criticized journalists’ reliance on testimony because it represented a significant vulnerability—an overreliance on officials or other figures of authority when gathering facts, which led to problematic “he-said/she-said” reporting. Of course, the solution is not for journalists to stop relying on testimony; that would prevent journalists from doing their jobs at all.Footnote 24 Instead, Patterson asks that journalists interview their sources from a “position of strength” (76). The CBS, I think, provides a way to understand this requirement without turning to knowledge: journalists are bound to check the facts that they have gathered during an interview, to triangulate the resulting testimonial evidence with other evidence, and to find authoritative sources for the fact in question.Footnote 25
Let us return to the MMR-vaccine case. Fact-checking minimizes the chance that science journalists will report scientific inaccuracies, such as the claim that MMR-vaccines cause autism; and, perhaps more importantly, it also minimizes the chance that journalists will imply scientific inaccuracies in their work, whether consciously or not. This is because of the broad scope of fact-checking, which examines facts and their accuracy in context as opposed to in isolation. Katie Palmer, a science editor at Quartz, says that “fact-checking is more than checking facts: It is also about checking assumptions” (Blum et al., Reference Blum, Hatch and Jackson2020, 152).
Norms for science fact-checking show how this is done in practice (Baker & Fairbank, Reference Baker and Fairbank2022; Borel, Reference Borel2018). If a journalist chose to report on Wakefield and his colleagues’ paper from 1998, they would be bound to assess the scientific evidence—and, on this basis, they would be bound to dismiss Wakefield’s claims, or at the very least, severely dampen them. (A similar point is made by Gerken (Reference Gerken2022), who writes that “there is extremely strong and growing scientific justification that there is no correlation between MMR vaccines and autism” (142).) They would also be bound to triangulate anti-vaccine activists’ claims with the evidence provided by other sources, thus placing them in their proper context. In addition, they would be bound to assess the authority of the various sources concerned: parents of unvaccinated children, for example, would be considered authoritative on certain facts regarding their experiences, but not on the scientific facts. These tools are enough to explain how the CBS prevents “false balance” in reporting on MMR vaccines. The failure of journalists who reported on the MMR-vaccine case with false balance was a failure to properly fact-check.Footnote 26
More complicated cases of science journalism can also be addressed by the CBS. As I have emphasized throughout this article, being a good science journalist does not mean taking only scientific sources seriously; it means knowing how to check scientific facts and knowing how to place them in their proper context, alongside sources that may be nonscientific but nevertheless relevant to the story at hand. (These are skills that are particular to the science journalist, and hence are under the remit of the ECSJ, but they involve experience beyond the scientific realm.) The CBS should capture why, in the MMR-vaccine case, where the science is settled, a journalist ought not publish work that falsely communicates the science in question. However, it should also capture why, in other contexts where the scientific consensus is uncertain, a journalist ought to take seriously the testimony of people with relevant experience. “Long COVID” (Yong, Reference Yong2023) is emerging as one contemporary example of this: in the context of a fluctuating information landscape, as well as contradictions between patient testimonies and medical records, the CBS provides a way forward for competent science journalists who are interested in reporting on the illness. Instead of trying to determine what the scientific facts are on their own, these journalists need to gather testimony from relevant parties, place them in their proper context, and triangulate. The results of their reporting may shift as the science evolves.
In keeping with the CBS, discussions among science journalists about how to report responsibly and with rigor on issues such as long COVID often boil down to discussions about the proper identification of evidence and authority. In Body Politic’s Comprehensive Guide to Covering Long COVID, for example, there is discussion of the fact that positive PCR tests are often not sufficient for diagnosing long COVID, and hence, such medical evidence must be properly contextualized (Lowenstein, Reference Lowenstein2021). The guide also emphasizes the importance of “talking to patients and considering them to be experts on their lived experience,” while remembering that patients may still not be experts on what is causing their lived experience. “Similarly, while clinicians or researchers may be able to provide theories behind a patient’s experience, if they have not lived with the illness, these experts are not always helpful sources to speak to the lived experience.” No source is de facto more authoritative than another; it all depends on the fact that one is trying to confirm.
The assumption is that journalists do not have specialized knowledge of long COVID; Body Politic’s guide provides instructions for reporting on a topic about which there is little common knowledge, and about which journalists are learning while they report. The important thing is that they are doing so in an established, reasonable, and rigorous manner, as the CBS demands.
Ed Yong, for example, an established science reporter, has written about the dangers of relying exclusively on published scientific papers when reporting on long COVID:
As energy-depleting illnesses that disproportionately affect women, long COVID and M.E./C.F.S. are easily belittled by a sexist society that trivializes women’s pain, and a capitalist one that values people according to their productivity. Societal dismissal leads to scientific neglect, and a lack of research becomes fodder for further skepticism. (Yong, Reference Yong2023).
A competent science journalist, then, knows not only how to confirm scientific facts, but also how to identify situations when the scientific facts are insufficient or unconvincing, and when nonscientific evidence may nevertheless be of relevance to a scientific topic. It is because of the amenability of the CBS to such complex situations that I think it provides a convincing solution to the ECSJ.
In what follows, I will address two possible concerns for the CBS: one stemming from journalists’ and others’ recent criticisms of journalistic objectivity, and the other stemming from philosophers’ conceptions of epistemic norms for assertion.
5. Journalism’s “Objectivity Wars”
The CBS is based on the idea that journalism is governed by the Confirmation Norm:
Confirmation Norm: A journalist should report a piece of information only if that information has been checked.
Because of this, some journalists might worry that the CBS is wedded to a problematic notion of objectivity in journalism and thus unconvincing: even if practicing journalists feel bound by the Confirmation Norm, these critics might say, that does not mean they ought to. Since journalistic objectivity is a flawed concept that ought to be discarded, the CBS is of no use; it relies on too naïve a picture of journalism, and too naïve a separation of epistemic and ethical matters within the discipline. This kind of concern matters, because a solution to the ECSJ should provide epistemological guidance for actual science journalists.
In response, I want to defend the CBS by looking at why some journalists have abandoned the notion of objectivity, and arguing that the Confirmation Norm can answer their concerns. There is good reason to think that journalism is governed by an epistemic norm—whether we choose to describe it as objectivity is a further issue—and something like the Confirmation Norm is well suited to capture it.
Journalistic objectivity has long been thought to distinguish the profession as an epistemically rigorous enterprise; in the twentieth century, professional journalists’ claim to objectivity was taken as a justification for their considerable influence in public discourse (Anderson & Schudson, Reference Anderson, Schudson, Wahl-Jorgensen and Hanzitsch2019). In recent years, however, there have been substantial disagreements over how the objectivity ideal should be understood, what purpose it serves, what harms it causes, and whether it should be preserved in any form.Footnote 27
One fruitful way of understanding the Objectivity Wars in journalism is as a debate about what the standard of assessment for objectivity ought to be. Here are the main contenders, as I see them:
Descriptive: A journalist is objective if they describe the world “as it really is,” free from social/personal biases.
Perspectival: A journalist is objective if they “remain impersonal” and find “balance” between conflicting views and evidence.
Procedural: A journalist is objective if they are maximally rigorous and exhaustive in establishing facts.
Few people in the objectivity debate disagree that these three elements (aiming for truth, assessing evidence, and establishing facts rigorously) are important journalistic ideals; indeed, these characterizations of objectivity are not necessarily in conflict with one another. The debate concerns which one(s), if any, should be taken as foundational and thereby accorded explanatory priority, and whether any such standard is even desirable or attainable.
Over the past 150 years, there has been a notable turn away from the descriptive and perspectival accounts of objectivity, and toward the procedural account (Ward, Reference Ward2015; Kovach & Rosenstiel, Reference Kovach and Rosenstiel2021; Schudson, 1981). Alongside the professionalization of the discipline, two important developments have encouraged this progression: the rise of journalism studies among sociologists, anthropologists, and ethnographers, which has led to an internal critique of journalists’ self-understanding (e.g., Tuchman, Reference Tuchman1972); and the criticism of mainstream journalism by members of marginalized communities, which has led to a reevaluation of the very concept of objectivity. I will focus on the second of these developments here.
Contemporary critics have complained that the descriptive account of objectivity is unrealistic and misguided, and that in practice, it has served to prevent certain people from participating in the profession, since they have been considered incapable of reaching the mythical “objective realm of facts” in which journalists are expected to operate. “What sounds like fact and news in newsrooms”—that is, what is considered objective—“is more likely related to what and who is considered to be rational, able to report, and/or distanced enough” (Callison & Young, Reference Callison and Young2020, 36). In a now-famous essay for the New York Times, journalist Wesley Lowery stated the following:
Since American journalism’s pivot many decades ago from an openly partisan press to a model of professed objectivity, the mainstream has allowed what it considers objective truth to be decided almost exclusively by white reporters and their mostly white bosses.… The views and inclinations of whiteness are accepted as the objective neutral. (Lowery, Reference Lowery2020)
Critics have similarly exposed important flaws in the perspectival account of objectivity, arguing that journalists who operate with a conception of objectivity as “a view from nowhere” often end up treating the testimonies of marginalized communities as not credible, thereby excluding their voices entirely from journalists’ assessment and adjudication of the evidence. In this vein, journalist Pacinthe Mattar has complained that “there’s an added burden of proof, for both journalists and sources, that accompanies stories about racism. … How can the media be trusted to report on what Black and other racialized people are facing when it doesn’t even believe them?”
The criticism of objectivity is that it hides moral failure under the cloak of epistemology: it makes bias look like neutrality. What is touted as epistemic authority is really just an application of power. If we take these criticisms seriously, as I think we should, the remaining options are to conclude either that objectivity is not a reasonable ideal for journalism in the first place, or that the third, procedural account of objectivity is better able to respond to critics’ reasonable complaints. Those journalists in the latter camp have underlined the distinction between objectivity as a standard for journalists’ epistemic methods, which they endorse, and objectivity as a standard for journalists’ moral behavior or beliefs, which they do not. Kovach and Rosenstiel (Reference Kovach and Rosenstiel2021) accordingly write:
Objectivity was not meant to suggest that journalists were without bias. To the contrary, precisely because journalists could never be objective, their methods had to be. In the recognition that everyone is biased, in other words, the news, like science, should flow from a process for reporting that is defensible, rigorous, and transparent. (xxviii)
I have defended a similar account of objectivity grounded in contemporary practices of fact-checking, which consider the complications of fact-checking stories about lived experience in the context of structural injustice and marginalization (Baker & Fairbank, Reference Baker and Fairbank2022; Fairbank, Reference Fairbank2021).
The idea is that criticisms of journalism arising from the Objectivity Wars are legitimate, but they are not fatal to journalistic objectivity in principle; rather, they arise from the fact that (some) journalists and (some of) their audience have largely misunderstood journalists’ epistemic obligations and operative concepts. Adhering to a procedural version of objectivity, according to which “journalists aim to be maximally rigorous and exhaustive in establishing facts,” does not prevent us from acknowledging the kinds of considerations brought up by critics; to the contrary, it gives us a basis from which to criticize those journalists who did not appropriately live up to the norms in question. In short: if a journalist unfairly dismisses a source as irrelevant or untrustworthy, they are failing to live up to the epistemic standards of their discipline, in addition to further ethical ones. Indeed, Lowery’s complaints about objectivity seem to be in line with this sentiment, since he calls for reporters to focus instead on “being fair and telling the truth, as best as one can, based on the given context and available facts.” This is an epistemic demand.
Of course, it may well be that journalistic objectivity is by now too polluted a concept to be of use, and that it is best abandoned; clearly, the concept has been misused to justify exclusion and bias. I am tempted to maintain the ideal while clarifying the norms that underlie it, but nothing hangs on this terminological issue. Although objectivity in journalism is controversial, the importance of epistemic norms and verification is not.
The practice of editorial fact-checking shows how complex the relationship is between epistemic and other norms in journalism. For example, it is expected that whenever appropriate and within reason, a journalist will give people a choice over how their stories will be reported and fact-checked, but not whether the relevant facts will be checked. (This is called the Collaboration Principle for fact-checking in Baker and Fairbank (Reference Baker and Fairbank2022).) This is a way of ensuring that the relationship between journalists and their sources and audience is one of respect—and this, in turn, ensures that journalists’ reporting is epistemically responsible. Interviewing sources in a trauma-informed manner, too, is understood as a way of being rigorous and of aiming for accuracy: journalists are more likely to get to the truth if their sources trust them (Baker & Fairbank, Reference Baker and Fairbank2022). In journalistic practice, then, ethics and epistemology are not in tension; rather, they go hand in hand. This is an insight that pairs well with procedural accounts of journalistic objectivity.
Critics of journalistic objectivity have rightly emphasized the discipline’s failure to acknowledge the voices of marginalized communities and to treat their testimonies as credible. This is not a reason to abandon the priority of epistemic norms in journalism, or to reject the CBS; rather, it underlines the fact that questions of evidence are intricately related to questions of equity—including whether some voices are treated as more reliable than others.
6. Epistemic Norms of Journalistic Assertion
In this final section, I consider a further concern that epistemologists may raise: that the Confirmation Norm is not properly epistemic at all. Indeed, as I have emphasized in Section 5, the Confirmation Norm seems to incorporate ethical and practical considerations, such as the Collaboration Principle, that go beyond the scope of “pure” epistemology. I argue that this is not a problem for the account, but rather an interesting way of showing how applied contexts differ from the idealized contexts in which certain epistemological conversations occur.
It might be said that journalism is a kind of assertion, and if this is the case, then it ought to be governed by whatever epistemic norm governs assertion. Indeed, Simion (Reference Simion2017) has provided a convincing account of the normative framework of news publishing: when a journalist reports an item of news, they are performing an informative speech act, a species of asserting that p subject to the necessary conditions that “S reports that p only if (1) S asserts that p for at least one hearer H, (2) H uptakes S’s assertion, and (3) the purpose of S asserting that p is to inform H that p” (415). News reporting is therefore always a public act of assertion intended to inform the audience. Simion argues that understanding journalism in this way allows us to identify epistemic and prudential norms for news reporting.Footnote 28
Epistemologists of journalism can adopt Simion’s framework but take one step back and ask: What is the relevant norm of assertion for journalism? The Confirmation Norm is plausibly one way of answering this question: what matters is that, somehow, journalists check their facts before publication. However, the Confirmation Norm is not in line with the standard answers in epistemology—the most popular of which are a truth norm, a knowledge norm, a warrant norm, and a belief norm (Brown & Cappelen, Reference Brown and Cappelen2011). How does the Confirmation Norm fit, if at all?
We should, I think, accommodate the intuition that journalists’ activities are to be distinguished from our everyday epistemic interactions. It is commonly acknowledged, for example, that reading something in a published work of journalism is of greater evidential importance than overhearing the same thing being discussed among friends over dinner. We assume that a journalist has an individual ethical responsibility to believe the stories of victims of sexual violence but a professional responsibility to fact-check those stories before publishing them. The Confirmation Norm for journalistic assertion might thus be taken as evidence that the norms for assertion vary with context, as has been argued by Brown (Reference Brown2010), Gerken (Reference Gerken2012), and Goldberg (Reference Goldberg2015), among others. Another option might be to understand the Confirmation Norm as a strengthened version of a warrant norm, such as Lackey’s (Reference Lackey2007) Reasonable to Believe norm.
An epistemologist might complain, however, that the last two sections of this article have only emphasized how much baggage is packed into the journalistic notion of confirmation—much more than what an epistemologist would traditionally support. Are the requirements to collaborate with one’s sources or to be trauma-informed during fact-checking, as outlined in Section 5, really epistemic requirements? My response is that, although such principles seem to go beyond “pure” epistemology, they are epistemologically justified: they are conducive to the attainment of verified (i.e., objective, journalistic) facts; indeed, I think they are required for it.
This highlights the importance of not over-idealizing epistemology (cf. Kukla, Reference Kukla and Lackey2021). Responsible journalism must consider how our assessments of authority and expertise can be tainted by bias and systemic injustice: our epistemic activities are inevitably influenced by non-epistemic factors. The solution is to understand confirmation as a social epistemic affair, which involves the triangulation of sources and deference to those who know best. In this way, the Confirmation Norm makes room for nonideal theorizing about the actual practices of journalists, and science journalists in particular.
I take this nonideal perspective on journalism to be a virtue of my account, as well as an incentive for further work in the epistemology of journalism. The notions of rigor and responsibility that are fundamental to editorial fact-checking can arguably never be described in “purely” epistemic terms once they are placed in a nonidealized context—as, in journalistic practice, they always are. This is not to say that there is no distinction between epistemic and ethical or other norms in journalism; to the contrary, the fact that there is such a distinction is what motivates my defense of the CBS. Rather, journalists tend to carve up the “epistemic territory” in an interesting way that diverges from how it is done in abstract epistemology.
7. Conclusion
I hope that this article serves as an example of the kind of interdisciplinary work that can be conducted in the philosophy of journalism. I have argued that the Knowledge-Based Solution to the ECSJ is unsatisfactory, and that the Confirmation-Based Solution, according to which science journalists are governed by a norm of confirmation, shows the way forward. Thinking about the norms of science journalism has led us to reflect on the norms of journalism more generally, and to see how, in applied cases, epistemology is a nuanced affair.
Acknowledgments
Every point about fact-checking made here is indebted to past conversations and collaborations with Allison Baker. Sandy Goldberg and Susanna Siegel helped the author think through early versions of the ideas presented in this article. The author would also like to thank Jessica Brown for her remarkably helpful comments on a draft of this article, as well as three anonymous reviewers for their useful questions and suggestions.
Viviane Fairbank is a PhD student at the St Andrews and Stirling Graduate Programme in Philosophy. Her research interests include the philosophy of logic, feminist philosophy, and epistemology. She has previously worked as a journalist.