
5 - The Right to Science and the Evolution of Scientific Integrity

from Part I - The Right to Science, Then

Published online by Cambridge University Press: 25 November 2021

Helle Porsdam, University of Copenhagen
Sebastian Porsdam Mann, University of Copenhagen

Summary

Chapter Five presents the history of the development, in the United States as well as in Europe, of ethical concerns in science. Science is one of the highest expressions of human thought and makes a crucial contribution to the wellbeing and progress of society. This is why the right to freely conduct science is expressly protected by international human rights law (Art. 15, para. 3, ICESCR). As the object of this right is ‘science’, activities conducted by scientists are protected by this right insofar as they satisfy the requirements set out in ethical guidelines and professional standards. Practices that involve the fabrication or falsification of data, or plagiarism, contradict the very essence of science, as they involve acts of deception of the scientific community and society. Over the past few decades, awareness has grown about the importance of adhering to ethical standards in the conduct of science. Scientific misconduct became the subject of significant public attention beginning in the 1980s, which led to public statements and guidelines by academic and funding agencies, as well as to procedures for dealing with allegations of misconduct in science.

Type: Chapter
Information: The Right to Science: Then and Now, pp. 91–104
Publisher: Cambridge University Press
Print publication year: 2021
This content is Open Access and distributed under the terms of the Creative Commons Attribution licence CC BY-NC-ND 4.0 (https://creativecommons.org/licenses/by-nc-nd/4.0/).

5.1 Introduction

As Aristotle famously claimed in the opening line of his Metaphysics, “all human beings, by nature, desire to know.” In other words, the pursuit of knowledge is connatural to us. We cherish knowledge for its own sake, simply because we want to better understand the world in which we live, and ourselves, and not primarily for any practical utility or for the satisfaction of other human interests. We see knowledge as a good in itself, as an irreducible good, and one of the most important aspects of human flourishing. This is why the pursuit of knowledge deserves to be protected by legal norms and, in particular, by human rights norms.

The search for the “why” of things is one of the key features of the scientific enterprise. Indeed, science represents one of the highest expressions of human intellectual ability and contributes to a deeper and better understanding of both nature and ourselves. Besides its intrinsic, irreducible value, scientific research makes a crucial contribution to the well-being and progress of humankind by delivering new tools that help improve quality of life and provide new diagnostic, preventive, and treatment measures for various diseases and conditions.

For these reasons, science should enjoy the greatest possible freedom to advance in the different fields in which it is carried out, and should be promoted at all levels. This basic human interest is formally recognized by international law, which expressly protects the “freedom indispensable for scientific research and creative activity.”Footnote 1 Although the freedom to conduct scientific research was not explicitly included in the founding instrument of the human rights movement, the Universal Declaration of Human Rights, it is generally regarded as implicit in the freedom of thought and in the freedom of opinion and expression, protected by Articles 18 and 19 of the Declaration, respectively.

At the European level, the 2000 European Charter of Fundamental Rights expressly recognizes that “[t]he arts and scientific research shall be free of constraint” (Article 13). The Explanations Relating to the Charter specify that the freedom enshrined in Article 13 “is deduced primarily from the right to freedom of thought and expression” and “is to be exercised having regard to Article 1 and may be subject to the limitations authorised by Article 10 of the ECHR.”Footnote 2 This explanation is of great relevance, as it makes clear that freedom of scientific research, like most freedoms, is not absolute but may be subject to some limitations in the interests of other individuals and of society. There is no doubt that scientific research, like any other activity in society, cannot operate at the margins of the ethical and legal principles that are basic to any democratic society, such as respect for human dignity and human rights, and other important societal values. The first limitation mentioned by the Explanations relates to Article 1 of the Charter, which enshrines the principle of respect for human dignity. Scientific research, even if motivated by the best of intentions, cannot be conducted in ways that violate people’s dignity (for instance, medical research cannot be conducted without participants’ free and informed consent). The second category of limitations is found in Article 10 of the ECHR, which stipulates that freedom of expression may be subject to such limitations as are prescribed by law and “are necessary in a democratic society in the interests of public safety, for the protection of public order, health or morals, or for the protection of the rights and freedoms of others.”

At this point, it should be emphasized that the right to science is a multifaceted notion, as it includes both the freedom to do science and the right to enjoy the benefits of science.Footnote 3 This right is therefore addressed both to scientists, whose efforts to conduct scientific research should not be hindered by the State, and to the public in general, who should have access to the results of scientific developments. Strangely, in spite of its enormous importance, especially in modern technological societies, the right to science has long been overlooked, with the consequence that its legal development is still rudimentary and the scholarly literature on its meaning, scope, and practical implications is still relatively sparse.Footnote 4

This chapter focuses on the first of the two components of the right to science mentioned above, the freedom to conduct scientific research, and discusses the limitations to that freedom that result from the rules generally recognized for the responsible conduct of research. The claim is that activities conducted by scientists that seriously violate the ethical requirements for conducting scientific research do not deserve the label “scientific.” Practices involving, for instance, the fabrication or falsification of data and plagiarism contradict the very essence of science, as they encompass acts of deception intended to mislead the scientific community and society as a whole. Such practices thus attack the very heart of scientific research: they involve the manipulation of truth and thereby betray the purpose of science itself. This is especially clear if science is understood as “the quest for knowledge obtained through systematic study and thinking, observation and experimentation.”Footnote 5

Awareness of the importance of adhering to ethical standards in the conduct of science has increased significantly over the past few decades. Scientific misconduct became the subject of public attention beginning in the 1980s, which led to public statements and guidelines by academic and funding agencies, as well as the adoption of procedures for dealing with allegations of misconduct in science. After introducing the concept of scientific integrity, this chapter briefly presents the history of this development.

5.2 What Is Scientific Integrity and Why Does Misconduct Occur?

The term “integrity” refers to the state of being whole and undivided, in the sense that the individual’s behavior is not marked by duplicity but is consistent with ethical principles. Integrity is, therefore, exactly the opposite of deceptive behavior; in a word, it is synonymous with honesty. What does this term imply when it is associated with scientific research? It means that “integrity is expected because science is built upon a foundation of trust and honesty.”Footnote 6 Indeed, for researchers integrity embodies above all “a commitment to intellectual honesty and personal responsibility for one’s actions and to a range of practices that characterize responsible research conduct.”Footnote 7

Science, which is often characterized as the “search for truth,” is intrinsically incompatible with the manipulation of facts and data, and with resort to falsehood and deception. The reputation of science in society depends critically upon adherence to the rules of good scientific practice, which have been developed by the scientific community itself. It is therefore unsurprising that each time a new case of scientific misconduct is reported, public trust in the work of scientists deteriorates. This also leads to broader skepticism in society about the scientific community’s willingness and ability to self-regulate in order to ensure compliance with ethical principles.

Although misconduct occurs in all areas of science, it is interesting to note that the great majority of cases that surface take place in the field of medicine and closely related sciences (biology, for example). This can be explained primarily by two factors: firstly, the huge social expectations and enormous financial benefits that accompany scientific developments that could contribute to the prevention and treatment of diseases; and, secondly, the difficulty of reproducing experiments in the life sciences, due in large part to the biological variability that exists between organisms.Footnote 8 As Goodstein points out, “if two identical rats are treated with the same carcinogen, they are not expected to develop the same tumour in the same place at the same time.”Footnote 9 These factors – the financial incentives, coupled with the fact that actual fraud may be hard even to uncover, let alone prove – make the manipulation of truth much more tempting in the life sciences than in other domains.

However, misconduct is not limited to the life sciences. Research in the social and human sciences is not exempt from fraud either, although it manifests somewhat differently. With the possible exceptions of sociology and psychology, the social and human sciences generally use methods that are not primarily empirical, but rather analytical, critical, conceptual, hermeneutical, or normative. Dishonesty in these sciences often consists in the use of the ideas or words of others without proper acknowledgment (what is known as plagiarism), and in the violation of rules for authorship (for instance, the use of “honorary authorships”). Over the past decade in Europe, there have been a number of scandals concerning plagiarized doctoral dissertations in the legal field by high-level politicians. As a consequence, the topic of plagiarism in doctoral studies has received renewed attention from both the general public and the academic community, which have become more aware of the urgency of promoting scientific integrity in the social and human sciences as well.

The first and most obvious question that arises when discussing scientific misconduct is: Why does it happen? What strange attraction leads scientists to act in a way that so openly contradicts the goal of the scientific enterprise? The preliminary answer to this question is simple: scientific research, like any other human activity, is exposed to temptations to dishonesty. After all, “scientists are not different from other people.”Footnote 10 When they enter their office, laboratory, or research unit, scientists retain the same negative passions and driving ambitions to which all human beings are vulnerable. They are tempted, like any other individual, to transgress the boundaries of ethical behavior in order to achieve their personal and professional goals more rapidly. It is therefore naïve to assume – as was traditionally thought until the 1970s – that scientists are necessarily honest and always comply with ethical standards simply because they have chosen to embark on the disinterested pursuit of knowledge.

In addition, it should be noted that in our increasingly globalized and competitive world, science is not just a vocation – or maybe it is no longer one – but primarily a career. Scientific research has become increasingly competitive, complex, and expensive, often demanding collaboration and leading to a diffusion of individual responsibility. Moreover, researchers are regularly under pressure from academic structures and funding agencies to be successful and to produce quick results. They are expected to make original discoveries, publish as many articles as possible (“publish or perish”), obtain research grants, receive awards, be appointed to scientific societies, and eventually become professors. Competition and the pressure to succeed at any price can be intense, and the temptation to disregard the rules of honesty correspondingly great.

David Goodstein, who has studied a number of cases of scientific misconduct, identifies three underlying motives present in most cases: (1) the scientists were under career pressure; (2) they believed they “knew” the answer to the problem they were considering and thought it unnecessary to go to all the trouble of doing the work properly; and (3) they were working in a field – such as the life sciences – where experiments yield data that are not precisely reproducible, so that data manipulation is more difficult to detect and the temptation to cheat correspondingly greater.Footnote 11

Besides the above-mentioned factors behind misconduct in science, there is another element that should be taken into account when approaching this phenomenon: there is not always a clear line between the accepted and non-accepted practices that define what is called the “scientific method.” According to most textbooks, scientists study existing information, formulate a hypothesis to explain certain facts, and then, through experimentation, try to test the hypothesis. The problem is that, as Bauer points out, the “scientific method” is, to some extent, a myth.Footnote 12 Scientific research rarely proceeds by the organized and systematic approach that is reflected in textbook presentations. The very formulation of hypotheses is affected by the knowledge, opinions, biases, and resources of the scientist. Furthermore, hypotheses are subjected to experimental testing by means of methods selected by scientists, who very often already have in mind a theory they want to prove. There is a more or less conscious self-deception in scientific research that paves the way for the deception of colleagues and of the public in general.Footnote 13 David Goodstein describes this myth of the scientific method very well when he notes:

every scientific paper is written as if that particular investigation were a triumphant procession from one truth to another. All scientists who perform research, however, know that every scientific experiment is chaotic, like war. You never know what is going on; you cannot usually understand what the data mean. But in the end, you figure out what it was all about and then, with hindsight, you write it up describing it as one clear and certain step after the other. This is a kind of hypocrisy, but it is deeply embedded in the way we do science.Footnote 14

The myth of entirely objective, impersonal, and disinterested scientific research leads the public to an unrealistic perception of science and scientists; it may also encourage scientists to be unrealistic about themselves and “to neglect the importance of cultivating consciously ethical behavior.”Footnote 15 This is why, in order to avoid unrealistic expectations, it would be preferable to regard the “scientific method” as an ideal to strive for (even knowing that it is unattainable in its fullest form) rather than as a description of actual practice in scientific research.Footnote 16

The preceding remarks do not amount, of course, to a denial of the fact that there are honest and dishonest, and acceptable and unacceptable, ways of doing science. However, the line between right and wrong in scientific research is not always crystal clear, and there can be many grey areas in between that deserve careful examination before assessing whether, in a particular case, the rules of the responsible conduct of research have been complied with or not.

5.3 A Historical Perspective on Scientific Misconduct

As a consequence of the scandals of scientific misconduct that arose in several countries over the past three or four decades, governments, funding agencies, scientific societies, and academic institutions began to recognize the need to do more to hold scientists accountable for their research practices. Since the mid-1980s in the USA, and since the end of the 1990s in Europe, governments and academic institutions have established specific bodies for dealing with allegations of scientific misconduct and have developed guidelines and procedures to address these issues and to punish violations of codes of conduct.

Scientific misconduct became a public issue in the USA in the 1980s, when several cases of fabricated research by high-profile scientists were discovered in prestigious academic institutions. These were publicly prosecuted and widely reported by the news media. However, it would be a mistake to think that questionable research behavior is confined to recent times and that scientists from previous decades and centuries have always acted honestly. The Piltdown Man forgery of the early twentieth century is perhaps the most famous fraud in the history of anthropology. In 1912, Charles Dawson, an English lawyer and amateur anthropologist, claimed to have found pieces of a skull and parts of an apelike jaw in a gravel pit in Sussex, England, a find he presented as the “missing link” between humans and apes. The claim was controversial from the outset, as many argued that the skull was inconsistent with other hominid fossils. It was only forty years later, when Dawson had already died, that physical and chemical tests proved that the purported missing link in human evolution was a complete hoax. The upper part of the skull was from a modern human being, the jaw came from an orangutan, and the teeth were from a chimpanzee. The pieces of the skull had been treated with chemicals to make them appear to be fossils.Footnote 17

Science journalists William Broad and Nicholas Wade have closely examined the work done by famous scientists of the past and have shown that they were not always as honest as one might believe.Footnote 18 For instance, such scientists did not always obtain the experimental results they reported, or omitted data that were contrary to their hypotheses, or took ideas from others without proper acknowledgment: Isaac Newton, the founder of modern physics, “adjusted” his calculations on the velocity of sound and altered some data in order to make the predictive power of his theory seem much greater than it actually was; Charles Darwin took ideas on natural selection and evolution from another naturalist, Alfred Russel Wallace, without proper acknowledgment; Gregor Mendel, the founder of genetics, selected data from his experiments with peas so as to make them agree with his theory; Louis Pasteur, whose work led to the development of vaccines for anthrax and rabies, prepared his anthrax vaccine using a chemical method developed by his competitor, Henri Toussaint, while publicly claiming that he had employed his own method; and Robert Millikan, the American physicist who won the Nobel Prize in 1923 for determining the electric charge of the electron, extensively misrepresented his work in order to make his experimental results seem more convincing.

Although it is clear that scientific dishonesty has always existed, it was only in the 1980s that a number of high-profile cases of data fabrication and falsification by scientists in the USA started to be publicly prosecuted and covered by the media. Before that decade, public trust in science was very high. There was a naïve optimism that scientists always acted honestly and could be relied upon to regulate their own activities. But these high-profile cases increased public awareness of the problem, opening the public’s eyes to the uncomfortable fact that science, too, could fall victim to the unethical behavior of some of its practitioners.

In 1981, then Representative Albert Gore, Jr., chaired a US congressional committee that looked at the question of fraud in science and held the first hearings on the emerging problem. In the following years, several cases of data fabrication and falsification were directly investigated by Congress, as it was evident that research institutions were responding inadequately to allegations of misconduct, or were trying to protect their own researchers. In 1985, Congress passed the Health Research Extension Act, which mandates that any research institution receiving financial support from the National Institutes of Health (NIH) must have an established administrative process to review reports of scientific fraud.

In 1986, the so-called Baltimore case became public and attracted attention for a decade.Footnote 19 The case had at its center Nobel Prize winner David Baltimore, immunologist and Professor of Biology at MIT, whose name appeared on a paper published in the prestigious journal Cell, coauthored with Thereza Imanishi-Kari, a colleague at MIT. A junior scientist working in the same laboratory, Margot O’Toole, became convinced that the paper contained fabricated data and reported her concern to several senior colleagues at the institution. As a consequence, an investigation was launched, first by MIT, then by an NIH panel, and subsequently by the Office of Research Integrity (ORI). Even Congress and the Secret Service became involved in the investigation. In the end, in 1996, an appeals panel at the Department of Health and Human Services determined that there was not enough evidence to prove that Imanishi-Kari had committed misconduct; in the meantime, however, the public had been surprised to learn that the work of serious scientists could be doubted, and that coauthors of scientific papers may have contributed very little to the actual work reported.

In 1989, the Public Health Service (PHS) created the Office of Scientific Integrity (OSI), renamed in 1992 the Office of Research Integrity (ORI), as the government office charged with oversight of scientific integrity within biomedicine. The 1990s began with the articulation of definitions and rules about scientific misconduct, and institutions receiving federal funds had to have policies in place for pursuing allegations of misconduct. Political attention began to shift away from attaching blame to scientists and focused instead on improving the investigatory procedures for dealing with misconduct and on preventing it through the education of young scientists in the area.Footnote 20 The current situation in the US is that every institution and research center that receives federal funding has the primary responsibility for responding to allegations of scientific misconduct. The ORI conducts oversight reviews of all investigations. When the ORI receives a report of an institutional inquiry, it examines the institution’s report to determine whether the findings are defensible, well supported by the evidence, and acceptable as a final resolution of the allegations. Then, on the basis of the ORI’s recommendations, a final decision is made by the PHS, which may impose sanctions when research misconduct is found.

European concern about scientific misconduct began only in the 1990s in some countries, such as Germany and Denmark, and much later in others. In 1997, the German scientific community was shocked by strong suspicions that a large number of papers published by two eminent cancer researchers, Friedhelm Herrmann and Marion Brach, included fabricated data. Once this was confirmed by preliminary investigations, a scandal unfolded that marked a turning point in the history of scientific misconduct in Germany. In 2000, the German Research Foundation, the Deutsche Forschungsgemeinschaft (DFG), created a task force to investigate the case; it found evidence of data manipulation in at least ninety-four papers coauthored by the two researchers.Footnote 21 The Herrmann and Brach case prompted the two major German research agencies (the DFG and the Max Planck Society) to develop guidelines defining the rules of good scientific practice and establishing procedures for dealing with allegations of scientific misconduct.Footnote 22

In Denmark, scientific misconduct investigations began in 1992 with the establishment of the Danish Committees on Scientific Dishonesty (DCSD), a group of committees tasked with handling allegations of research misconduct based on complaints brought by individuals or institutions. This body was, and its successor still is, the only centralized national authority in a European country for dealing with violations of the rules of good scientific practice. In 2017, the DCSD was replaced by the Danish Committee on Research Misconduct.Footnote 23 In the same year, the Danish Parliament passed the Research Misconduct Act, which distinguishes between scientific misconduct and questionable research practice. While the centralized committee continues to deal with allegations of scientific misconduct, cases of questionable research practice have to be handled internally at each research institution. Since 2014, a national Code of Conduct for Research Integrity has defined the rules of good scientific practice. Although the Code is not legally binding in itself, researchers can adhere to it and research institutions can integrate it into their own guidelines.

The former Danish Committees on Scientific Dishonesty became embroiled in controversy in 2003 after their decision concerning the book The Skeptical Environmentalist by political scientist Bjørn Lomborg. According to Lomborg, claims by environmentalists about global warming, overpopulation, deforestation, and other related matters have not been scientifically proven. The DCSD considered that the book was “objectively speaking, deemed to fall within the concept of scientific dishonesty” due to the author’s biased choice of data and arguments. However, the DCSD concluded that Lomborg could not be convicted of subjectively intentional misconduct or gross negligence.Footnote 24 The decision was heavily criticized by social scientists, who considered that Lomborg’s book ought not to be judged by the same criteria used to assess dishonesty in the natural and medical sciences. They pointed out that the selection of information and arguments to develop a theory is an integral part of many social sciences.Footnote 25

Since the end of the 1990s, a number of serious cases of scientific misconduct have taken place in various European countries. To take a few examples:

  • Andrew Wakefield, a former physician at the Royal Free Hospital in London, published a paper in The Lancet in 1998 claiming a possible link between the measles, mumps, and rubella (MMR) vaccine and autism and other childhood diseases or conditions.Footnote 26 The British General Medical Council conducted an inquiry into the case, found Wakefield guilty of dishonesty in his research, and banned him from practicing medicine. The British Medical Journal pointed out that “the MMR scare was based not on bad science but on a deliberate fraud” and that it was hard to find a parallel in the history of medical science for a paper with such potential to damage public health.Footnote 27 It is noteworthy that the 1998 paper was retracted by The Lancet only twelve years later. Wakefield’s study has been linked to a steep decline in vaccination rates in the United Kingdom and a corresponding rise in measles cases, resulting in serious illness and fatalities.Footnote 28

  • Diederik Stapel is a Dutch social psychologist, former professor at Tilburg University in the Netherlands, and former Dean of its Social and Behavioural Sciences Faculty. In 2011, three of his junior researchers reported that they suspected he had fabricated data for a large number of his papers. Stapel’s most recent work at that time included an article published in Science, in which he claimed that a dirty or messy environment may lead to racist behavior in individuals.Footnote 29 A few days earlier, he had received media attention for a study (not published in a scientific journal) claiming that eating meat made people selfish and less social. Both studies, based entirely on faked data, are just a small sample of the kind of “scientific research” Stapel had conducted for over a decade. Three investigative committees that studied the case concluded that at least fifty-five of Stapel’s publications included fabricated or manipulated data.Footnote 30 As a result of these findings, Tilburg University suspended him from his position as professor.

  • Paolo Macchiarini, an Italian surgeon and former researcher at the Karolinska Institute in Stockholm, became famous for transplanting synthetic tracheas coated with stem cells into more than a dozen patients. In 2014, following the deaths of two of the three patients operated on by Macchiarini at the Karolinska Institute, an investigation was opened. Two separate internal reports concluded that research results had been described in overly positive terms in Macchiarini’s papers, which incorrectly described the postoperative status of the patients and the functionality of the implants. An external investigation conducted one year later concluded that “there were data in the papers that could not be found in the medical records,” and that the number of mismatches led to the conclusion that there was “a systemic misrepresentation of the truth that lead the reader to have a completely false impression of the success of the technique.”Footnote 31

  • In 2011, Karl-Theodor zu Guttenberg was German Defense Minister and a star politician when a newspaper reported that his doctoral thesis, submitted to the University of Bayreuth’s Faculty of Law, included several passages that had been plagiarized, taken almost verbatim from various sources, mainly newspaper articles. The university opened an investigation and concluded that Guttenberg had “grossly violated standard research practices and in so doing deliberately deceived.” On the basis of these “extensive violations” of doctoral regulations through the omission of source citations, his doctoral degree was revoked, and he was forced to step down as Defense Minister.Footnote 32

In an attempt to contribute to the prevention of such cases of misconduct and to promote the responsible conduct of research in Europe, a new European Code of Conduct for Research Integrity was developed in 2017 by the national academies of sciences and humanities through their umbrella organization, the All European Academies (ALLEA) federation, in close cooperation with the European Commission.Footnote 33 After specifying in its first section the principles that should guide researchers in their work (reliability, honesty, respect for others, and accountability for the research), the Code goes on to describe, in its second section, good research practices in areas such as research environment; training, supervision, and mentoring; research procedures; safeguards to prevent harm to public health and the environment; data management; collaborative working; publication and dissemination; and the review process of publications. The Code’s third section defines the various practices that are regarded as violations of research integrity and recommends principles for handling allegations of scientific misconduct.

At the global level, the UNESCO Recommendation on Science and Scientific Researchers, adopted in 2017, also demonstrates this renewed concern for the ethical aspects of scientific research. The Recommendation, which is a revised, updated, and extended version of the 1974 Recommendation on the Status of Scientific Researchers, is more explicit than its predecessor about the need to ensure that scientific research is conducted with full respect for human rights and human dignity (for instance, the rights of research subjects and the confidentiality of personal data). It also underscores the importance of honesty in data use and data sharing, as well as the need to promote open access publication and dialogue between science and society. After recognizing in the Preamble “the value of science as a common good” and that “academic freedom lies at the very heart of the scientific process, and provides the strongest guarantee of accuracy and objectivity of scientific results,” the Recommendation stipulates that Member States, “in order to have sound science,” should establish “suitable means to address the ethics of science and the use of scientific knowledge and its applications” (Article 5(c)). It also draws attention to the fact that effective scientific research requires researchers’ integrity and intellectual maturity, as well as respect for ethical principles (Article 12). For these reasons, educational initiatives should be designed “to incorporate or develop in each domain’s curricula and courses the ethical dimensions of science and of research” and to foster “intellectual integrity, sensitivity to conflict of interest, respect for ethical principles pertaining to research” (Article 13).

Even more so than the UNESCO Recommendation, and in a more succinct manner, the 2010 Singapore Statement on Research Integrity can be regarded as a global guide to the responsible conduct of research. This document addresses all the major themes relating to research integrity, including data integrity, data sharing, record keeping, authorship, publication, peer review, conflict of interest, reporting misconduct, communicating with the public, complying with regulations, and social responsibilities. The Statement also includes four ethical principles: honesty in all aspects of research, accountability in the conduct of research, professional courtesy and fairness in working with others, and good stewardship of scientific resources.Footnote 34

5.4 Conclusion

Science is an enterprise that produces reliable knowledge and is based on the assumption of honesty on the part of scientists.Footnote 35 Today, there is widespread international agreement that, on the one hand, scientists should enjoy the freedom to conduct their studies, but also, on the other, that such freedom presupposes that research is conducted in a way that conforms to the principles of respect for human rights and human dignity, and according to the procedures generally established for good scientific practice. In other words, the scientific freedom to advance knowledge is tied to a responsibility to act honestly. The scientific community has, over time, developed commonly agreed standards for the production and sharing of knowledge. All forms of dishonest science violate that agreement and therefore violate a defining characteristic of science.Footnote 36 Today, in our increasingly technological, globalized, and science-driven societies, there is a need to find an adequate balance between the freedom of scientific research and other rights, interests, and values that are also crucial for society. Honesty in the conduct and communication of scientific results is undoubtedly one of those values.

Footnotes

1 Art. 15, para. 3, International Covenant on Economic, Social and Cultural Rights, 1966.

2 “Explanations Relating to the Charter of Fundamental Rights” (2007/C 303/02), December 14, 2007. Available at: https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:C:2007:303:0017:0035:EN:PDF.

3 Art. 15, para. 3 and para. 1(b), respectively, of the International Covenant on Economic, Social and Cultural Rights, 1966.

4 See, for instance, Yvonne Donders, “The Right to Enjoy the Benefits of Scientific Progress: In Search of State Obligations in Relation to Health,” Medicine, Health Care and Philosophy, 2011, 14: 371–381; William A. Schabas, “Study of the Right to Enjoy the Benefits of Scientific and Technological Progress and Its Application,” in Yvonne Donders and Vladimir Volodin (eds.), Human Rights in Education, Science and Culture: Legal Developments and Challenges, Paris: UNESCO/Ashgate Publishing, 2007, pp. 273–308; Audrey R. Chapman, “Towards an Understanding of the Right to Enjoy the Benefits of Scientific Progress and Its Applications,” Journal of Human Rights, 2009, 8(1): 1–36; Sebastian Porsdam Mann, Helle Porsdam, Christine Mitchell, and Yvonne Donders, “The Human Right to Enjoy the Benefits of the Progress of Science and Its Applications,” The American Journal of Bioethics, 2017, 17(10): 34–36; Richard P. Claude, “Scientists’ Rights and the Human Right to the Benefits of Science,” in Audrey Chapman and Sage Russell (eds.), Core Obligations: Building a Framework for ESCR, Antwerp: Intersentia, 2002, pp. 249–278.

5 All European Academies (ALLEA), European Code of Conduct for Research Integrity, 2017, Preamble.

6 Francis L. Macrina, Scientific Integrity: Text and Cases in Responsible Conduct of Research, 4th ed., Washington, DC: ASM Press, 2014, p. 1.

7 US National Academy of Sciences, Integrity in Scientific Research: Creating an Environment that Promotes Responsible Conduct, Washington, DC: National Academies Press, 2002.

8 Reproducibility is generally regarded as an important marker of the scientific nature of a study, especially in the natural sciences. It means that other scientists are able to repeat the experiment and obtain similar results. However, difficulty in reproducing a study’s results does not automatically imply that there has been misconduct.

9 David Goodstein, On Fact and Fraud: Cautionary Tales from the Front Lines of Science, Princeton: Princeton University Press, 2010, p. 4.

10 William Broad and Nicholas Wade, Betrayers of Truth: Fraud and Deceit in the Halls of Science, New York: Simon & Schuster, 1982, p. 19.

11 Goodstein, pp. 3–4.

12 Henry H. Bauer, Scientific Literacy and the Myth of the Scientific Method, Chicago: University of Illinois Press, 1992.

13 Macrina, p. 5.

14 Goodstein, p. 5.

15 Bauer, p. 40.

16 Ibid., p. 39.

17 David B. Resnik, “Scientific Misconduct and Research Integrity,” in Henk ten Have and Bert Gordijn (eds.), Handbook of Global Bioethics, Dordrecht: Springer, 2014, pp. 799–810.

18 Broad and Wade.

19 Daniel J. Kevles, The Baltimore Case: A Trial of Politics, Science, and Character, New York: W. W. Norton, 1998.

20 Marcel C. LaFollette, “The Evolution of the Scientific Misconduct Issue: An Historical Overview,” Proceedings of the Society for Experimental Biology and Medicine, 2000, 224(4): 211–215.

21 Annette Tuffs, “Fraud Investigation Concludes That Self-Regulation Has Failed,” British Medical Journal, 2000, 321(7253): 72.

22 DFG, Proposals for Safeguarding Good Scientific Practice. Bonn: Deutsche Forschungsgemeinschaft, 1998; Max-Planck-Society, Rules of Good Scientific Practice & Rules of Procedure in Cases of Suspected Misconduct. Munich: Max Planck Society, 2000 (revised in 2009).

24 Alison Abbott, “Ethics Panel Attacks Environment Book,” Nature, 2003, 421: 201.

25 Alison Abbott, “Social Scientists Call for Abolition of Dishonesty Committee,” Nature, 2003, 421: 681.

26 Andrew Wakefield et al., “Ileal-Lymphoid-Nodular Hyperplasia, Non-Specific Colitis, and Pervasive Developmental Disorder in Children,” The Lancet, 1998, 351(9103): 637–641.

27 Fiona Godlee, “The Fraud Behind the MMR Scare,” British Medical Journal, 2011, 342.

28 Sarah Boseley, “Young People in England Urged to Have MMR Vaccine Following Mumps Surge,” The Guardian, February 14, 2020.

29 Diederik Stapel and Siegwart Lindenberg, “Coping with Chaos: How Disordered Contexts Promote Stereotyping and Discrimination,” Science, 2011, 332(6026): 251–253 (retracted).

30 Levelt, Noort and Drenth Committees, Flawed Science: The Fraudulent Research Practices of Social Psychologist Diederik Stapel, commissioned by Tilburg University, the University of Amsterdam, and the University of Groningen, 2012.

31 Gretchen Vogel, “Report Finds Trachea Surgeon Committed Misconduct,” Science, May 19, 2015.

32 Helen Pidd, “German Defence Minister Resigns in PhD Plagiarism Row,” The Guardian, March 1, 2011.

33 See https://ec.europa.eu/research/participants/data/ref/h2020/other/hi/h2020-ethics_code-of-conduct_en.pdf. The Code is a revised and updated edition of the original version published in 2011.

34 Second World Conference on Research Integrity, Singapore Statement on Research Integrity, 2010. Available at: https://wcrif.org/statement.

35 US National Academies of Sciences, Engineering, and Medicine, Fostering Integrity in Research, Washington, DC: National Academies Press, 2017, p. 31.

36 Ibid., p. 32.
