
Epistemic Injustice as a Basis for Failure? Health Research Regulation, Technological Risk and the Foundations of Harm and Its Prevention

Published online by Cambridge University Press:  14 January 2020

Mark L FLEAR*
Affiliation:
School of Law, Queen’s University Belfast; email: m.flear@qub.ac.uk.

Abstract

I use the examples of medical devices, clinical trials and health data to look at the framing of harm through the language of technological risk and failure. Across the examples, there is little or no suggestion of failure by those formally responsible. Failure is seen as arising when harm becomes refracted through calculative techniques and judgments, and reaches a point where the expectations of safety built into technological framings of regulation are thwarted. Technological framings may marginalise the contribution patients, research participants and others can make to regulation, which may in turn underlie harm and lead to the construction of failure. This marginalisation may amount to epistemic injustice. Epistemic injustice and its link to failure, which has normative weight over and above harm, can present a risk to organisational standing and reputation. This risk can be used to improve the knowledge base to include stakeholder knowledges of harm, and to widen responsibilities and accountabilities. This promises to allow regulation to better anticipate and prevent harm and failure, and improve the efficacy and legitimacy of the health research enterprise.

Type
Symposium on European Union Governance of Health Crisis and Disaster Management
Copyright
© Cambridge University Press 2020


Footnotes

Many thanks to all those with whom I have discussed the ideas set out in this article, especially: Richard Ashcroft, Ivanka Antova, Tammy Hervey, Katharina Paul, Barbara Prainsack, Anniek de Ruijter, Daithi Mac Sithigh and Ilke Turkmendag. The ideas in this article were also strongly informed by my work on expectations in regulation, which I presented at invited seminars and lectures at a number of institutions: Amsterdam, Durham (Centre for Ethics and Law in the Life Sciences), Edinburgh (Mason Institute), Manchester, Maynooth, Oxford (Centre for Socio-Legal Studies) and Vienna (Department of Politics and Centre for the Study of Contemporary Solidarity). In addition, I had the pleasure of presenting early ideas on expectations and failure at the Society of Legal Scholars conference held at Queen Mary, University of London. I am grateful to participants in these events for their feedback and to the organisers for their hospitality.

References

1 As the other contributions to this special issue underline.

2 Black’s definition of regulation encompasses the focus of this article, in that it is “the intentional use of authority to affect behaviour of a different party according to set standards, involving instruments of information-gathering and behaviour modification”: see J Black, “Critical Reflections on Regulation” (2002) 27 Australian Journal of Legal Philosophy 1. This understanding of regulation includes “hard law”, “soft law”, social norms, standards and the market. For other understandings, see R Baldwin et al, “Regulation, the Field and the Developing Agenda” in R Baldwin et al (eds), The Oxford Handbook of Regulation (Oxford University Press 2011); R Baldwin et al, Understanding Regulation: Theory, Strategy, and Practice (2nd edn, Oxford University Press 2012).

3 L Kurunmäki and P Miller, “Calculating Failure: The Making of a Calculative Infrastructure for Forgiving and Forecasting Failure” (2013) 55(7) Business History 1100, at p 1100, emphasis added.

4 M Power, Organised Uncertainty (Oxford University Press 2007) p 5, emphasis added.

5 Relatedly, see S Macleod and S Chakraborty, Pharmaceutical and Medical Device Safety (Hart Publishing 2019).

6 See E Jackson, Law and the Regulation of Medicines (Hart Publishing 2012) pp 4–5.

7 This was not the first scandal concerning silicone breast implants: see generally C Greco, “The Poly Implant Prothèse Breast Prostheses Scandal: Embodied Risk and Social Suffering” (2015) 147 Social Science and Medicine 150; M Latham, “‘If It Ain’t Broke Don’t Fix It’: Scandals, Risk and Cosmetic Surgery” (2014) 22(3) Medical Law Review 384.

8 C Heneghan et al, “Ongoing Problems with Metal on Metal Hip Implants” (2012) 344 BMJ 1349.

9 “Vaginal Mesh to Treat Organ Prolapse Should Be Suspended, says UK Health Watchdog”, The Independent, 15 December 2017.

10 Quoted in “Review into PiP Implant Scandal Published”, available at <www.gov.uk/government/news/review-into-pip-implant-scandal-published> last accessed 10 December 2019. This references the report: Department of Health, Poly Implant Prothèse (PIP) Silicone Breast Implants: Review of the Actions of the Medicines and Healthcare products Regulatory Agency (MHRA) and Department of Health (2012). Also see AK Deva, “The ‘Game of Implants’: A Perspective on the Crisis-Prone History of Breast Implants” (2019) 39(S1) Aesthetic Surgery Journal S55; D Spiegelhalter et al, “Breast Implants: The Scandal, the Outcry and Assessing the Risks” (2012) 9(6) Significance 17.

11 The Guardian, 26 November 2018, emphasis added.

12 This may extend beyond physical harm to social harm, environmental harm “and so on”: see R Brownsword, Rights, Regulation and the Technological Revolution (Oxford University Press 2008) p 119. Also see pp 102–105.

13 Firestein describes “a continuum of failure” and explains how “[f]ailures can be minimal and easily dismissed; they can be catastrophic and harmful. There are failures that should be encouraged and others that should be discouraged” – before stating “[t]he list could go on”: see S Firestein, Failure: Why Science Is So Successful (Oxford University Press 2016) pp 8–9.

14 M Fricker, Epistemic Injustice: Power and the Ethics of Knowing (Oxford University Press 2007), emphasis added. Also see IJ Kidd and H Carel, “Epistemic Injustice and Illness” (2017) 34(2) Journal of Applied Philosophy 172.

15 BA Turner, Man-Made Disasters (Wykeham 1978); BA Turner and NF Pidgeon, Man-Made Disasters, 2nd edn (Butterworth-Heinemann 1997).

16 BM Hutter and M Power (eds), Organizational Encounters With Risk (Cambridge University Press 2005) p 1. Some failures are “normal accidents” and cannot be organised out of existence: see C Perrow, Normal Accidents: Living with High-Risk Technologies (Basic Books 1984). “Normal accidents” are inevitable rather than common (ibid, at p 174). In Perrow’s view, whereas the partial meltdown at the Three Mile Island nuclear power station, Pennsylvania, in 1979 was a “normal accident”, the disasters involving the Challenger space shuttle in 1986, the Union Carbide gas tragedy in Bhopal, India in 1984, and the Chernobyl nuclear power station in 1986 were not “normal accidents”. The examples considered in this article are related to limitations in the knowledge base, which is capable of adjustment, and as such they are not inevitable.

17 Kurunmäki and Miller, supra, note 3, at p 1101, emphasis added.

18 Brownsword and Goodwin describe how “while there is agreement across normative frameworks on the importance of minimising harm, differences arise in determining what it means to say that someone has been harmed. Utilitarians are likely to interpret harm in a minimalist manner; deontologists in an expansive fashion, for example, so as to include the harm done to human dignity… Harm is also likely to include, for some, violations of individual human rights”: see R Brownsword and M Goodwin, Law and the Technologies of the Twenty-First Century: Text and Materials (Cambridge University Press 2012) p 208. Also see pp 205–210.

19 Indeed, PIP silicone breast implants and vaginal mesh have been the subject of litigation – for discussion of each, see Macleod and Chakraborty, supra, note 5, at pp 232–234 and 259–263 respectively.

20 Appadurai explains that “failure is not seen in the same way at all times and in all places”: see A Appadurai, “Introduction” to Special Issue on “Failure” (2016) 83(3) Social Research xx. A key question for related work is: does the understanding of failure in terms of harm co-exist with the understanding of failure in terms of neglect or omission? This is a central question of risk regulation, where it is queried whether a regulatory focus displaces causal claims premised on negligence, nuisance or recklessness. See, for instance, S Shavell, “Liability for Harm versus Regulation of Safety” (1984) 13(2) The Journal of Legal Studies 357.

21 G Laurie, “Liminality and the Limits of Law in Health Research Regulation: What Are We Missing in the Spaces In-Between?” (2016) 25(1) Medical Law Review 47.

22 M Foucault, The Birth of Biopolitics: Lectures at the Collège de France, 1978–1979 (Palgrave Macmillan 2008). Also see T Lemke, Biopolitics: An Advanced Introduction (New York University Press 2013). For application of this thinking, see ML Flear, Governing Public Health: EU Law, Regulation and Biopolitics (Hart Publishing 2015 hb; 2018 (revised) pb), especially ch 2, ch 7, and ch 8.

23 For instance: A Riles, “A New Agenda for the Cultural Study of Law: Taking on the Technicalities” (2005) 53 Buffalo Law Review 973, at p 975.

24 H van Lente and A Rip, “Expectations in Technological Developments: An Example of Prospective Structures to Be Filled in by Agency” in C Disco and B van der Meulen (eds), Getting New Technologies Together: Studies in Making Sociotechnical Order (De Gruyter 1998) p 205. Also see H van Lente, Promising Technology: The Dynamics of Expectations in Technological Development (University of Twente 1993).

25 T Carroll et al, “Introduction: Towards a General Theory of Failure” in T Carroll et al (eds), The Material Culture of Failure: When Things Go Wrong (Bloomsbury 2018) p 15, emphasis added.

26 R Bryant and DM Knight, The Anthropology of the Future (Cambridge University Press 2019) p 28.

27 ibid, at p 134.

28 ibid, at p 58, emphasis added.

29 ibid, at p 63.

30 Beckert lists past experience amongst the social influences on expectations: see J Beckert, Imagined Futures: Fictional Expectations and Capitalist Dynamics (Harvard University Press 2016) p 91.

31 A Appadurai, “Introduction” to Special Issue on “Failure” (2016) 83(3) Social Research xxi, emphasis added. Also see A Appadurai, Banking on Words: The Failure of Language in the Age of Derivative Finance (University of Chicago Press 2016).

32 Brownsword, supra, note 12; K Yeung, “Towards an Understanding of Regulation by Design” in R Brownsword and K Yeung (eds), Regulating Technologies: Legal Futures, Regulatory Frames and Technological Fixes (Hart Publishing 2008); K Yeung and M Dixon-Woods, “Design-based Regulation and Patient Safety: A Regulatory Studies Perspective” (2010) 71(3) Social Science and Medicine 613.

33 T Dant, Materiality and Society (Open University Press 2005).

34 D MacKenzie and J Wajcman (eds), The Social Shaping of Technology, 2nd edn (Open University Press 1999); L Neven, “‘But Obviously It’s Not for Me’: Robots, Laboratories and the Defiant Identity of Elder Test Users” (2010) 32 Sociology of Health and Illness 335; G Walker et al, “Renewable Energy and Sociotechnical Change: Imagined Subjectivities of ‘The Public’ and Their Implications” (2010) 42 Environment and Planning A 931; L Winner, “Do Artefacts Have Politics?” (1980) 109 Daedalus 121.

35 Beckert, supra, note 30.

36 S Jasanoff and S-H Kim, “Containing the Atom: Sociotechnical Imaginaries and Nuclear Regulation in the US and South Korea” (2009) 47(2) Minerva 119, at p 120. Also see K Konrad et al, “Performing and Governing the Future in Science and Technology” in U Felt et al (eds), The Handbook of Science and Technology Studies, 4th edn (MIT Press 2017) p 467. This chapter provides a summary of current key understandings of “expectations”, “visions” and “imaginaries”.

37 S Jasanoff, “Future Imperfect: Science, Technology, and the Imaginations of Modernity” in S Jasanoff and S-H Kim, Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power (University of Chicago Press 2015) p 4. For wider discussion of imaginaries in STS see M McNeil et al, “Conceptualising Imaginaries of Science, Technology, and Society” in Felt et al, supra, note 36.

38 For related discussion, albeit not discussing “failure” as such, see ML Flear, “‘Technologies of Reflexivity’: Generating Biopolitics and Institutional Risk to Supplement Global Public Health Security” (2017) 8 EJRR 658.

39 Within the European Union, the applicable law is subject to transition from a trio of directives to a duo of regulations: Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and repealing Council Directives 90/385/EEC and 93/42/EEC (Text with EEA relevance) OJ L117/1; Regulation (EU) 2017/746 of the European Parliament and of the Council of 5 April 2017 on in vitro diagnostic medical devices and repealing Directive 98/79/EC and Commission Decision 2010/227/EU (Text with EEA relevance) OJ L117/176. Implementation of this legislation is left to national competent authorities, including, at the time of writing, the UK’s Medicines and Healthcare products Regulatory Agency. The competent authorities designate notified bodies to assess medical device conformity with “essential requirements”. The focus in conformity assessments is on the intended purpose and risk of a device. Where a conformity assessment finds a medical device to be compliant with the regulations, the manufacturer of the device can brand it with the CE (Conformité Européenne) mark and trade it within the EU internal market.

40 Medical devices are defined by the function the manufacturer intends them to serve for medical purposes. Art 2(1) Medical Devices Regulation defines “medical device” as “any instrument, apparatus, appliance, software, implant, reagent, material or other article intended by the manufacturer to be used, alone or in combination, for human beings for one or more of the following specific medical purposes:

  • diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of disease,

  • diagnosis, monitoring, treatment, alleviation of, or compensation for, an injury or disability,

  • investigation, replacement or modification of the anatomy or of a physiological or pathological process or state,

  • providing information by means of in vitro examination of specimens derived from the human body, including organ, blood and tissue donations, and which does not achieve its principal intended action by pharmacological, immunological or metabolic means, in or on the human body, but which may be assisted in its function by such means” (emphasis added).

41 The classification of medical devices ranges from Class I for low-risk, non-invasive devices, such as spectacles; through Class IIa for low- to medium-risk devices, usually installed within the body for between 60 minutes and 30 days, such as hearing aids, blood transfusion tubes and catheters; and Class IIb for medium- to high-risk devices, usually installed within the body for 30 days or longer, such as ventilators and intensive care monitoring equipment; to Class III for high-risk, invasive, long-term devices. Under the new EU legislation, Class III devices include the examples already noted and pacemakers that are used on a long-term basis, ie normally intended for continuous use for more than 30 days (Point 1.3, Annex VIII, Medical Devices Regulation, ibid). The only medical devices required to evidence therapeutic benefit or efficacy in controlled conditions before marketing are those that incorporate medicinal products. It is not necessary to perform clinical investigations where a device is custom-made or can be deemed similar or substantially equivalent to an existing device.

42 Specifically, the evidence required to demonstrate conformity with essential safety requirements involves a clinical evaluation. Clinical evaluation “means a systematic and planned process to continuously generate, collect, analyse and assess the clinical data pertaining to a device in order to verify the safety and performance, including clinical benefits, of the device when used as intended by the manufacturer” (Art 2(44) Medical Devices Regulation, emphasis added). The clinical evaluation verifies safety, performance, and an acceptable benefit/risk ratio. The clinical investigation is a subset of the clinical evaluation and involves “any systematic investigation involving one or more human subjects, undertaken to assess the safety or performance of a device” (Art 2(45) Medical Devices Regulation, emphasis added). The definitions of clinical evaluation and clinical investigation align with the definition of a medical device in that performance is assessed in accordance with the device’s intended function.

43 C Allan et al, “Europe’s New Device Regulations Fail to Protect the Public” (2018) 363 BMJ 4205, at p 4205.

44 CJ Heneghan et al, “Trials of Transvaginal Mesh Devices for Pelvic Organ Prolapse: A Systematic Database Review of the US FDA Approval Process” (2017) 7 BMJ Open e017125, p 1, emphasis added.

45 Equivalence to an already approved device provides a point of comparison for new medical devices seeking approval. One study traced the origins of 61 surgical mesh implants to just two original devices approved in the United States in 1985 and 1996 – see ibid.

46 Macleod and Chakraborty, supra, note 5, at p 238.

47 See references cited supra, note 40.

48 Recital 1 Medical Devices Regulation, supra, note 40.

49 Allan et al, supra, note 43, at p 4205, emphasis added.

50 Art 123(2) Medical Devices Regulation, supra, note 40.

51 See, for instance, Recital 63 Medical Devices Regulation, supra, note 40.

52 Cambridge Design Partnership, “About Us” available at <www.cambridge-design.com/about-us> last accessed 15 November 2019.

53 ibid, emphasis added.

54 See, for instance: C Heneghan et al, “Transvaginal Mesh Failure: Lessons for Regulation of Implantable Devices” (2017) 359 BMJ 5515.

55 Cambridge Design Partnership, supra, note 52, emphasis added. Moreover: “Under the [Medical Devices Directive], many people chose to look to the EU Clinical Trials Directive (and subsequent regulation) and the associated Good Clinical Practice guidelines from the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use”. That said, under the Medical Devices Regulation, supra, note 40: “The rules on clinical investigations should be in line with well-established international guidance in this field, such as the international standard ISO 14155:2011 on good clinical practice for clinical investigations of medical devices for human subjects”. The relevant global standard for trials of new medical devices and new medicines is the same: “World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects” (Recital 64, Medical Devices Regulation, supra, note 40).

56 References to consent are embedded throughout the Regulation (EU) 536/2014 on clinical trials on medicinal products for human use, and repealing Directive 2001/20/EC [2014] OJ L158/1. The Regulation is planned to apply from 2019. Key references include: Recitals 15, 17, 27, 44, 76 and 80, Art 3 and Ch V (on protection of subjects and informed consent). Preceding this Regulation was Directive 2001/20/EC on the approximation of the laws, regulations and administrative provisions of the Member States relating to the implementation of good clinical practice in the conduct of clinical trials on medicinal products for human use [2001] OJ L121/34. References to GCP are underpinned by ICH, Integrated Addendum to ICH E6(R1): Guideline for Good Clinical Practice E6(R2), Current Step 4 Version Dated 9 November 2016 (this version amends that finalised and adopted in 1996) and Declaration of Helsinki – Ethical Principles for Medical Research Involving Human Subjects (1964, as revised, most recently in 2013). The Helsinki Declaration is an instrument of the World Medical Association. Consent is a primary value in each of these. Consent is also a principle within the Universal Declaration on Bioethics and Human Rights (2005) – see Art 6 (consent) and Art 7 (persons without the capacity to consent).

57 Directive 2001/83/EC on the Community Code relating to medicinal products for human use [2001] OJ L 311/67.

58 There are four phases of clinical trials. Phase I trials use 20–80 healthy volunteers in order to determine the tolerable dose range of a new drug. Phase II trials use 100–300 subjects who have the disease or condition to be treated in order to evaluate the efficacy and safety of the drug. Phase III trials tend to be multi-centred and might involve up to 10,000 people located in 10–20 countries. This phase generates more safety and efficacy data, and the research participants included in the protocol for this type of trial tend to be those suffering from the condition the new drug is intended to treat. Phase IV trials are used to generate post-marketing data on safety and efficacy. This phase can involve millions of people.

59 Double-blind randomised controlled trials are considered the best for Phase II and III clinical trials. These trials involve random allocation to the control or the active arm of the study. Participants allocated to the control group are provided with either the best standard treatment for their condition or a placebo (an inert substance).

60 H Attarwala, “TGN1412: From Discovery to Disaster” (2010) 2(3) Journal of Young Pharmacists 332.

61 Medicines for Human Use (Clinical Trials) Regulations 2004, which implement Directive 2001/20/EC, supra, note 56.

62 G Vince, “UK Drug Trial Disaster – The Official Report”, New Scientist, 25 May 2006, available at <www.newscientist.com/article/dn9226-uk-drug-trial-disaster-the-official-report>, last accessed 10 December 2019.

63 MHRA, Investigations into Adverse Incidents during Clinical Trials of TGN1412, App 1 GCP Inspection of Parexel – Findings, 25 May 2006, available at <webarchive.nationalarchives.gov.uk/20141206175945/http://www.mhra.gov.uk/home/groups/comms-po/documents/websiteresources/con2023821.pdf> last accessed 10 December 2019.

64 MHRA, Investigations into Adverse Incidents during Clinical Trials of TGN1412, 5 April 2006, available at <webarchive.nationalarchives.gov.uk/20141206222245/http://www.mhra.gov.uk/home/groups/comms-po/documents/websiteresources/con2023519.pdf> last accessed 10 December 2019.

65 M Day, “Agency Criticises Drug Trial” (2006) 332(7553) BMJ 1290.

66 MHRA, Phase I Accreditation Scheme Requirements, Version 3, 28 October 2015, available at <assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/473579/Phase_I_Accreditation_Scheme.pdf>, last accessed 10 December 2019. For discussion, see KE Brown, “Revisiting CD28 Superagonist TGN1412 as Potential Therapeutic for Pediatric B Cell Leukemia: A Review” (2018) 6(2) Diseases 41.

67 An interim report was published on 20 July 2006 and followed up by Expert Scientific Group on Phase One Clinical Trials, Final Report, 30 November 2006, available at: <webarchive.nationalarchives.gov.uk/20130105090249/http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_063117>, last accessed 10 December 2019. For more on reviews of, and international action on, vaginal mesh implant complications, see S Barber, “Surgical Mesh Implants”, Briefing Paper, Number CBP 8108, 4 September 2019.

68 At the time of this scandal the legislation applicable to personal data was Directive 95/46/EC on the protection of individuals with regard to the processing of personal data and on the free movement of such data [1995] OJ L281/31. This Directive is now replaced by Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) [2016] OJ L119/1. Consent can be understood as an ongoing process rather than a one-time-only event: see P Allmark and S Mason, “Improving the Quality of Consent to Randomised Controlled Trials by Using Continuous Consent and Clinician Training in the Consent Process” (2006) 32 Journal of Medical Ethics 439; DC English, “Valid Informed Consent: A Process, Not a Signature” (2002) American Surgeon 45.

69 Laurie, supra, note 21, at p 53.

70 P Carter et al, “The Social Licence for Research: Why care.data Ran into Trouble” (2015) 41 Journal of Medical Ethics 404.

71 See supra, note 68.

72 Human Rights Act 1998, in particular the right to privacy under Art 8 of the European Convention on Human Rights as incorporated via Sch 1, Part 1.

73 Carter et al refer to a “rupture in traditional expectations”: supra, note 70, at p 407.

74 Carter et al, supra, note 70.

75 Laurie, supra, note 21, at p 53, emphasis added. Citing Carter et al, supra, note 70.

76 BM Hutter and S Lloyd-Bostock, Regulatory Crisis: Negotiating the Consequences of Risk, Disasters and Crises (Cambridge University Press 2017) p 209, emphasis added. For discussion, see M Lodge, “The Wrong Type of Regulation? Regulatory Failure and the Railways in Britain and Germany” (2002) 22(3) Journal of Public Policy 271; M Lodge, “Risk, Regulation and Crisis: Comparing National Responses to Food Safety Regulation” (2011) 31(1) Journal of Public Policy 25; R Schwartz and A McConnell, “Do Crises Help Remedy Regulatory Failure? A Comparative Study of the Walkerton Water and Jerusalem Banquet Hall Disasters” (2009) 52 Canadian Public Administration 91; G Wilson, “Social Regulation and Explanations of Regulatory Failure” (1984) 31 Political Studies 203.

77 Hutter and Lloyd-Bostock, ibid, at p 8, emphasis added. For further discussion of the relationship between regulatory failure and reform, see ibid, at pp 5–6.

78 Hutter and Lloyd-Bostock, supra, note 76, at p 3.

79 ibid, at p 3.

80 Kurunmäki and Miller, supra, note 3, at p 1101, emphasis added.

81 In the examples discussed in this article, legal procedures and modes of judgement seem to have played a limited role in the interpretation of harm and construction of failure – although there has been some litigation, see supra, note 19.

82 See J Law and A Mol, “Notes on Materiality and Sociality” (1995) 43 Sociological Review 279; T Pinch and WE Bijker, “The Social Construction of Facts and Artefacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other” (1984) 14 Social Studies of Science 399. Also see A Faulkner et al (eds), “Material Worlds: Intersections of Law, Science, Technology, and Society”, Special Issue (2012) 39(1) Journal of Law and Society; V Toom, “Bodies of Science and Law: Forensic DNA Profiling, Biological Bodies, and Biopower” (2012) 39(1) Journal of Law and Society 150.

83 I Hacking, Historical Ontology (Harvard University Press 2002) p 106. The key concept here is dynamic nominalism – “a fancy way of saying name-ism” – see I Hacking, The Social Construction of What? (Harvard University Press 1999) p 82. Applied within a variety of literatures, including feminist scholarship (D Haraway, The Haraway Reader (Routledge 2004)), critical race scholarship (D Roberts, “The Social Immorality of Health in the Gene Age: Race, Disability and Inequality” in J Metzl and A Kirkland (eds), Against Health (NYU Press 2010)) and disability studies (B Allen, “Foucault’s Nominalism” in S Tremain (ed), Foucault and the Government of Disability (University of Michigan Press 2018); M Oliver, Social Work and Disabled People (Macmillan 1983); M Oliver, “The Social Model of Disability: Thirty Years On” (2013) 28 Disability and Society 1024).

84 Kurunmäki and Miller, supra, note 3, at p 1101.

85 Hutter and Lloyd-Bostock, supra, note 76, at pp 19–21 on framing and routines; also see pp 9–18.

86 Adapting the discussion of the “medical gaze” in M Foucault, The Birth of the Clinic (Tavistock 1973).

87 For some starting points on the salience of gender to bodies and embodiment, see M Fox and T Murphy, “The Body, Bodies, Embodiment: Feminist Legal Engagement with Health” in M Davies and VE Munro (eds), The Ashgate Research Companion to Feminist Legal Theory (Ashgate 2013).

88 See, for example, DE Hoffmann and AJ Tarzian, “The Girl Who Cried Pain: A Bias Against Women in the Treatment of Pain” (2001) 29 Journal of Law, Medicine & Ethics 13; RW Hurley and MCB Adams, “Sex, Gender and Pain: An Overview of a Complex Field” (2008) 107(1) Anesthesia & Analgesia 309.

89 As does the decision by the US National Institutes of Health, in 2014, that the preclinical research it funds will in future ensure that investigators account for sex as a biological variable, as part of a rigour and transparency initiative. For discussion, see JA Clayton, “Studying Both Sexes: A Guiding Principle for Biomedicine” (2016) 30(2) The FASEB Journal 519.

90 MR Nolan and T-L Nguyen, “Analysis and Reporting of Sex Differences in Phase III Medical Device Clinical Trials – How Are We Doing?” (2013) 22(5) Journal of Women’s Health 399.

91 NICE, Urinary Incontinence and Pelvic Organ Prolapse in Women: Management, NICE Guideline [NG123], 2 April 2019, available at <www.nice.org.uk/guidance/ng123/chapter/Recommendations>, last accessed 10 December 2019. This guidance was issued in response to the NHS England Mesh Working Group – see Mesh Oversight Group Report, 25 July 2017, available at <www.england.nhs.uk/publication/mesh-oversight-group-report/>, last accessed 10 December 2019. Also see “Mesh Working Group”, available at <www.england.nhs.uk/mesh/>, last accessed 10 December 2019.

92 H Pike, “NICE Guidance Overlooks Serious Risks of Mesh Surgery” (2019) 365 BMJ 1537, emphasis added.

93 The contract research organisation, Parexel, failed to record in writing the full medical background of a trial subject. One principal investigator of the trial failed to update the medical history file in writing after conducting a verbal consultation with one of the trial volunteers.

94 G Pohlhaus, “Discerning the Primary Epistemic Harm in Cases of Testimonial Injustice” (2014) 28(2) Social Epistemology 99, at p 107.

95 See, for example, R Flynn, “Health and Risk” in G Mythen and S Walklate (eds), Beyond the Risk Society (Open University Press 2006). In general see F Ewald, “Insurance and Risk” in G Burchell et al (eds), The Foucault Effect: Studies in Governmentality (University of Chicago Press 1991); F Ewald and S Utz, “The Return of Descartes’ Malicious Demon: An Outline of a Philosophy of Precaution” in T Baker and J Simon (eds), Embracing Risk: The Changing Culture of Insurance and Responsibility (University of Chicago Press 2002). More generally see, for example: RV Ericson, Crime in an Insecure World (Polity 2007); L Zedner, “Fixing the Future? The Pre-emptive Turn in Criminal Justice” in B McSherry et al (eds), Regulating Deviance: The Redirection of Criminalisation and the Futures of Criminal Law (Hart Publishing 2009).

96 Power, supra, note 4, at pp 3–4. Risk objects are a type of bounded object. A bounded object approach is “where law creates artificial constructs that become the object of regulatory attention of dedicated regulators who operate within legally defined spheres of influence or ‘silos’”: see Laurie, supra, note 21, at p 11. There are, of course, related and to some extent overlapping objects, including “marketised objects”, “innovation objects”, etc. For discussion, which draws on Laurie, supra, note 21, see M Quigley and S Ayihongbe, “Everyday Cyborgs: On Integrated Persons and Integrated Goods” (2018) 26(2) Medical Law Review 276. For discussion of another contemporary risk object, “children”, see A-M McAlinden, Children as “Risk” (Cambridge University Press 2018). For further discussion, see Power, supra, note 4, at pp 7–12 and 24–28.

97 Cf R Brownsword, Rights, Regulation and the Technological Revolution (Oxford University Press 2008) pp 118–119.

98 R Sparks, “Degrees of Estrangement: The Cultural Theory of Risk and Comparative Penology” (2001) 5(2) Theoretical Criminology 159, at p 169, drawing on D Garland, Punishment and Modern Society (Oxford University Press 1990), emphasis added. On risk and cultural context, see M Douglas and A Wildavsky, Risk and Culture: An Essay on the Selection of Technical and Environmental Dangers (University of California Press 1982). Also see N Luhmann, Risk: A Sociological Theory (De Gruyter 1993).

99 N Rose and P Miller, “Political Power Beyond the State: Problematics of Government” (1992) 43(2) British Journal of Sociology 172, at p 178.

100 K Knorr Cetina, “Laboratory Studies: The Cultural Approach to the Study of Science” in S Jasanoff et al (eds), Handbook of Science and Technology Studies (Sage Publications 1995); B Latour, Science in Action: How to Follow Scientists and Engineers Through Society (Harvard University Press 1987); M Lynch and S Woolgar (eds), Representation in Scientific Practice (MIT Press 1990); A Pickering (ed), Science as Practice and Culture (University of Chicago Press 1992).

101 S Jasanoff, “The Idiom of Co-Production” in S Jasanoff (ed), States of Knowledge: The Co-production of Science and the Social Order (Routledge 2004) p 3.

102 ibid, at pp 2–3.

103 Jasanoff, supra, note 101. Also see K Knorr Cetina, “Laboratory Studies: The Cultural Approach to the Study of Science” in S Jasanoff et al (eds), Handbook of Science and Technology Studies (Sage Publications 1995); T Kuhn, The Structure of Scientific Revolutions, 2nd edn (University of Chicago Press 1970); M Lynch and S Woolgar (eds), Representation in Scientific Practice (MIT Press 1990); A Pickering (ed), Science as Practice and Culture (University of Chicago Press 1992).

104 Power, supra, note 4, at p 25, emphasis added. Also see Sparks, supra, note 98.

105 Laurie, supra, note 21, at p 52, emphasis added.

106 This includes a focus on enterprise, which as Power explains entails “control [that] is indirect and exercised by autonomous value-creating selves. It must be self-governing…constitutive of freedom and the capacity to innovate” – Power, supra, note 4, at p 197, emphasis added. The logic of enterprise and elements that constitute governance and regulation – framing, knowledges, discourses and practices that regulate everyday life, including law and expectations – are underpinned and directed by neoliberal rationality. Rose and colleagues describe rationality as “a way of doing things that… [is] oriented to specific objectives and that… [reflects] on itself in characteristic ways”: see N Rose et al, “Governmentality” (2006) 2 Annual Review of Law and Social Science 83, at p 84, emphasis added. Neoliberal rationality prioritises technical reason and means-end, or instrumental, market rationality, and disseminates it through the organisation of governance “at a distance” – see M Dean, Governmentality: Power and Rule in Modern Society, 2nd edn (Sage Publications 2009); P O’Malley, Risk, Uncertainty and Government (Routledge 2004).

107 ML Flear, “Regulating New Technologies: EU Internal Market Law, Risk, and Socio-Technical Order” in M Cremona (ed), New Technologies and EU Law (Oxford University Press 2016) p 7. See further ML Flear et al, European Law and New Health Technologies (Oxford University Press 2013).

108 Black notes that the rhetoric of risk is a “useful legitimating device”: J Black, “The Emergence of Risk-Based Regulation and the New Public Risk Management in the United Kingdom” (2005) Public Law 512, at p 519; Power, supra, note 4.

109 E Goffman, Frame Analysis: An Essay on the Organisation of Experience (Harvard University Press 1974); M Hajer and D Laws, “Ordering Through Discourse” in M Moran et al (eds), The Oxford Handbook of Public Policy (Oxford University Press 2006); VA Schmidt, “Discursive Institutionalism: The Explanatory Power of Ideas and Discourse” (2008) 11 Annual Review of Political Science 303; M Rein and D Schön, “Problem Setting in Policy Research” in C Weiss (ed), Using Social Research in Public Policy Making (Lexington Books 1977).

110 M Akrich, “The De-Scription of Technical Objects” in WE Bijker and J Law (eds), Shaping Technology/Building Society: Studies in Sociotechnical Change (MIT Press 1992); M Borup et al, “The Sociology of Expectations in Science and Technology” (2006) 18(3–4) Technology Analysis and Strategic Management 285; N Brown and M Michael, “A Sociology of Expectations: Retrospecting Prospects and Prospecting Retrospects” (2003) 15(1) Technology Analysis and Strategic Management 4.

111 Y Ezrahi, Imagined Democracies (Cambridge University Press 2012) p 38, emphasis added.

112 ibid, at p 42, emphasis added. Also see B Anderson, Imagined Communities (Verso 2006); C Taylor, Modern Social Imaginaries (Duke University Press 2004).

113 W Brown, Regulating Aversion (Princeton University Press 2006) p 15; S Jasanoff, Designs on Nature (Princeton University Press 2005) pp 5–6.

114 D Bell, The Coming of Post-Industrial Society: A Venture in Social Forecasting (Basic Books 1976); M Castells, The Rise of the Network Society (The Information Age, Vol I) (Blackwell 1996); K Knorr Cetina, Epistemic Cultures. How the Sciences Make Knowledge (Harvard University Press 1999); N Stehr, Knowledge Societies (Sage Publications 1994).

115 For discussion of neoliberalism, see references supra, note 106.

116 S Jasanoff and B Wynne, “Science and Decision-Making” in S Rayner and EL Malone (eds), Human Choice and Climate Change, Volume 1: The Societal Framework (Battelle Press 1998).

117 D Callahan, “The Social Sciences and the Task of Bioethics” (1999) 128(4) Daedalus 275, at p 276.

118 P Farmer, Pathologies of Power: Health, Human Rights, and the New War on the Poor (University of California Press 2003) pp 204–205.

119 JR Garrett, “Two Agendas for Bioethics: Critique and Integration” (2015) 29 Bioethics 440, at p 442.

120 AM Hedgecoe, “Critical Bioethics: Beyond the Social Science Critique of Applied Ethics” (2004) 18 Bioethics 120, at p 125. Also see B Hoffmaster (ed), Bioethics in Social Context (Temple University Press 2001).

121 RA Dahl, Democracy and its Critics (Yale University Press 1989) p 335.

122 J Benjamin Hurlbut, “Remembering the Future: Science, Law, and the Legacy of Asilomar” in S Jasanoff and S-H Kim, Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power (University of Chicago Press 2015) p 129, original emphasis.

123 For discussion, see Hutter and Lloyd-Bostock, supra, note 76, at pp 13–15. In general see I Hacking, The Taming of Chance (Cambridge University Press 1990); TM Porter, Trust in Numbers: The Pursuit of Objectivity in Science and Public Life (Princeton University Press 1995); A Desrosières, The Politics of Large Numbers: A History of Statistical Reasoning (Harvard University Press 1998); WN Espeland and ML Stevens, “A Sociology of Quantification” (2008) 49(3) European Journal of Sociology 401.

124 M Foucault, Society Must Be Defended (Penguin Books 2004) p 7.

125 Fricker, supra, note 14, referring to text cited in the introduction.

126 J Medina, “Hermeneutical Injustice and Polyphonic Contextualism: Social Silences and Shared Hermeneutical Responsibilities” (2012) 26(2) Social Epistemology 201, at p 217, emphasis added.

127 Heneghan et al, supra, note 54, at p 5.

128 A Irwin and M Michael, Science, Social Theory and Public Knowledge (Open University Press 2003); A Kerr et al, “The New Genetics and Health: Mobilising Lay Expertise” (1998) 7(1) Public Understanding of Science 41; BE Wynne, “Knowledges in Context” (1991) 16 Science, Technology and Human Values 111; BE Wynne, “Misunderstood Misunderstandings: Social Identities and Public Uptake of Science” (1992) 1 Public Understanding of Science 281.

129 S Jasanoff, “Technologies of Humility: Citizen Participation in Governing Science” (2003) 41 Minerva 223; Jasanoff, supra, note 113, ch 10 “Civic Epistemology”. Also see B Wynne, “Uncertainty and Environmental Learning: Reconceiving Science and Policy in the Preventive Paradigm” (1992) 2(2) Global Environmental Change 111.

130 See A Wylie, “Why Standpoint Matters” in S Harding (ed), The Feminist Standpoint Reader: Intellectual and Political Controversies (Routledge 2004); V Rabeharisoa et al, “Evidence-based Activism: Patients’, Users’ and Activists’ Groups in Knowledge Society” (2014) 9(2) BioSocieties 111.

131 Cf A Barry, Political Machines: Governing a Technological Society (Athlone Press 2001).

132 N Rose, The Politics of Life Itself: Biomedicine, Power and Subjectivity in the 21st Century (Princeton University Press 2007).

133 P Rabinow, Essays on the Anthropology of Reason (Princeton University Press 1996); S Gibbon and C Novas (eds), Biosocialities, Genetics and the Social Sciences (Routledge 2007).

134 Such as: genetic citizens (D Heath et al, “Genetic Citizenship” in D Nugent and J Vincent (eds), A Companion to the Anthropology of Politics (Blackwell 2004)); stories of resistance (E Loja et al, “Disability, Embodiment and Ableism: Stories of Resistance” (2013) 28(2) Disability & Society 190); moral pioneers (R Rapp, Testing Women, Testing the Fetus: The Social Impact of Amniocentesis in America (Routledge 2000)); biological citizens (N Rose and C Novas, “Biological Citizenship” in A Ong and S Collier (eds), Global Assemblages: Technology, Politics, and Ethics as Anthropological Problems (Blackwell 2005), cf J Biehl, Will to Live: AIDS Therapies and the Politics of Survival (Princeton University Press 2007)); therapeutic citizens (V-K Nguyen, “Antiretroviral Globalism, Biopolitics, and Therapeutic Citizenship” in A Ong and SJ Collier (eds), Global Assemblages: Technology, Politics, and Ethics as Anthropological Problems (Blackwell Publishing 2005)).

135 A Moore, “Beyond Participation: Opening-Up Political Theory in STS” (2010) 40(5) Social Studies of Science 793. This is a review of MB Brown, Science in Democracy: Expertise, Institutions and Representation (MIT Press 2009).

136 Medina, supra, note 126, at p 218. This responsibility is grounded in virtue theory. For discussion see Fricker, supra, note 14.

137 Medina, supra, note 126, at p 215.

138 M Fineman, “The Vulnerable Subject and the Responsive State” (2010) 60 Emory Law Journal 251.

139 Including precarity (J Butler, Precarious Life: The Power of Mourning and Violence (Verso 2005)); the capabilities approach (M Nussbaum, Creating Capabilities (Harvard University Press 2011); A Sen, “Equality of What?” in S McMurrin (ed), Tanner Lectures on Human Values, Volume 1 (Cambridge University Press 1980)); depletion (B Goldblatt and SM Rai, “Recognizing the Full Costs of Care? Compensation for Families in South Africa’s Silicosis Class Action” (2017) 27 Social & Legal Studies 671); a feminist approach to flesh (C Beasley and C Bacchi, “Envisaging a New Politics for an Ethical Future: Beyond Trust, Care and Generosity – Towards an Ethic of Social Flesh” (2007) 8 Feminist Theory 279); and the social body (S Lewis and M Thomson, “Social Bodies and Social Justice” (2019) International Journal of Law in Context 1).

140 This includes understanding in epigenetics and neuroscience: see M Pickersgill, “Neuroscience, Epigenetics and the Intergenerational Transmission of Social Life: Exploring Expectations and Engagements” (2014) 3(3) Families, Relationships and Societies 481; N Rose and J Abi-Rached, Neuro: The New Brain Sciences and the Management of the Mind (Princeton University Press 2013); N Rose and J Abi-Rached, “Governing Through the Brain: Neuropolitics, Neuroscience and Subjectivity” (2014) 32(1) Cambridge Anthropology 3; D Wastell and S White, Blinded by Science: The Social Implications of Epigenetics and Neuroscience (Policy Press 2017).

141 M Meloni, “How Biology Became Social, and What It Means for Social Theory” (2014) 62 The Sociological Review 593, at p 595.

142 To borrow from the title of the following: M Thomson, “Bioethics & Vulnerability: Recasting the Objects of Ethical Concern” (2018) 67(6) Emory Law Journal 1207.

143 Most notably, see Fineman, supra, note 138.

144 Sex differences are central to the development of personalised medicines – as mentioned by Nolan and Nguyen, supra, note 90.

145 R Ashcroft, “Fair Process and the Redundancy of Bioethics: A Polemic” (2008) 1 Public Health Ethics 3.

146 Thomson, supra, note 142.

147 See, especially, Jasanoff, supra, note 113.

148 A Moore, “Public Bioethics and Public Engagement: The Politics of ‘Proper Talk’” (2010) 19(2) Public Understanding of Science 197, at p 197. Also A Moore, “Public Bioethics and Deliberative Democracy” (2010) 58 Political Studies 715.

149 For discussion, see Flear, supra, note 22, ch 7, especially pp 205–206.

150 Power, supra, note 4, at p 11, emphasis added.

151 J Downer, “Anatomy of a Disaster: Why Some Accidents Are Unavoidable” (2010) CARR Discussion Paper No 61, p 20, emphasis added.

152 Turner, and Turner and Pidgeon, supra, note 15.

153 B Wynne, “Risk as a Globalising ‘Democratic’ Discourse? Framing Subjects and Citizens” in M Leach et al (eds), Science and Citizens: Globalisation and the Challenge of Engagement (Zed Books 2005). Also see A Boin et al (eds), The Politics of Crisis Management: Public Leadership Under Pressure (Cambridge University Press 2005); SG Breyer, Breaking the Vicious Circle: Toward Effective Risk Regulation (Harvard University Press 1993); D Demortain, “From Drug Crises to Regulatory Change: The Mediation of Expertise” (2008) 10(1) Health Risk & Society 37.

154 For discussion, see Flear, supra, note 22, especially ch 1 and ch 6.

155 F Fischer, Reframing Public Policy: Discursive Politics and Deliberative Practices (Oxford University Press 2003); DA Schön and M Rein, Frame Reflection: Toward the Resolution of Intractable Policy Controversies (Basic Books 1994).

156 Part of what Jasanoff describes as the “civic epistemology” informing societal choices about technoscience – Jasanoff, supra, note 113, ch 10. On risk society, see U Beck, Risk Society: Towards a New Modernity (Sage Publications 1992); U Beck, World Risk Society (Polity 2009); A Giddens, Modernity and Self-Identity: Self and Society in the Late Modern Age (Stanford University Press 1991); N Luhmann, Observations on Modernity (Stanford University Press 1998). Also see H Kemshall, “Social Policy and Risk” in G Mythen and S Walklate (eds), Beyond the Risk Society (Open University Press 2006).

157 Power, supra, note 4, at p 21, emphasis added.

158 It is worth noting that, as regards vaginal mesh, “[l]itigation did not inform the regulatory decisions” made in the wake of failure – Macleod and Chakraborty, supra, note 5, at p 264. These authors also do not seem to attribute the regulatory approach to PIP silicone breast implants to litigation. However, this statement was made before the decision of the Federal Court of Australia in Gill v Ethicon Sarl (No 5) [2019] FCA 1905. This case involved a class action against members of the Johnson & Johnson group in which the Court found in favour of the claimants.

159 Power, supra, note 4, at p 6, emphasis added.

160 C Hood, The Blame Game: Spin, Bureaucracy, and Self-Preservation in Government (Princeton University Press 2011). Also see Hutter and Lloyd-Bostock, supra, note 76, at pp 209–213.

161 N Pidgeon et al, The Social Amplification of Risk (Cambridge University Press 2003).

162 Boin et al, supra, note 153.

163 On “regulatory failure”, see Section II.2; on “regulatory crisis”, see Hutter and Lloyd-Bostock, supra, note 76.

164 See Fineman, supra, note 138; M Fineman, “Equality, Autonomy, and the Vulnerable Subject in Law and Politics” in M Fineman and A Grear (eds), Vulnerability: Reflections on a New Ethical Foundation for Law and Politics (Ashgate 2013). For one recent deployment, see NM Ries and M Thomson, “Bioethics & Universal Vulnerability: Exploring the Ethics and Practices of Research Participation” (2019) Medical Law Review (forthcoming).

165 General Principle 13 Helsinki Declaration: “Groups that are underrepresented in medical research should be provided appropriate access to participation in research”.

166 Art 4 Universal Declaration on Bioethics and Human Rights (2005), emphasis added. Also see Additional Protocol to the Convention on Human Rights and Biomedicine, concerning Biomedical Research (25 January 2005, entered into force 1 September 2007) CETS 195. In addition see Convention for the Protection of Human Rights and Dignity of the Human Being with regard to the Application of Biology and Medicine: Convention on Human Rights and Biomedicine (4 April 1997, entered into force 1 December 1999) ETS 164 (often referred to simply as the Oviedo Convention). For discussion, see Brownsword, supra, note 12, at pp 102–105.

167 On which, including a discussion of the potential of human rights and bioethics to both narrow and widen participation, see Flear, supra, note 22. Also see T Murphy, “Technology, Tools and Toxic Expectations: Post-Publication Notes on New Technologies and Human Rights” (2009) 2 Law, Innovation and Technology 181.

168 For discussion, see R Ashcroft, “Could Human Rights Supersede Bioethics?” (2010) 10(4) Human Rights Law Review 639, at pp 645–646.

169 In relation to biomedicine see S Epstein, Impure Science (University of California Press 1996). On risk and social mobilisation, see references to Beck, supra, note 156.

170 R Doubleday and B Wynne, “Despotism and Democracy in the United Kingdom: Experiments in Reframing Citizenship” in S Jasanoff (ed), Reframing Rights: Bioconstitutionalism in the Genetic Age (MIT Press 2011).

171 One of the most promising lines of enquiry is vulnerability theory. See references supra, note 164.

172 For a review of approaches to the collection of data, see DB Kramer et al, “Ensuring Medical Device Effectiveness and Safety: A Cross-National Comparison of Approaches to Regulation” (2014) 69(1) Food Drug Law Journal 1. The EU’s new legislation on medical devices has sought to improve, inter alia, post-marketing data collection, such as through take-up of the Unique Device Identification (UDI) system, which is used to mark and identify medical devices within the supply chain. For discussion of this and other aspects of the EU’s new legislation, see AG Fraser et al, “The Need for Transparency of Clinical Evidence for Medical Devices in Europe” (2018) 392 The Lancet 521.

173 S Sauerland et al, “Premarket Evaluation of Medical Devices: A Cross-Sectional Analysis of Clinical Studies Submitted to a German Ethics Committee” (2019) 9 BMJ Open e027041.

174 Heneghan et al, supra, note 54. Also see B Campbell et al, “How Can We Get High Quality Routine Data to Monitor the Safety of Devices and Procedures?” (2013) 346 BMJ 2782.

175 M Eikermann et al, “Signatories of Our Open Letter to the European Union. Europe Needs a Central, Transparent, and Evidence Based Regulation Process for Devices” (2013) 346 BMJ 2771; AG Fraser et al, “The Need for Transparency of Clinical Evidence for Medical Devices in Europe” (2018) 392 The Lancet 521.

176 V Xafis et al, “Ethics Framework for Big Data in Health and Research” (2019) 11(3) Asian Bioethics Review 227, at p 245.

177 ibid, at p 246.

178 Laurie, supra, note 21, at p 71, emphasis added.

179 Carroll et al, supra, note 25, at p 2, emphasis added. There is a slippage here between anticipation and expectation – but recall that the latter is central to the conditions of possibility for failure, at least as they are understood in this article.

180 Modes of description create possibilities for action – that is: “if new modes of description come into being, new possibilities for action come into being as a consequence”: see I Hacking, “Making Up People” in T Heller et al (eds), Reconstructing Individualism: Autonomy, Individuality and the Self in Western Thought (Stanford University Press 1986) p 231, emphasis added.

181 L McGoey, Unknowers: How Strategic Ignorance Rules the World (Zed Books 2019). See further KT Paul and C Haddad, “Beyond Evidence Versus Truthiness: Toward a Symmetrical Approach to Knowledge and Ignorance in Policy Studies” (2019) 52(2) Policy Sciences 299.