
Using Continuous Quality Improvement in Community-based Programming During Disasters: Lessons Learned from the 2015 Ebola Crisis in Sierra Leone

Published online by Cambridge University Press:  12 December 2024

Cora Nally*
Affiliation:
Medicine and Health Science, Ghent University, Gent, Belgium
Marleen Temmerman
Affiliation:
Medicine and Health Science, Ghent University, Gent, Belgium
Patrick Van de Voorde
Affiliation:
Medicine and Health Science, Ghent University, Gent, Belgium
Adama Koroma
Affiliation:
Public Health Consultant, Freetown, Sierra Leone
Mary Adam
Affiliation:
Kijabe Hospital, Kijabe, Kenya
*
Corresponding author: Cora Nally; Email: Cora.Nally@ugent.be

Abstract

This paper describes a continuous quality improvement (CQI) process of collecting and analyzing field-level qualitative data in an ongoing cycle. These data can be used to guide decision making for effective emergency response. Integrating medical and community components from the earliest stages of a disaster allows for true collaboration and enables the CQI process to respond to evolving data. Our CQI process identified and addressed gaps in communication and coordination, problems with strategy implementation, and, on a conceptual level, gaps in the disaster response model. The 2015 Ebola crisis in Sierra Leone provides a case study demonstrating the improved effectiveness of a CQI approach in the humanitarian setting, both in reducing disease spread and in meeting the broader needs of the population served.

Type
Concepts in Disaster Medicine
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of Society for Disaster Medicine and Public Health, Inc.

Continuous quality improvement (CQI) has been in common use as a business model since the 1920s.[7] It is now an emerging strategy in the field of humanitarian response, enlarging the toolkit of first responders in disaster management and response. CQI is particularly well suited to community-based programming in disasters because community engagement at scale is essential for implementing evidence-based solutions. We use our experience during the Ebola crisis in Sierra Leone to demonstrate the value and impact of CQI. As this case study shows, CQI uses real-time feedback loops to provide data for decision making at the front line, not just at central command. CQI facilitates incorporating local actors as well as an understanding of local perceptions of how disease processes work. In so doing, it addresses relevant cultural practices and supports an informed messaging strategy.

The 2014 Ebola epidemic in Sierra Leone began with rapid spread of the virus as a result of multiple interacting factors.[1] These included the slow recognition of the dangers Ebola posed, the lack of information at the household level, and an overall weak health system. These factors compounded to delay information from reaching people in positions of power and hampered the coordination of a large-scale response to stop the disease.[1] As the epidemic unfolded, the behavior of Sierra Leoneans was increasingly motivated by fear, fueled by the lack of consistent messaging from all levels of government. This, in turn, caused the national government to become increasingly restrictive, halting border crossings as well as local travel. These restrictions had the unintended effect of spiraling fear of the disease without driving any effective messaging on prevention. Because many of the early deaths from Ebola were health care providers, fear was pervasive and extended to health care workers. Community health clinics were abandoned out of fear of the virus.[1] By October 2014, approximately 5 months after the first Ebola cases were identified, Sierra Leone experienced a simultaneous, fear-driven collapse of the health system and of local and national governance structures. According to several documented sources, the government significantly delayed declaring a national emergency.[16]

The Sierra Leone population was no longer sure whom to trust or where to turn, which generated fear-based responses at the population level. One example was the disregard of circulating information bearing the logo of the Ministry of Health, even when it was produced by disaster response organizations.[3] Myths about the “strange disease” abounded. Prevention measures such as restrictions on washing and touching dead family members conflicted with existing cultural practices and were, therefore, not trusted. In Port Loko, an urban center in the west of Sierra Leone, there was a widespread lack of confidence in the emergency health care systems that had been set up by the local government together with the foreign military and international non-governmental organizations (INGOs) that had arrived to assist.[1] Despite their best efforts, by early 2015 the district was coping with rising infection rates and deaths.

As demonstrated by our case study, outlined below, disasters are often dynamic in nature, even when they result from a single event. They require programming that is “responsive” to the ever-evolving situation.[2] CQI allows models to be flexible and adaptive, with a Plan-Do-Study-Act (PDSA) loop applied to each of its interventions.[3] Real-time, field-level data drive such dynamic feedback loops. Models of disaster response often assume the underlying state of things to be static and approach both planning and subsequent implementation in a linear way.[4] Such programs may plan and do but fail to study the effectiveness of their interventions further and to act upon the new reality. “Unresponsive” implementation strategies increase problems instead of resolving them.[5]

Emergency response actors often arrive with a pre-existing agenda aligned with their own experiences or institutional directives, as stated by Vasovic et al. Importantly, they expect a central coordinating agency to work hand in hand with national and local governance systems. These central coordinating agencies rely heavily on local data input, which is usually available only at the national level; many therefore struggle in the absence of detailed, community-level data. Feedback loops allow community- and household-level data to be appreciated. A good example of the application of feedback loops in public health was presented in 1994 by Rissel.[15] He specifically discussed the role of “community empowerment” as a process that centers on a sense of community and results, by means of feedback loops, in community members obtaining control over their own resources and eventually gaining autonomy in the emergency response process. He also identified the critical importance of access to evidence-based medical information in the case of an infectious disease-induced disaster. There are many examples of emergency response actors developing information messages with limited indigenous contributions, distrusting local, culturally driven communication networks (e.g., word of mouth, vernacular radio programs, or community meetings), and working with “unadapted” timeframes.[9] Messages with insufficient cultural sensitivity may be technically correct but misunderstood, rendering no benefit at the household level where disease is being transmitted and behavioral decisions are being made. This problem of communication further contributes to the general lack of trust encountered in the communities.

In developing this paper, we also undertook a structured rapid review of the current literature using the following keywords: CQI, quasi-experimental study design, plan-do-study-act cycles, emergency response, rapid response, community-based programs, and emergency response evaluation/qualitative/monitoring tools. There is significant published research on gathering qualitative data during an emergency response, but very little on how to analyze those data and use them to improve programming. We discovered very few articles in peer-reviewed journals that addressed completing PDSA cycles and incorporating CQI into program design in the emergency response setting. A case study published by the UNHCR Office in Skopje came closest to our findings and conclusions, but clearly more research needs to be done in varied emergency response settings.[13]

Methodology and Approach to Data Collection

Based on our success with CQI in the humanitarian context, we felt compelled to share how we had utilized this tool, which is mostly associated with business models and health care systems. In this paper, we describe a dynamic CQI model using continuous, real-time data feedback loops. Feedback was sought from all stakeholders. The usual disaster response approach is to include feedback at a national and/or local aggregate level; the CQI described here differs. Engagement of the front-line local community was required to gain a true field perspective. We needed to ask why in order to understand the context: the responding team had to ask specifically why certain behaviors were being chosen or avoided by the local population. Each member of the response team collecting data, which included 10 Sierra Leoneans and 1 consultant from Partners in Health (PIH), was charged with continuously seeking answers to 3 questions in their interactions with community members, government officials, other responders, and health care providers: What is working? What isn’t working? How can we do better? These questions were asked and answered using the Socratic method; qualitative data were then noted and shared during planned weekly meetings with direct supervisors. Qualitative data were received by the 11 office staff mentioned above and written into a shared report providing comprehensive feedback to decision makers at local parish, district, and national levels. This resulted in continuous adjustments to programming at all levels of implementation. The listening and learning posture of CQI engenders trust and thus improves compliance with containment messages. It also resulted in improved coordination between the different arms and actors of the external response team, as well as improved population-level outcomes.[5]

Case study: Feedback Loops as Part of a CQI in the 2015 Sierra Leone Disaster Response

Partners in Health, in cooperation with the Sierra Leone government, developed a responsive community health worker (CHW) network in the face of a collapsed health system, in order to support the emergency medical response and to extend the emergency health system into the impacted communities. The government of Sierra Leone had adopted a community health worker framework well before the Ebola epidemic and was, therefore, familiar with the benefits of community-based interventions. The geographic areas covered by this program included Lokosama, Port Loko, and Kaffu Bullum, all chiefdoms within the district of Port Loko in Sierra Leone, with a population of roughly 260,000. A 4-arm program model was developed after an initial rapid qualitative assessment identified gaps in the disaster response.[10] The gaps identified are listed in Table 1. Rapid feedback loops were incorporated into field program design to address these gaps; examples of gaps identified via feedback loops, and how they were addressed, can be seen in Table 2. These feedback loops included weekly meetings with direct reports and direct supervisors, facilitating the rapid movement of this feedback to the decision makers for the response.

Table 1. Identified gaps

Table 2. Feedback loops and PDSA cycle in practice

The CQI program integrated information from many different sources (program staff, other NGOs, government partners, the British military, community leaders, and community members) on a regular basis. Feedback loops were utilized at all program levels, according to plan-do-study-act cycles and informed by real-time field data (Nally et al., 2021). Clear lines of communication were delineated at each level of the program management structure (who reports to whom, and where information should flow to and from) so that meaningful data for decision making do not get lost. At each level, leaders are identified and carry the responsibility to continuously gather and share the information needed (Figure 1). Too often, data are gathered via feedback loops but largely remain ignored.[12] The key to feedback loops is their cyclic nature (plan-do-study-act), whereby data lead to the identification of potential gaps and to adjusted decision making that improves program delivery.

Figure 1. Feedback Loops for the Port Loko Ebola Response.

Figure 1 lays out the feedback loops occurring at each level of the management structure during the Ebola response in Sierra Leone. The weekly gathering of information and the feeding of it both “up and down” the management structure are integral to the design of the program. This model can be adapted and used by emergency response programs in many settings. It requires leaders at each level to accept responsibility for gathering and sharing information continuously. Through this constant cycling of information, program activities can be adjusted immediately to accommodate the evolving disaster or to close gaps in implementation.

The Feedback Loops and PDSA Cycle in Practice

The key to this process is using the information gathered to inform decision making for the program. There are countless instances where information is gathered in this way and then ignored because of various biases.[12] By incorporating the feedback, a program can build trust in itself and its implementers, respond more effectively to the changing nature of a disaster, and ensure that resources are used to greatest effect. Below are some examples from this program highlighting information that the feedback loops provided and that was acted upon to improve program delivery and impact. In most cases, other organizations or people had begun the PDSA cycle but not completed it, thus stalling or slowing the response and impeding its adaptability to the evolving context. After these highlighted cycles were implemented, the feedback loops continued to validate or highlight gaps in the implementation and disaster response, allowing each of the 4 arms of programming to be adjusted.

Discussion

In the above case study, we demonstrate how a structured plan-do-study-act approach can rapidly highlight issues related to the implementation of emergency response programs. Once information was fed back to emergency responders, they acted to rectify these gaps in coordination and implementation.[10] Together with the Sierra Leonean government, they developed and integrated community-based responses as part of broader CQI feedback loops.[14]

Feedback loops creating data for decision making are a part of CQI, but where and how this information is gained is often ignored, or its importance diminished, at the national coordination and implementation level.[15] Many disaster response models do not build in community-based feedback loops. Medical data on case finding are gathered and pushed out, but community-level implementation data received from the community actors doing the case finding are not considered relevant to implementation and impact; they are therefore not analyzed or used to inform programmatic decisions. Household-level decision makers need data they can trust. Too often in disaster response, education is viewed as a tertiary program rather than as integral to reducing fear-based responses and integrating the community. It can be easier, in some cases, to rely on fear as a motivator for compliance. Two examples of this are highlighted in our case study. These challenges were remedied once frontline information was fed back to decision makers at the national coordination level and at the household level.[1]

UNHCR published an interesting case study highlighting the importance of feedback loops in building trust in emergency response and disaster settings. The authors propose a structure similar to ours and highlight their own success in completing PDSA cycles. However, there were very few other field-tested examples, and none that focused on the importance of closing these PDSA cycles completely and running them continuously during the implementation period. The importance of feedback loops has most recently been highlighted anecdotally during the global coronavirus pandemic, as many governments and responding bodies have struggled with messaging and with securing the wider population’s compliance with restrictions.[1] This has necessitated the use of feedback loops, the CQI process, and PDSA cycles, whether formally or not,[10] bringing their importance to the forefront of the current global public health climate.

The tendency of disaster responders to arrive with a prepackaged or preconceived idea of how the response should proceed ignores the impact and importance of indigenous systems and beliefs, to the detriment of the health and lives at stake.[5] CQI is not often thought of as a method for responding to evolving disasters, yet the science of quality improvement and the ability to use real-time data for decision making are hallmarks of both good disaster response and good QI processes. For this program, community members were directly involved in the PDSA cycles.[5] CQI uses many small feedback loops to test both process and outcome measures.

While in practice feedback loops and quick, responsive program adaptations do increase trust and impact in disaster response, it becomes challenging to measure impact effectively over time. Traditional models of assessment are difficult to apply when parts of a program or implementation plan are constantly evolving. One can have a data point to start with and a clear idea of the hoped-for end point, but many models of quantitative and qualitative research require periodic measurements of the same data point, which becomes difficult if the program evolves and that data point is no longer relevant to the outcome sought at the start of the process.

The 2015 Ebola case study provides an excellent example of an evolving epidemic that requires real-time data feedback loops so that implementation adjustments to programming can reflect, and have an impact on, the situation as it evolves. In this paper, we demonstrated how simple feedback loops produced data that guided our response adaptation and helped us “keep up” with the ever-evolving epidemic and community needs. Further field testing is necessary to understand how traditional measures of success can still be applied to disaster implementation.

References

1. Arab-Zozani M, Ghoddoosi-Nejad D. COVID-19 in Iran: the good, the bad and the ugly strategies for preparedness - a report from the field. Disaster Med Public Health Prep. 2020:1–3. https://doi.org/10.1017/dmp.2020.261
2. Callaway DW, Yim ES, Stack C, et al. Integrating the disaster cycle model into traditional disaster diplomacy concepts. Disaster Med Public Health Prep. 2012;6(1):53–59. https://doi.org/10.1001/dmp.2012.5
3. Bennett S, Agyepong IA, Sheikh K, et al. Building the field of health policy and systems research: an agenda for action. PLoS Med. 2011;8(8):1–5. https://doi.org/10.1371/journal.pmed.1001081
4. Christoff P. Running PDSA cycles. Curr Probl Pediatr Adolesc Health Care. 2018;48(8):198–201. https://doi.org/10.1016/j.cppeds.2018.08.006
5. Feser E. Isserman’s impact: quasi-experimental comparison group designs in regional research. Int Reg Sci Rev. 2013;36(1):44–68. https://doi.org/10.1177/0160017612464051
6. Harris AD, McGregor JC, Perencevich EN, et al. The use and interpretation of quasi-experimental studies in medical informatics. J Am Med Inform Assoc. 2006;13(1):16–23. https://doi.org/10.1197/jamia.M1749
7. Krishnan P. A review of the non-equivalent control group post-test-only design. Nurse Res. 2019;26(2):37–40. https://doi.org/10.7748/nr.2018.e1582
8. Means AR, Wagner AD, Kern E, et al. Implementation science to respond to the COVID-19 pandemic. Front Public Health. 2020;8:462. https://doi.org/10.3389/fpubh.2020.00462
9. Paek H, Niess M, Padilla B, et al. A community health center blueprint for responding to the needs of the displaced after a natural disaster: the Hurricane Maria experience. J Health Care Poor Underserved. 2018;29(2). https://doi.org/10.1353/hpu.2018.0040
10. Qiu W, Chu C. Clarification of the concept of risk communication and its role in public health crisis management in China. Disaster Med Public Health Prep. 2021;13(5):2019–2021. https://doi.org/10.1017/dmp.2019.10
11. Rwabukwisi FC, Bawah AA, Gimbel S, et al. Health system strengthening: a qualitative evaluation of implementation experience and lessons learned across five African countries. BMC Health Serv Res. 2017;17(Suppl 3). https://doi.org/10.1186/s12913-017-2662-9
12. Serenko A, Bontis N. Meta-review of knowledge management and intellectual capital literature: citation impact and research productivity rankings. Knowl Process Manag. 2004;11(3):185–198. https://doi.org/10.1002/kpm.203
13. Internews. The space between us: trust, communication, and collaboration between media and humanitarian organizations in public health emergencies. 2020. https://internews.org/resource/the-space-between-us-trust-communication-and-collaboration-between-media-and-humanitarian-organizations-in-public-health-emergencies/
14. Hoffman K. Placing enterprise and business thinking at the heart of the war on poverty. Reinventing Foreign Aid. 2008:485–502.
15. Rissel C. Empowerment: the holy grail of health promotion? Health Promot Int. 1994;9(1):39–47. https://doi.org/10.1093/heapro/9.1.39
16. Harris MJ. Evaluating Public and Community Health Programs. 2nd ed. Jossey-Bass; 2017.
17. Kellerborg K, Brouwer W, van Baal P. Costs and benefits of early response in the Ebola virus disease outbreak in Sierra Leone. Cost Eff Resour Alloc. 2020;18:13. https://doi.org/10.1186/s12962-020-00207-x
18. Provost LP. Analytical studies: a framework for quality improvement design and analysis. BMJ Qual Saf. 2011;20(Suppl 1):i92–i96. https://doi.org/10.1136/bmjqs.2011.051557
19. Langley A. Strategies for theorizing from process data. Acad Manag Rev. 1999;24(4):691–710. https://doi.org/10.2307/259349
20. Vasovic D, Janackovic G, Musicki S. Model of effective civil-military collaboration in natural disaster risk management. In: Gocić M, Aronica G, Stavroulakis G, Trajković S, eds. Natural Risk Management and Engineering. Springer Tracts in Civil Engineering. Springer, Cham; 2020. https://doi.org/10.1007/978-3-030-39391-5_2
21. Burkle FM, Frost DS, Greco SB, et al. Strategic disaster preparedness and response: implications for military medicine under joint command. Mil Med. 1996;161(8):442–447. https://doi.org/10.1093/milmed/161.8.442
22. United Nations. UN Disaster Assessment and Coordination (UNDAC) Field Handbook. 7th ed. ReliefWeb; 2018. https://reliefweb.int/report/world/un-disaster-assessment-and-coordination-undac-field-handbook-7th-edition-2018
23. Walsh S, Johnson O. Getting to Zero: A Doctor and a Diplomat on the Ebola Frontline. 2018.
24. UNHCR. Designing and Implementing a Feedback Mechanism to Adapt Humanitarian Programming to the Needs of Communities. UNHCR Office in Skopje; 2016.