CLINICIAN'S CAPSULE
What is known about the topic?
Despite growing interest in quality improvement and patient safety (QIPS), it is unclear how individual emergency departments (EDs) are supporting these activities.
What did this study ask?
What is the current state of the QIPS infrastructure and activities in academic emergency medicine departments across Canada?
What did this study find?
This survey of department chiefs and QIPS leads found significant educational and academic efforts, with inconsistent levels of formal support/infrastructure.
Why does this study matter to clinicians?
This study highlights potential opportunities to advance QIPS efforts in emergency medicine further.
INTRODUCTION
Quality improvement and patient safety (QIPS) activities in health care have become increasingly important in recent years, driven, in part, by a focus on preventable medical errors.1–5 Initially, the term “quality improvement” was loosely defined as “the combined efforts to make the changes that will lead to better patient outcomes, system performance, and professional development.”6 Specifically, for this paper, QIPS refers to a range of activities utilizing methodological and statistical rigour through which health care providers develop, implement, and assess small-scale interventions, identify those that work well, and implement them more broadly to improve clinical practice and patient safety.7 Encouraged by various local, national, and international health care organizations, individual hospitals and medical groups have engaged in QIPS activities across Canada.8,9 Previously published QIPS efforts within emergency medicine (EM) in Canada have targeted overcrowding, wait times, and resource utilization.10–16 The Canadian Journal of Emergency Medicine (CJEM) has also recently published a three-paper series on QI to enable EM practitioners to learn and apply relevant methods to improve care locally.17–19 However, it remains unclear how individual EM departments across the country are supporting QIPS activities and evaluating their success and sustainability.
The Canadian Association of Emergency Physicians (CAEP) recently formed a dedicated QIPS Committee tasked with providing leadership and advocacy on issues relating to quality and safety in EM across Canada. A panel for the 2018 Academic Symposium on Leadership was formed to examine the barriers, facilitators, and current national context for the pursuit of QIPS activities in EM. The results of this initiative were presented at the 2018 Symposium in Calgary and have been published concurrently with this article in CJEM.20 One of the major gaps identified early on through this initiative was the lack of understanding of how EM departments at each major academic teaching medical centre engage in QIPS activities. Specifically, knowledge gaps abounded with respect to funding, infrastructure, personnel, education, and academic support. A better understanding of the strengths and existing gaps would help guide national leadership in developing recommendations to further enhance QIPS progress and patient care in EM across Canada.
We sought to assess how Canadian medical school EM departments/divisions and major Canadian teaching hospitals approach QIPS programs and efforts, with regard to training, available infrastructure, education, scholarly activities, and perceived needs.
METHODS
We conducted an electronic survey of the EM department/division chairs at all 17 Canadian universities with a medical school, the emergency department (ED) chiefs at their affiliated academic hospitals, and locally identified EM QIPS leads.
Survey design
Through a literature review and an assessment of prior surveys of other academic EM activities in Canada,18,19 QIPS experts (EK, LBC, and SM) and a survey methodologist (JJP) generated survey questions around core themes deemed important by the CAEP QIPS committee. These themes included formal training and skill capacity, operational infrastructure, educational activities, academic and scholarship, and perceived gaps and needs.
These questions were then divided into two separate surveys targeted toward the individuals best positioned to answer them. The surveys comprised quantitative and qualitative items, as well as comment boxes, and were thoroughly reviewed and revised by the authors for clarity and flow logic. Survey 1 comprised 21 questions focusing on “formal training and skill capacity,” “operational infrastructure,” and “perceived gaps and needs,” and was meant for department/division chairs and ED chiefs (see Appendix A). Survey 2 comprised 33 questions focusing on more front-line operations, including “educational activities,” “academic and scholarship,” “QIPS activities,” and “perceived gaps and needs,” and was meant for local QIPS leads (see Appendix B).
Recruitment
We included all 17 Canadian universities with a medical school and their affiliated academic hospitals. We identified individual EM department/division academic chairs at each medical school and ED chiefs at each affiliated academic hospital through their respective organization's website contact information, personal communication, or both. An initial email inviting these individuals to participate in Survey 1 was sent in February 2018, followed by two reminder emails sent at monthly intervals. In addition, we asked each of these chairs/chiefs to identify any individual(s) within their group currently responsible for QIPS activities at their site. A separate email invitation was then sent to these local QIPS leads, inviting them to participate in Survey 2. The methods for the second survey were the same, with a reminder email sent monthly for two additional months.
Survey administration and data collection
We administered the survey electronically using SurveyMonkey 2018 (SurveyMonkey Canada Inc., Ottawa, ON). A link to the survey was included in the recruitment emails sent to the participants. Participation was voluntary, and all responses were anonymous, with no identifying information linked to respondents. Results from the survey were electronically collected into a Microsoft Excel 2017 (Microsoft Corp., Redmond, WA) downloadable database for analysis. We present descriptive statistics including proportions, means, medians, and ranges, as appropriate. Qualitative comments were collated with common themes identified by one author (EK) and reviewed for agreement by a second author (LC).
Ethical consideration
Ethical approval was granted by the Ottawa Health Science Network Research Ethics Board prior to commencing the survey.
RESULTS
EM department/division chairs and ED chiefs
Seventy department/division chairs and/or ED chiefs were invited to complete Survey 1, and 22 (31.4%) completed it. Ten (45.5%) worked in adult-only EDs, three (13.6%) worked in pediatric-only EDs, and nine (40.9%) worked in centres that see both adults and children.
Formal training and skill capacity
A majority of respondents (81.8%) reported at least one physician member on staff with formal training in QIPS, in the form of either a master's degree or a certificate-level course. The focus of this training was more often on quality improvement than on patient safety (Table 1). The majority of respondents also had a formal leadership position for QIPS within their organization (82%), with 83% of these having funding to support the position. Funding came from a variety of sources, including hospital operational budgets (22%), direct financial contributions from the EM physician group (22%), provincial and government agency grants (17%), university contributions (6%), or a combination of these entities (Figure 1).
Operational infrastructure
Fifty percent of respondents reported having dedicated office space for their teams to conduct QIPS work, and 46% reported providing administrative or support staff (defined as non-physician personnel dedicated to the operational running of QIPS programs) to their teams. Almost all centres (91%) had a dedicated quality committee, although only 65% of these were multidisciplinary; the rest comprised EM physicians only.
Local QIPS leads
From Survey 1, a total of 12 local QIPS leads were identified by their department/division chairs/chiefs and subsequently invited to participate in Survey 2. Of note, three of the authors (EK, LC, SM) also participated in the survey in their capacity as local QIPS leads.
Educational activities
Eleven of the 12 invited QIPS leads (92%) completed Survey 2. Sixty-four percent of respondents reported having a formal training program for teaching QIPS topics to EM residents, and 71% of these curricula included an option to complete an actual QI project. Three (27%) centres had a QIPS education director position within their EM department to facilitate this QIPS training for residents, with two of these positions formally funded. In contrast, only one (9%) respondent reported any QIPS topics being addressed in continuing professional development (CPD) activities for staff physicians, and none reported mandatory QIPS training as part of their annual review or reappointment process.
Academic and scholarship
Forty-five percent of respondents reported that their department produced peer-reviewed QIPS publications over the previous five years, with a median of four (range: 1–10) manuscripts per centre. Over one-half of Survey 2 respondents (55%) reported peer-reviewed QIPS abstract acceptances over the past five years, with a median of 10 (range: 1–100) abstracts per centre. Fifty-five percent of these QIPS projects had EM physicians as a project lead or co-lead. Over one-third (36%) of respondents described formal funding for faculty members to carry out QIPS scholarship. Two (18%) respondents reported that their department provides internal awards or has a points system for QIPS scholarship, and two (18%) respondents reported that their group currently held external peer-reviewed QIPS grants. Access to research-specific infrastructure and supports for conducting QIPS scholarship was variable across the respondents' centres: mentorship (55%), librarian (36%), methodologist (27%), administrative personnel (27%), and statistician (9%).
QIPS activity
Six respondents reported active ongoing QIPS projects at the time of the survey, ranging from 4 to 30 individual projects in progress per responding site. Physicians were the sole leads for these projects at one-third (2/6) of responding centres; at the other two-thirds (4/6) of centres, projects were co-led by physicians, nurses, allied health professionals, or a combination of these professionals. Only three respondents were aware of an explicit quality plan within their ED; all three reported that QIPS projects were directly linked to that plan.
Qualitative comments
In open-text comment boxes in both surveys, 17/22 (77%) ED chief respondents and 5/11 (45%) local QIPS lead respondents described a number of perceived gaps and needs in relation to their QIPS activities. From a national perspective, they identified a need for greater promotion and general awareness of the developing field of QIPS, guidance and consensus on high-priority QIPS topics that all EDs can focus on, and a platform through which best practices and learnings from QIPS work can be shared and spread. At the local level, respondents identified a need for increased academic support to carry out QIPS work, greater support for the development of QIPS skills and capacity, and standardization of common QIPS indicators across EDs.
DISCUSSION
This study is the first environmental scan of QIPS activities in academic EM at Canadian medical schools and major teaching hospitals. We found a significant level of engagement in QIPS, as evidenced by the large proportion of responding academic centres with dedicated positions for QIPS and recruitment of staff with QIPS training. There appears to be a keen interest in providing QIPS education in EM residency programs, with many centres providing longitudinal supervision for QI projects. We also found significant academic efforts related to QIPS activities, with a majority of responding centres successfully producing peer-reviewed abstracts and manuscripts. In contrast, our study revealed that less than one-half of the responding centres have a dedicated administrative structure to support QIPS leads within their groups. Educational efforts are mostly focused on residents, with minimal investment in strengthening QIPS capacity for existing staff physicians. Despite the reported academic output, few respondents identified any formal funding or reward system to ensure ongoing academic success. Overall, our environmental scan found that the infrastructure in place to sustain QIPS activities in academic EM across Canada is disproportionately limited relative to the level of activity, presenting an opportunity for EDs to invest in closing this gap and further advancing QIPS in their departments.
This study is important in summarizing the current strengths and weaknesses of national and local QIPS activities in academic EM and in providing an initial reference point for future work. Similar to environmental scans of other academic endeavours, this report provides useful baseline information for comparison and highlights areas for improvement.21,22 On a national level, our results are guiding the CAEP QIPS committee's key strategic activities to develop recommendations and coordinate large-scale initiatives to support the growth of QIPS capacity across Canada. For example, our results revealed that a few centres were significantly more successful at producing a high volume of scholarly QIPS output; there is an opportunity to share these successful strategies amongst all EM departments, perhaps leading to larger multicentre QIPS initiatives. Locally, individual centres can use these findings as a comparison to identify gaps in their QIPS efforts and set priorities for improvement, for example, by developing and funding a formal QIPS position within their group, as is the case in the majority of responding centres. The results of this study highlight a significant imbalance between formal support for and successful output of QIPS work in EM. This may form the basis for individual academic centres to advocate for increased funding to sustain this important work aimed at improving care for patients locally.
There were some limitations to this study. The generalizability of the findings is limited by the 31.4% response rate from chairs and chiefs. We intentionally expanded the initial invitation email list to include affiliated teaching sites, in addition to academic medical centres, to increase our reach, but this may have decreased the response rate. However, because smaller EDs might not have the same resources or support for QIPS activities, we restricted our recruitment efforts to university-affiliated EDs. Our ethics board review did not allow us to link and identify individual responses; thus, we were unable to provide further demographic details about survey respondents. Another limitation was the challenge of identifying local QIPS leads within each organization (and whether one even existed), given the lack of an available list of these positions. We achieved an excellent response rate of 11 of the 12 identified local QIPS leads, but it remains unclear how many leads were simply not identified to our team by their respective chairs and chiefs.
This study represents a limited snapshot of QIPS activities in Canadian EM. Future research in this area will need to examine how such activities are supported, sustained, and broadened over time. One of the core tenets of the science of quality improvement is to measure the current state and continuously aim for improvements. A future comprehensive review of this field in Canadian EM should examine in more detail how QIPS work is evaluated and what impacts on clinical outcomes have been achieved. In addition, we will need to review whether national collaborations and the spread of QIPS gains beyond local centres are ultimately successful.
CONCLUSION
This study is the first review of QIPS activities in EM across Canada. We found that multiple EDs show interest in and recognize the importance of QIPS, demonstrating significant local educational and academic efforts. However, there appears to be a discrepancy between these activities and the level of formal support and infrastructure behind them. These findings have informed the CAEP 2018 Academic Symposium QIPS panel recommendations. They will also continue to guide impactful national initiatives and provide local EDs with a starting point to advocate for and advance their important QIPS efforts aimed at improving patient care.
SUPPLEMENTARY MATERIAL
The supplementary material for this article can be found at https://doi.org/10.1017/cem.2019.16
Acknowledgements
The authors would like to thank the CAEP office staff Shanna Scarrow, Kelly Wyatt, and Gisele Leger for their assistance with survey logistics, as well as Dr. Eddy Lang for enabling the formation of the QIPS panel for the 2018 Academic Symposium in Calgary.
Competing interests
None declared.