
SGEM Hot Off the Press: Computer provider order entry (CPOE) and emergency department flow

Published online by Cambridge University Press:  30 March 2017

Katie Lin*, Department of Emergency Medicine, University of Calgary, Calgary, AB
Kenneth Chan, Department of Emergency Medicine, University of Calgary, Calgary, AB
Rohit Mohindra, Department of Emergency Medicine, McGill University, Montreal, QC
Ken Milne, Division of Emergency Medicine, University of Western Ontario, London, ON
Brent Thoma, Emergency Medicine, University of Saskatchewan, Saskatoon, SK
Chris Bond, Department of Emergency Medicine, University of Calgary, Calgary, AB

*Correspondence to: Drs. Katie Lin and Kenneth Chan, Department of Emergency Medicine, Room C231, Foothills Medical Centre, 1403 29 St. NW, Calgary, AB T2N 2T9; Email: kylin@ucalgary.ca and kchan.med@gmail.com

Type: Education: Social Media Review

Copyright © Canadian Association of Emergency Physicians 2017

As part of the Canadian Journal of Emergency Medicine’s (CJEM) developing social media strategy,[1] we are collaborating with the Skeptics’ Guide to Emergency Medicine (SGEM) to summarize and critically appraise the current emergency medicine literature using evidence-based medicine principles. In the Hot Off the Press (HOP) series, we select original research manuscripts published in CJEM to be summarized and critically appraised on the SGEM website/podcast[2] and discussed by the study authors and the online EM community. A similar collaboration is underway between the SGEM and Academic Emergency Medicine. What follows is a summary of the selected article and the immediate post-publication critical appraisal from the SGEM podcast, as well as an overview of the subsequent discussion from the SGEM blog and other social media. Through this series, we hope to enhance the value, accessibility, and application of important, clinically relevant EM research. In this, the fourth SGEM HOP hosted collaboratively with CJEM, we discuss Gray et al.’s paper[3] evaluating the impact of computerized provider order entry on patient flow through a quaternary emergency department in London, Ontario.

BACKGROUND

Computerized provider order entry (CPOE) has emerged in North America as a means to standardize and improve the delivery of health care services to patients.[4] CPOE involves the use of electronic systems to track health care provider orders, but can also include more advanced functionality such as clinical decision support.[5]

The debate surrounding CPOE has focused on the clinical benefits versus costs of implementation. Studied clinical benefits of CPOE include reduced prescribing errors and adverse medication interactions,[6] improved adherence to evidence-based protocols for specific presentations, such as renal colic and acute ischemic stroke,[7-9] increased legibility and accessibility of documentation on record,[10] and potential secondary uses of data by health care organizations for outcome tracking and quality assessment.[10] CPOE’s benefits, however, may come at the cost of decreased patient and physician satisfaction,[11,12] impaired emergency physician productivity,[13] and increased length of stay (LOS) in the emergency department (ED) for admitted patients.[14] There is no strong evidence to suggest that CPOE improves patient mortality,[15] and the literature has not yet provided consensus evidence in favour of CPOE when weighing its benefits and drawbacks.[16]

ARTICLE SUMMARY

Gray and colleagues conducted a retrospective cohort study to evaluate the impact of CPOE implementation on ED workflow at a quaternary care centre in London, Ontario, comprising two separate ED campuses. They adopted a pre-post implementation design to evaluate the effect of CPOE on three primary ED flow metrics: LOS, wait time (WT), and the proportion of patients who left without being seen (LWBS).

The study included all ED patients 18 years and older who were triaged during July and August 2013 (pre-implementation of CPOE) and July and August 2014 (post-implementation of CPOE). Data were extracted from the London Health Sciences Centre (LHSC) electronic database and health records. Patients with incomplete or incorrect ED charts, negative WTs or LOS, extreme outliers with WT > 24 hours, or missing vital statistics were excluded. The investigators also completed subgroup analyses by Canadian Triage and Acuity Scale (CTAS) level and by admission status to evaluate whether CPOE had differential effects based on acuity of illness or disposition.

The authors analysed a combined data set of 36,758 ED visits: 18,872 visits in 2013 and 17,886 visits in 2014. Median age, gender distribution, CTAS stratification, and rate of admission were similar between the two groups at baseline. Statistical analyses were conducted to determine significant changes in WT, LOS, and LWBS between pre- and post-implementation of CPOE.
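
The published summary does not specify which statistical tests the authors used. As a rough, hypothetical sketch of the kind of pre-post comparison described above, the following Python example applies the stated exclusion rules and then compares flow metrics between periods. The column names and the choice of Mann-Whitney U (for skewed WT/LOS distributions) and chi-square (for the LWBS proportion) tests are assumptions for illustration, not the authors’ actual analysis.

# Illustrative sketch only: exclusion rules mirror the stated criteria;
# the specific tests and column names are assumptions, not the study's methods.
import pandas as pd
from scipy.stats import mannwhitneyu, chi2_contingency

def clean_visits(df: pd.DataFrame) -> pd.DataFrame:
    # Drop records with missing vital statistics (hypothetical column names)
    df = df.dropna(subset=["age", "sex", "ctas", "wait_min", "los_min"])
    df = df[(df["wait_min"] >= 0) & (df["los_min"] >= 0)]  # no negative WT or LOS
    df = df[df["wait_min"] <= 24 * 60]                     # exclude WT > 24 hours
    return df

def compare_periods(pre: pd.DataFrame, post: pd.DataFrame) -> dict:
    results = {}
    for metric in ("wait_min", "los_min"):
        _, p = mannwhitneyu(pre[metric], post[metric], alternative="two-sided")
        results[metric] = {"pre_median": pre[metric].median(),
                           "post_median": post[metric].median(),
                           "p_value": p}
    # LWBS is a proportion; "lwbs" is assumed to be a boolean column
    table = [[int(pre["lwbs"].sum()), int((~pre["lwbs"]).sum())],
             [int(post["lwbs"].sum()), int((~post["lwbs"]).sum())]]
    _, p, _, _ = chi2_contingency(table)
    results["lwbs"] = {"pre_pct": 100 * pre["lwbs"].mean(),
                       "post_pct": 100 * post["lwbs"].mean(),
                       "p_value": p}
    return results

With data frames for the 2013 and 2014 periods, compare_periods(clean_visits(pre_df), clean_visits(post_df)) would return median shifts and p-values analogous to those reported in Table 1.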

KEY RESULTS

The authors concluded that CPOE implementation detrimentally impacted patient flow in the two EDs that they studied. They found statistically significant increases in median WT (from 78 to 83 minutes), median LOS (from 254 to 264 minutes), the proportion of patients who LWBS (from 7.2% to 8.1%), and LOS for admitted patients (from 713 to 776 minutes) (Table 1).

Table 1 Key results from Gray et al.’s CPOE study[3]

QUALITY ASSESSMENT

The study by Gray et al.[3] used a retrospective cohort design based on administrative data to compare ED patient flow before and after implementation of CPOE. The authors had clearly defined outcomes and collected data with objective variables. The inclusion and exclusion criteria were appropriate and acceptable for cohort recruitment. Nevertheless, this study had several limitations.

The CPOE in this study was introduced to the two EDs in April 2014, only 3 months before the post-implementation sampling period. The results could therefore be confounded by user inexperience and inadequate training or uptake during the early stages of implementation. Given previously described learning curves with CPOE implementation,[17] it would be useful to collect data further from the implementation date to determine whether user familiarity with CPOE reduces its impact on workflow over time.

The retrospective cohort design introduces potential sampling errors that limit the generalizability of the authors’ findings.[18] Gray et al.[3] drew data from only July and August during the pre- and post-implementation phases of CPOE in one Canadian city. It is difficult to say whether this sampling is representative, for several reasons.

First, the authors did not collect or report potentially confounding differences between the pre- and post-implementation time periods, such as mean boarding times, hospital over-capacity statistics, population and ED utilization rates, and bed allocation resources.

Second, the choice of July and August for data extraction may have impacted outcomes, given that July 1 marks the beginning of residency for new trainees in Canada, and junior residents in the ED environment would be adjusting to CPOE in addition to new clinical responsibilities. The existence of the “July effect” (whereby medical trainees beginning new clinical roles affect medical error rates, patient outcomes, and departmental productivity) remains uncertain in the literature.[19]

Third, in the post-implementation group, a substantial proportion of patient data was excluded, mostly due to clerical and administrative data error. It is impossible to determine what impact this may have had on the results.

Finally, CPOE software programs can be highly variable in their design, implementation strategy, and user interfaces. Some CPOE programs may include components of clinical decision support and order set bundles to prompt evidence-based reminders during clinical encounters, whereas others may simply be a collection of individual medications and orders requiring manual entry. Without further detail on the specifics of the CPOE design and implementation in the authors’ study, it is difficult to draw parallels between this study’s results and CPOE use in other centres.

TAKE-TO-WORK POINTS

CPOE implementation may produce statistically significant impairment of ED flow metrics, although the clinical significance is uncertain given the small differences in flow metrics (5 and 10 minutes’ difference in median WT and LOS, respectively) and the large interquartile ranges measured. The initial stages of implementing CPOE may lead to increased ED WT and LOS and to more patients leaving without being seen by an emergency physician. The generalizability of Gray et al.’s study[3] is uncertain because the impact of CPOE on ED workflow may be highly dependent on both software characteristics and implementation strategy. Further studies examining the long-term outcomes of CPOE implementation are still needed.

METHODOLOGY OF THE SOCIAL MEDIA RESPONSE ANALYSIS

The social media discussion started with the launch of the blog post and podcast on July 5, 2016, and continued for 2 weeks until July 19, 2016. An invitation to comment on the article was included in the audio of the podcast, in the text of the blog post, and in social media communications (Twitter and Facebook). Social media responses written in the SGEM blog’s comment section, on the SGEM Facebook page, and on Twitter (directed at @thesgem, @socmobem, or using the #SGEMHOP hashtag) between July 5 and July 19, 2016, were reviewed by the authorship team. KL compiled and reviewed all of the aforementioned social media commentary to identify tweets and posts related to the CPOE SGEM podcast and blog post. A thematic analysis was conducted using the qualitative framework approach outlined below.

Framework approach for thematic analysis:

  • Provisional classification: content from each of the various analysed social media platforms was classified as either promotional (i.e., containing only a link to the blog post with no further content) or commentary-based.

  • Thematic framework development: each commentary-type item was evaluated individually to identify key issues, concepts, and themes raised.

  • Indexing: commonly identified themes across all of the commentary-type items were compiled and coded with short phrases for ease of comparison and tracking.

  • Charting: the thematic framework was organized into a comparison chart presented in Table 2.

  • Mapping and interpretation: once common thematic groupings were identified and a comparison chart was created, all authors participated in a consensus-based analysis to determine which comments were most representative of the general themes of the discussion (a minimal illustrative coding sketch follows this list).
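
To make the first two steps more concrete, the short Python sketch below mimics provisional classification (promotional versus commentary-based) and indexing (tallying coded themes). The items, theme codes, and keyword rules are hypothetical illustrations; the actual analysis was performed manually by the authorship team.

# Hypothetical sketch of the "provisional classification" and "indexing" steps.
# Theme codes, keyword rules, and example items are made up for illustration.
from collections import Counter

THEME_KEYWORDS = {
    "generalizability": ["confound", "generaliz", "boarding"],
    "software variability": ["software", "order set", "interface"],
    "implementation strategy": ["training", "buy-in", "engagement"],
}

def classify(item: str) -> str:
    # Provisional classification: link-only posts are treated as promotional
    is_link_only = item.strip().startswith("http") and len(item.split()) <= 3
    return "promotional" if is_link_only else "commentary"

def index_themes(commentary_items):
    # Indexing: count how many commentary items touch on each coded theme
    counts = Counter()
    for item in commentary_items:
        lowered = item.lower()
        for theme, keywords in THEME_KEYWORDS.items():
            if any(keyword in lowered for keyword in keywords):
                counts[theme] += 1
    return counts

# Made-up example items
items = [
    "http://thesgem.com/2016/07/sgem159-computer-games-computer-provider-order-entry-cpoe #SGEMHOP",
    "Our training was not adequate and nobody asked for physician buy-in.",
    "Hard to generalize without adjusting for boarding times.",
]
commentary = [item for item in items if classify(item) == "commentary"]
print(index_themes(commentary))  # Counter({'implementation strategy': 1, 'generalizability': 1})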

Table 2 Framework approach for thematic analysis – comparison chart

Multiple metrics of dissemination were further tracked by the SGEM HOP team for analysis:

  • Blog post page views were monitored using the Jetpack plugin by WordPress.com (available from https://wordpress.org/plugins/jetpack/).

  • Facebook “reach” analytics were provided by Facebook and represented the number of users who saw the original SGEM Facebook post on their own newsfeeds.[20]

  • Twitter impressions (the number of users whose newsfeeds contained a tweet featuring the #SGEMHOP hashtag) were tracked using Symplur, a software program that monitors health care-related Twitter conversations.[21] Tweets not containing the hashtag were not tracked by Symplur. The number of impressions was calculated by taking the number of #SGEMHOP tweets posted by each Twitter user, multiplying it by the number of followers that user had, and summing across users (a worked sketch of this calculation follows Figure 1).

  • The Altmetric score is a proprietary, standardized metric that tracks the disseminative impact of research articles in social media forums (e.g., Facebook, Twitter) and on blogs, podcasts, and news outlets.[22] The Altmetric data for the featured article by Gray et al.[3] (Figure 1) were compared to those of all other articles published in CJEM, of all published research analysed by Altmetric, and of the articles covered in the first three CJEM-SGEM HOPs.[23-25]

Figure 1 Screen capture of the altmetrics data retrieved August 3, 2016, from https://cambridgejournals.altmetric.com/details/9299925.
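
The impressions calculation described above amounts to summing, over all participating users, the number of #SGEMHOP tweets posted by each user multiplied by that user’s follower count. A minimal Python sketch with made-up handles and counts (not the actual campaign data):

# impressions = sum over users of (#SGEMHOP tweets by user) x (that user's followers)
# Handles and counts below are invented for illustration only.
users = [
    {"handle": "@example_user_a", "tweets": 5, "followers": 12000},
    {"handle": "@example_user_b", "tweets": 2, "followers": 800},
    {"handle": "@example_user_c", "tweets": 1, "followers": 4500},
]

impressions = sum(user["tweets"] * user["followers"] for user in users)
print(impressions)  # 66100 for these made-up numbers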

RESULTS OF THE SOCIAL MEDIA RESPONSE

Table 3 provides details on the social media reach of Gray et al.’s article[3] during the SGEM HOP campaign. During the 2-week period following the podcast release, #SGEMHOP was used in 84 tweets by 43 individual users, representing 216,926 Twitter impressions. Thirty-four of these tweets were from the study’s authors or CJEM personnel prompting social media responses. Online conversation through Twitter and the SGEM blog remained active for 14 days following podcast release. The Altmetric score for Gray et al.’s article[3] was 41, placing it 10th highest amongst all previous CJEM publications and within the top 5% of all published research. Prior articles featured in the SGEM HOP series scored in the 46–71 range.[23-25] The mean Altmetric score for CJEM publications is 5.9.[26]

Table 3 Aggregate analytic data of social media platform discussions following the SGEM blog posting

ONLINE DISCUSSION SUMMARY

The majority of the discourse took place directly on the SGEM blog, with the URL disseminated through Twitter, Facebook, and Google+. The SGEM blog hosted a total of 17 comments from 9 discrete users, including direct feedback from the study’s primary author, Dr. Andrew Gray.[2] Clinicians across Canada engaged in conversation to share their experiences with CPOE implementation at their institutions. Consistently identified themes included the limited generalizability of the study’s results due to confounding factors, the variability of CPOE software design across centres, and the importance of a collaborative implementation strategy when introducing CPOE to a department.

There were multiple confounding variables that could have influenced the study’s results. Dr. Eddy Lang, Academic Department Head for Emergency Medicine at the University of Calgary, cautioned that the study’s results needed to be considered within a systemic context of ED overcrowding: “CPOE may have an association with increased [length of stay] for admitted patients but without adjustment for boarding and [emergency inpatients] there is no way to know if any relationship exists.” This was consistent with the discussion held on the SGEM podcast, where the authors’ lack of reporting of important potential confounders was addressed. Variables such as boarding time and the proportion of emergency inpatients were not tracked between the pre- and post-implementation time periods.

Physicians across the country shared their experiences with CPOE implementation and highlighted the variability of CPOE software across different centres. Dr. Wurster felt that implementation of CPOE at his centre was “nothing short of a disaster. […] Our training was not adequate and we were given little input in how we would like the computer system to work for us. No one examined how using CPOE would affect our daily processes in the emergency department.” This was contrasted with Dr. Lalani’s experiences with CPOE in Calgary, where physician input is a critical component of CPOE development and maintenance: “I really value my colleagues […] who have worked so hard to generate easily workable order sets that include things like ‘Well’s Score’ for VTE. I like that if I type ‘ED meningitis’ I get all the orders for this presentation (including CT and all four LP tubes pre-labelled), and I like that if I go to any of the four hospitals [in Calgary] that I work at, it’s the exact same process.”

Finally, the importance of a collaborative implementation strategy was summarized by Dr. Shawn Dowling, an emergency clinician-researcher with a funded role for maintenance and optimization of CPOE order sets in Calgary: “[CPOE’s] success and failure is dependent on the components (order sets), its fit (usability of the software), the engine (physician buy in/engagement) and its mechanics (the team responsible for maintaining and optimizing the CPOE environment). CPOE is most powerful when it’s part of an integrated knowledge translation strategy.”

LIMITATIONS OF SOCIAL MEDIA ANALYSIS

There are a number of inherent limitations to any study evaluating social media engagement. The use of social media data involves sampling bias, and we recognize that the online discussion summarized in this paper likely represents the subset of emergency physicians who are heavily engaged in free open access medical education (FOAMed) activities. As such, the opinions of the wider audience may not have been captured through the social media platforms analysed.

Furthermore, commonly used social media analytic scores, such as the number of Twitter impressions or the Altmetric score, are useful for quantifying article dissemination but do not necessarily provide information on the quality of the discourse generated.[27-29] While the blog post hosted informative discussion around Gray et al.’s publication,[3] virtually all of the tweets containing the #SGEMHOP hashtag and the various Facebook posts simply directed readers to the blog itself and did not directly add to the discussion.

We have summarized the social media dialogue in this paper; however, it remains important for readers to independently evaluate the primary literature and continue to critically appraise the social media feedback.

CONCLUSION

Gray et al.’s paper[3] suggested that CPOE implementation impaired ED flow to both a statistically significant and clinically variable degree. The SGEM’s blog post summarized the podcast discussion for readers, highlighted key limitations of generalizing the original study’s results to CPOE at other centres, and allowed physician leaders from across the country to share their experiences with the research topic. Online distribution of Gray et al.’s article[3] resulted in disseminative impact scores reaching the top 5% of all published research. Collaborative knowledge translation and online engagement have the potential to increase awareness of primary literature, harness the perspectives of academic experts to enrich critical appraisal, and bridge the gap between literature findings and pragmatic applications for clinical practice.

Keywords: medical order entry systems, emergency department, efficiency

Competing interests: None declared.

References

1. Thoma B, Mohindra R, Artz JD, et al. CJEM and the changing landscape of medical education and knowledge translation. CJEM 2015;17(2):184-187.
2. Milne K. SGEM #159: Computer games – computer provider order entry (CPOE); 2016. Available at: http://thesgem.com/2016/07/sgem159-computer-games-computer-provider-order-entry-cpoe (accessed 27 July 2016).
3. Gray A, Fernandes C, Van Aarsen K, et al. The impact of computerized provider order entry on emergency department flow. CJEM 2016;18(4):264-269.
4. Kuperman GJ, Gibson RF. Computer physician order entry: benefits, costs, and issues. Ann Intern Med 2003;139:31-39.
5. Miller R, Waitman L, Chen S, et al. The anatomy of decision support during inpatient care provider order entry (CPOE): empirical observations from a decade of CPOE experience at Vanderbilt. J Biomed Inform 2005;38(6):469-485.
6. Nuckols TK, Smith-Spangler C, Morton SC, et al. The effectiveness of computerized order entry at reducing preventable adverse drug events and medication errors in hospital settings: a systematic review and meta-analysis. Syst Rev 2014;3:56.
7. Blankerfield JF, Rogers L, White J, et al. Prospective evaluation of the treatment of pain in the ED using computerized physician order entry. Am J Emerg Med 2012;30(8):1613-1616.
8. Netherton SJ, Lonergan K, Wang D, et al. Computerized physician order entry and decision support improves ED analgesic ordering for renal colic. Am J Emerg Med 2014;32(9):958-961.
9. Yang JM, Park YS, Chung SP, et al. Implementation of a clinical pathway based on a computerized physician order entry system for ischemic stroke attenuates off-hour and weekend effects in the ED. Am J Emerg Med 2014;32(8):884-889.
10. Cresswell K, Bates D, Williams R, et al. Evaluation of medium-term consequences of implementing commercial computerized physician order entry and clinical decision support prescribing systems in two “early adopter” hospitals. J Am Med Inform Assoc 2014;21(e2):e194-e202.
11. Bastani A, Walch R, Todd B, et al. Computerized prescriber order entry decreases patient satisfaction and emergency physician productivity. Am J Emerg Med 2011;29(2):207-211.
12. Shanafelt TD, Dyrbye LN, Sinsky C, et al. Relationship between clerical burden and characteristics of the electronic environment with physician burnout and professional satisfaction. Mayo Clin Proc 2016;91(7):836-848.
13. Schiff GD, Amato MG, Egulae T, et al. Computerised physician order entry-related medication errors: analysis of reported errors and vulnerability testing of current systems. BMJ Qual Saf 2015;24(4):264-271.
14. Spalding SC, Mayer PH, Ginde AA, et al. Impact of computerized physician order entry on ED patient length of stay. Am J Emerg Med 2011;29(2):207-211.
15. Brunette DD, Tersteeg J, Brown N, et al. Implementation of computerized physician order entry for critical patients in an academic emergency department is not associated with a change in mortality rate. West J Emerg Med 2013;14(2):114-120.
16. Georgiou A, Prgomet M, Paoloni R, et al. The effect of computerized provider order entry systems on clinical care and work processes in emergency departments: a systematic review of the quantitative literature. Ann Emerg Med 2013;61(6):644-653.
17. Abramson E, Patel V, Malhotra S, et al. Physician experiences transitioning between an older versus newer electronic health record for electronic prescribing. Int J Med Inform 2012;81(8):539-548.
18. Kaji AH, Schriger D, Green S. Looking through the retrospectoscope: reducing bias in emergency medicine chart review studies. Ann Emerg Med 2014;64:292-298.
19. Young JQ, Ranji SR, Wachter RM, et al. “July effect”: impact of the academic year-end changeover on patient outcomes: a systematic review. Ann Intern Med 2011;155(5):309-315.
20. Facebook. “How is reach defined for each of my Page posts?”; 2016. Available at: https://www.facebook.com/help/241332825914969 (accessed 27 July 2016).
21. Symplur, LLC. Symplur – connecting the dots in healthcare social media; 2016. Available at: www.symplur.com (accessed 27 July 2016).
22. Cambridge Journals. Altmetric scores; 2016. Available at: https://cambridgejournals.altmetric.com/details/9299925#score (accessed 26 July 2016).
23. Luckett-Gatopoulos S, Thoma B, Milne K, et al. SGEM Hot Off the Press: regional nerve blocks for hip and femoral neck fractures: a systematic review. CJEM 2016;18(4):296-300.
24. Purdy E, Thoma B, Milne K, et al. SGEM Hot Off the Press: hypertonic saline in severe traumatic brain injury: a systematic review and meta-analysis of randomized controlled trials. CJEM 2016;18(5):379-384.
25. McKenna P, Thoma B, Milne K, et al. SGEM Hot Off the Press: ultrasound during critical care simulation: a randomized crossover study. CJEM 2017;19(1):50-54.
26. Cambridge Journals. Altmetric summary; 2016. Available at: https://cambridgejournals.altmetric.com/details/9299925/twitter (accessed 26 July 2016).
27. Eysenbach G. Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact. J Med Internet Res 2011;13(4):e123, doi:10.2196/jmir.2012.
28. Ringelhan S, Wollersheim J, Welpe I. I like, I cite? Do Facebook likes predict the impact of scientific work? PLoS One 2015;10(8):e0134389, doi:10.1371/journal.pone.0134389.
29. Fox CS, Gurary EB, Ryan J, et al. Randomized controlled trial of social media: effect of increased intensity of intervention. J Am Heart Assoc 2016;5(5):pii:e004088, doi:10.1161/JAHA.115.003088.