Reducing unnecessary testing is the most effective approach to reducing the carbon footprint of pathology testing.[1] The 2023 SHEA Diagnostic Stewardship Task Force mentions restriction of repeat testing for two tests,[2] and we have previously shown that our customized electronic duplicate order alerts for 83 microbiology tests substantially reduced financial cost and carbon footprint through a 47% reduction in test ordering.[3]
Concern around the concept of "alert fatigue" may hamper the successful use of electronic alerts to support microbiology test stewardship. While well-designed alerts can improve patient care and resource utilization,[3,4] excessive alerting may be counterproductive, leading to clinician burden.[5] There are many knowledge gaps in clinical decision support (CDS) research; the use of expletives by requesters when overriding alerts is one example suggesting that alerts are not always welcomed by providers.[6] A 2022 bibliometric review concluded that future research should focus on alert optimization to reduce alert fatigue.[7]
Our microbiology alerts were implemented on October 15, 2020. The alerts were test specific, with repeat-test time frames ranging from 3 days to 2 years, and the alert message suggested time frames and clinical indications for appropriate retesting. Responses to the alerts (with the option to cancel or proceed with the test request) were retrospectively audited over a 24-month period beginning one year after implementation (October 15, 2021–October 15, 2023). Alert fatigue was defined as a decrease in the test cancellation rate in response to the alert over time.
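As a minimal illustration of the intervention logic (an assumed sketch only; the production alerts are configured within our electronic ordering system and no code is published here), a duplicate order placed within a test-specific repeat window triggers a soft-stop message that lets the requester cancel or proceed. The test names and windows below are hypothetical examples.

```python
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical repeat-test windows. The study used 83 microbiology tests with
# windows ranging from 3 days to 2 years; these entries are examples only.
REPEAT_WINDOWS = {
    "hepatitis_c_serology": timedelta(days=730),
    "urine_culture": timedelta(days=3),
}

def duplicate_order_alert(test: str,
                          last_ordered: Optional[datetime],
                          now: datetime) -> Optional[str]:
    """Return a soft-stop alert message if the order falls inside the repeat window."""
    window = REPEAT_WINDOWS.get(test)
    if window is None or last_ordered is None:
        return None  # no rule for this test, or no prior order on record
    if now - last_ordered < window:
        elapsed_days = (now - last_ordered).days
        return (f"{test} was last ordered {elapsed_days} days ago; repeat testing is "
                f"usually not indicated within {window.days} days. Cancel or proceed?")
    return None
```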
The alert was deployed for a total of 42,298 tests. The proportion of tests canceled by the requester remained between 43% and 46%, with no significant decline from Q1 to Q4 in either 2022 or 2023. There was a small decrease in test cancellation over time (OR 0.97 per year, 95% CI 0.93–1.00), but this difference was not statistically significant (P = 0.062) (Supplementary Table 1).
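The per-year odds ratio reported above can, in principle, be estimated with a logistic regression of cancellation on time since implementation. The sketch below is an assumed reconstruction using simulated data and the Python statsmodels package, not the code used for the published analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 42298  # total number of alerts deployed

# Simulated audit extract: time since implementation (years, audited 1-3 years
# post-implementation) and whether the requester canceled in response to the alert.
years = rng.uniform(1.0, 3.0, size=n)
p_cancel = 1 / (1 + np.exp(-(np.log(0.45 / 0.55) + np.log(0.97) * years)))
df = pd.DataFrame({"cancelled": rng.binomial(1, p_cancel), "years": years})

# Logistic regression of cancellation on time; the exponentiated slope is the OR per year.
model = smf.logit("cancelled ~ years", data=df).fit(disp=False)
or_per_year = np.exp(model.params["years"])
ci_low, ci_high = np.exp(model.conf_int().loc["years"])
print(f"OR per year: {or_per_year:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f}), "
      f"P = {model.pvalues['years']:.3f}")
```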
The test cancellation rate differed by profession: 43% for doctors versus 57% for nurses. The cancellation rate was similar for doctors regardless of seniority (43% for consultants and 43% for medical officers and medical students). In a multivariable regression, nurses were more likely to cancel tests than junior medical staff (OR 1.60, 95% CI 1.48–1.73), and serology tests were less likely to be canceled than culture tests (OR 0.60, 95% CI 0.56–0.64) (Supplementary Tables 2 and 3).
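The multivariable analysis can be sketched in the same framework as a logistic model with profession and test type as covariates. The data below are simulated to roughly mirror the reported odds ratios; the column names and reference categories are illustrative assumptions rather than the study dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 20000
profession = rng.choice(["junior_medical", "consultant", "nurse"], size=n)
test_type = rng.choice(["culture", "serology"], size=n)

# Simulate cancellation with effects close to the reported ORs
# (nurse vs junior medical staff 1.60; serology vs culture 0.60).
logit_p = (np.log(0.43 / 0.57)
           + np.log(1.60) * (profession == "nurse")
           + np.log(0.60) * (test_type == "serology"))
df = pd.DataFrame({
    "profession": profession,
    "test_type": test_type,
    "cancelled": rng.binomial(1, 1 / (1 + np.exp(-logit_p))),
})

# Multivariable logistic regression with junior medical staff and culture tests
# as the reference categories.
model = smf.logit(
    "cancelled ~ C(profession, Treatment('junior_medical'))"
    " + C(test_type, Treatment('culture'))",
    data=df,
).fit(disp=False)
print(np.exp(model.params))      # odds ratios versus the reference categories
print(np.exp(model.conf_int()))  # 95% confidence intervals
```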
The test cancellation rate declined steadily between 7 am and 4 pm and was lowest between midday and 4 pm (41%), compared with 50% between midnight and 4 am (Figure 1). To our knowledge, no previous studies have examined the response to alerts by time of day. In medicine, several studies have found temporal variations in clinical actions that are potentially explained by decision fatigue. One study found that clinicians ordered more tests in the first hour of the day than in the last. Rates of cancer screening tests, influenza vaccination, appropriate antibiotic and opiate prescribing, the probability of orthopedic surgeons deciding to operate, and hand hygiene compliance have all been shown to decline over the course of the day, and this decline was magnified by increased work intensity. An active-choice intervention in the electronic record prompting staff to ask patients about vaccination was associated with a significant increase in vaccination rates that was similar in magnitude throughout the day.[8]
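The time-of-day pattern in Figure 1 amounts to grouping alert responses by 4-hour block of the order time. A minimal sketch, with hypothetical field names and toy data, is shown below.

```python
import pandas as pd

# Toy audit extract; in practice each row would be one alert response.
audit = pd.DataFrame({
    "order_time": pd.to_datetime(["2022-01-05 02:10", "2022-01-05 13:40",
                                  "2022-01-06 08:15", "2022-01-06 15:05"]),
    "cancelled": [1, 0, 1, 0],
})

# Assign each order to a 4-hour block (0, 4, 8, 12, 16, 20) and compute the
# proportion of alerts that resulted in cancellation within each block.
audit["time_block"] = (audit["order_time"].dt.hour // 4) * 4
print(audit.groupby("time_block")["cancelled"].mean())
```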
We found no evidence of alert fatigue in response to our "soft-stop" alerts during a 24-month period, in contrast to the Mayo Clinic findings, where override rates increased the longer the intervention was in place; in that study, a duplicate order was allowed only if the requester provided the indication for the repeat in a free-text field.[9] A mandatory requirement for information may be a disincentive to optimal requesting, as ordering practice may change with the intention of avoiding the alert. A study of a "soft" CDS intervention on repeat troponin orders demonstrated no impact on troponin ordering.[10] Another study compared a "hard-stop" CDS rule (a phone call was necessary to override) with a "soft-stop" rule for duplicate laboratory orders placed on the same day and found that the soft stop reduced duplicate testing by only 42.6%, compared with 92.3% for the hard stop.[4] Consideration needs to be given to whether the requirement for a phone call may also be a disincentive to optimal test ordering. Further studies are required to better understand how to design alerts to optimize their impact by profession and test type. Knowledge of time-of-day effects on test ordering may influence alert design and may be used as a marker of workload pressure points.
When studying the response to alerts over time, it is important to take into consideration the impact of learning. One study found a decrease in pop-up alert volumes during the first 3 months of the intervention, suggesting "that providers changed test ordering patterns to avoid interacting with the alert."[9] Another proposed that "alerts may act through a combination of just-in-time advice and longer term education," since as clinicians repeatedly viewed the alert there was a "dose-dependent" decrease in the fraction of searches without orders.[5] A limitation of our study is that we were not able to determine the effect of education on the response to alerts.
In conclusion, we found no evidence of significant alert fatigue in response to our electronic duplicate order alerts over a 24-month period beginning one year after implementation. The provision of information and the customization of advice may improve acceptance of alerts and help reduce alert fatigue by assisting the time-poor requester to make optimized decisions. Further studies are required to determine the factors that influence the response to electronic alerts, in order to identify design and implementation strategies that sustain alert impact and, ultimately, support environmental sustainability.
Supplementary material
The supplementary material for this article can be found at https://doi.org/10.1017/ice.2024.183.
Financial support
None.
Competing interests
None.
Ethical standard
Ethics approval for this study was obtained from Monash Health (reference number RES-23-0000-836Q).