The central question this study sought to answer was whether members of Strategic Crisis Teams (SCTs) participating in mass-casualty incident (MCI) exercises in the Netherlands learn from their participation.
Methods
Evaluation reports of exercises conducted at two different points in time were collected and analyzed against a multi-dimensional theoretical model, assessing both the quality of the evaluation methodology (three criteria: whether objectives were described, the link between objectives and items for improvement, and the data-collection method) and the learning effect of the exercise (one criterion: the change in the number of items for improvement).
Results
Of the 32 evaluation reports, 81% described exercise objectives, 30% of the items for improvement were linked to these objectives, and 22% used a structured template to describe the items for improvement. Between the first (T1) and the last (T2) evaluation report submitted by hospitals, the number of items for improvement increased in six evaluation categories, remained equal in two, and decreased in six.
Conclusion
The evaluation reports do not support the ideal-typical disaster exercise process. The authors could not establish that team members participating in MCI exercises in the Netherlands learn from their participation. More time and effort must be spent on the development of a validated evaluation system for these simulations, and more research into the role of the evaluator is needed.
Verheul MLMI, Dückers MLA, Visser BB, Beerens RJJ, Bierens JJLM. Disaster exercises to prepare hospitals for mass-casualty incidents: does it contribute to preparedness or is it ritualism? Prehosp Disaster Med. 2018;33(4):387–393.
Recent events have brought disaster medicine into the public focus. Both the government and communities expect hospitals to be prepared to cope with all types of emergencies. Disaster simulations are the traditional method of testing hospital disaster plans, but a recent, comprehensive literature review failed to find any substantial scientific data proving the benefit of these resource- and time-consuming exercises.
Objectives:
The objective of this study was to test the hypothesis that an audiovisual presentation of the hospital disaster plans followed by a simulated disaster exercise and debriefing improved staff knowledge, confidence, and hospital preparedness for disasters.
Methods:
Fifty members of the medical, nursing, and administrative staff were chosen from a pool of approximately 170 people likely to be in a position of responsibility in the event of a disaster and were surveyed. The pre-intervention survey tested factual knowledge as well as perceptions about individual and departmental preparedness. Post-intervention, the same 50 staff members were asked to repeat the survey, which included additional questions establishing their involvement in the exercise.
Results:
There were 50 pre-intervention tests and 42 post-intervention tests. The intervention resulted in a significant improvement in the test pass rate: the pre-intervention pass rate was 9/50 (18%; 95% confidence interval (CI) = 16.1–19.9%) versus a post-intervention pass rate of 21/42 (50%; 95% CI = 42.4–57.6%; χ² test, p = 0.002). Emergency department (ED) staff had stronger baseline knowledge than non-ED staff: the ED pre-test mean score was 12.1 versus 6.2 for non-ED staff (difference 5.9; 95% CI = 3.3–8.4; t-test, p < 0.001). Those who attended more than one component had a greater increase in mean score: 5.6 for attendees versus 2.7 for non-attendees (difference 2.9; 95% CI = 1.0–4.9; t-test, p = 0.004). There was no significant increase in the general perception of preparedness. However, the majority of those surveyed described the exercise as beneficial to themselves (53.7%; 95% CI = 45.5–61.8%) and their department (63.2%; 95% CI = 53.5–72.8%).
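As an illustration of the headline comparison, the χ² test on the reported pass rates (9 of 50 pre-intervention versus 21 of 42 post-intervention) can be checked from the 2×2 contingency table alone. The Python sketch below assumes only the published counts, not the underlying study data, and uses SciPy's standard chi2_contingency routine.

    # Minimal sketch: chi-squared comparison of the reported pass rates
    # (9/50 pre-intervention vs 21/42 post-intervention).
    # Assumes only the counts published in the abstract.
    from scipy.stats import chi2_contingency

    # Rows: pre-intervention, post-intervention; columns: passed, failed.
    table = [[9, 50 - 9],
             [21, 42 - 21]]

    chi2, p, dof, expected = chi2_contingency(table)  # Yates' correction applied by default for 2x2 tables
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}")          # p comes out at roughly 0.002, consistent with the abstract

Under these assumptions the computed p-value agrees with the reported p = 0.002.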
Conclusions:
The disaster exercise and educational process had the greatest benefit for the individuals and departments directly involved. The intervention also prompted an enterprise-wide review and an upgrade of disaster plans at the departmental level. Pre-intervention knowledge scores were poor, and the post-intervention knowledge base remained suboptimal despite a statistically significant improvement. This study supports the widely held belief that disaster simulation is a worthwhile exercise, but more must be done. More time and resources must be dedicated to the increasingly important field of hospital disaster preparedness.