
A multimedia paediatric cardiology assessment tool for medical students and general paediatric trainees: development and validation

Published online by Cambridge University Press: 12 April 2022

Hunter C. Wilson
Affiliation:
Michigan Medicine Congenital Heart Center, C. S. Mott Children’s Hospital, Ann Arbor, MI, USA
Tiffany R. Lim
Affiliation:
Michigan Medicine Congenital Heart Center, C. S. Mott Children’s Hospital, Ann Arbor, MI, USA
David M. Axelrod
Affiliation:
Stanford University School of Medicine, Palo Alto, CA, USA
David K. Werho
Affiliation:
Rady Children’s Hospital San Diego, University of California San Diego, San Diego, CA, USA
Stephanie S. Handler
Affiliation:
Medical College of Wisconsin Herma Heart Institute, Milwaukee, WI, USA
Patricia B. Mullan
Affiliation:
Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, USA
James M. Cooke
Affiliation:
Department of Learning Health Sciences, University of Michigan, Ann Arbor, MI, USA Department of Family Medicine, Michigan Medicine, Ann Arbor, MI, USA
Sonal T. Owens*
Affiliation:
Michigan Medicine Congenital Heart Center, C. S. Mott Children’s Hospital, Ann Arbor, MI, USA
*
Author for correspondence: Sonal T. Owens, Michigan Medicine Congenital Heart Center, C. S. Mott Children’s Hospital, 1540 East Hospital Dr, Ann Arbor, MI, 48109, USA. Tel: 734-615-2369; Fax: 734-936-9470. E-mail: sthakkar@med.umich.edu.

Abstract

Background:

Understanding how cardiovascular structure and physiology guide management is critically important in paediatric cardiology. However, few validated educational tools are available to assess trainee knowledge. To address this deficit, paediatric cardiologists and fellows from four institutions collaborated to develop a multimedia assessment tool for use with medical students and paediatric residents. This tool was developed in support of a novel 3-dimensional virtual reality curriculum created by our group.

Methods:

Educational domains were identified, and questions were iteratively developed by a group of clinicians from multiple centres to assess understanding of key concepts. To evaluate content validity, content experts completed the assessment and reviewed items, rating item relevance to educational domains using a 4-point Likert scale. An item-level content validity index was calculated for each question, and a scale-level content validity index was calculated for the assessment tool, with scores of ≥0.78 and ≥0.90, respectively, representing excellent content validity.

Results:

The mean content expert assessment score was 92% (range 88–97%). Two questions yielded ≤50% correct content expert answers. The item-level content validity index for 29 out of 32 questions was ≥0.78, and the scale-level content validity index was 0.92. Qualitative feedback included suggestions for future improvement. Questions with ≤50% content expert agreement and item-level content validity index scores <0.78 were removed, yielding a 27-question assessment tool.

Conclusions:

We describe a multi-centre effort to create and validate a multimedia assessment tool which may be implemented within paediatric trainee cardiology curricula. Future efforts may focus on content refinement and expansion to include additional educational domains.

Type
Original Article
Copyright
© The Author(s), 2022. Published by Cambridge University Press

Paediatric cardiology is a discipline with unique anatomic and physiologic intersections. Clinical care depends on understanding how cardiovascular anatomic aberrations lead to altered physiologies, which in turn guide management strategies, all within a broader landscape of constant growth and development in children. Imparting an understanding of basic concepts in paediatric cardiology is an important directive for paediatric residency training programmes. Despite the availability of specialised cardiovascular care, patients with congenital heart disease (CHD) still require care coordination and a medical home, anchored by clinicians with a firm understanding of basic concepts in cardiovascular anatomy and physiology [1]. Furthermore, survivorship among patients with CHD is increasing, and primary care providers will care for a growing number of these patients [2,3]. Paediatric resident and medical student exposure to paediatric cardiology during training is limited and must be used efficiently. Accordingly, several recent efforts have sought to enhance trainee education via curricular adjuncts to clinical training, including the use of 3-dimensional models for teaching, patient simulation, and immersive training experiences for residents transitioning to cardiology fellowship [4–9]. However, despite these educational advancements, few tools are available with which to assess trainee knowledge of core educational concepts and guide curricular development.

In this study, we describe our effort to develop a robust assessment tool for medical students and paediatric residents participating in a paediatric cardiology rotation during their clinical training. We also report our derivation of content validity for the assessment tool using structured content expert feedback. Our objective was to create an assessment tool that tested trainee knowledge of core concepts in paediatric cardiology and could serve as a standard with which to assess educational achievement and evaluate the efficacy of curricular interventions; specifically, the tool was created in tandem with, and in support of, a 3-dimensional virtual reality curriculum intended for medical students and residents completing paediatric cardiology rotations. We have successfully used this tool to evaluate the efficacy of that curriculum; results of this effort have been submitted for publication.

Materials and methods

A group of paediatric cardiologists and fellows from four institutions collaborated to develop base content for the assessment tool, with the intent of developing material appropriate for medical students and paediatric residents participating in paediatric cardiology rotations. Concepts were iteratively discussed among group members to determine the overarching domains we felt were most relevant to our discipline and to our defined group of learners. We also aimed to generate material that would evaluate knowledge and visuospatial concepts gained from a 3-dimensional virtual reality curriculum developed concomitantly by our group. Ultimately, cardiovascular anatomy, physiology, and clinical applications were identified as the core educational domains. To assess these domains, six congenital cardiovascular lesions were identified, and questions were created to evaluate understanding of the material, with the core educational domains guiding question development (Table 1). These six simple lesions were chosen because they were felt to be conceptually within the scope of medical students and paediatric residents while still offering significant potential for assessing important concepts encompassed by our educational domains. Video-based and graphic questions were created to test anatomic and physiologic concepts. After the initial iterative editing and review process, a 32-item assessment tool was generated. Many questions were lesion-specific, but some relied upon understanding of multiple lesions and core visuospatial concepts. The assessment tool was implemented within Research Electronic Data Capture (REDCap) hosted at the University of Michigan, which allowed electronic completion of the assessment and viewing of the video clips associated with certain questions [10,11].

Table 1. Educational domains and question development.
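To make the item structure concrete, the following is a minimal, hypothetical sketch of how a multiple-answer, multimedia item of this kind could be represented in code. The field names, example question, and answer choices are invented for illustration and do not reproduce items from the actual assessment tool or its REDCap build:

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class AssessmentItem:
    """One assessment item; some items permit several correct choices."""
    lesion: str              # e.g. "atrial septal defect"
    domain: str              # "anatomy", "physiology", or "clinical applications"
    stem: str                # question text shown to the learner
    choices: list[str]
    correct: set[int]        # indices of the correct choice(s)
    media_url: str | None = None  # optional echocardiogram clip or graphic


# Example item (invented for illustration only):
item = AssessmentItem(
    lesion="atrial septal defect",
    domain="physiology",
    stem="Which chambers dilate in the setting of a large, long-standing "
         "left-to-right shunt at the atrial level?",
    choices=["Left atrium", "Right atrium", "Left ventricle", "Right ventricle"],
    correct={1, 3},  # right atrium and right ventricle
)


def score(item: AssessmentItem, selected: set[int]) -> bool:
    """Full credit only when the selected choices match the key exactly."""
    return selected == item.correct
```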

Following development of the initial assessment tool, we sought to assess the product’s content validity. Content validity is defined as the ability of an assessment item, or the tool as a whole, to adequately measure the educational domains it is designed to assess, and may be derived by soliciting structured feedback from content experts [12]. Accordingly, six content experts from three different institutions were identified to review the product. Qualifications for each content expert included current practice as a paediatric cardiologist and an academic rank of associate professor or professor. No content expert had provided input during creation of the assessment tool. Each content expert first completed the assessment tool without the designated correct answers available. Content experts were then provided with an answer key and instructed to rate the relevance of each item to its educational intent using a 4-point Likert scale, where 1 = Not relevant, 2 = Somewhat relevant, 3 = Quite relevant, and 4 = Highly relevant. Space for qualitative feedback was provided; feedback was requested for items rated “1” or “2” but was otherwise optional.

Content expert performances on the assessment tool were graded against correct answers that had been established a priori during the tool’s creation, and each question was evaluated to determine the percentage of content experts who answered it incorrectly. Content expert relevance ratings for each question were then tabulated. An item-level content validity index was calculated for each item by dividing the number of experts who rated the item as “3” (quite relevant) or “4” (highly relevant) by the total number of content experts who reviewed the item, in accordance with the methods described by Polit and Beck [12]. A scale-level content validity index was then calculated by summing the item-level content validity indices and dividing by the total number of items in the assessment tool. We defined a priori an individual item-level content validity index score of ≥0.78 and a scale-level content validity index score of ≥0.90 as the criteria by which each item and the scale as a whole, respectively, would be judged to have excellent content validity [12]. As this manuscript describes development of the assessment tool, reproducibility of the assessment tool – that is, the ability of different raters to score learners’ assessments similarly – was not assessed, but it would likely be high, as correct answers are predominantly multiple choice and were determined during the development process; consequently, minimal subjective rater input is required for scoring.
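As an illustration of the index calculations described above, the following sketch computes item-level and scale-level content validity indices from a hypothetical ratings matrix; the ratings shown are invented and do not reproduce our content experts’ data:

```python
# Hypothetical 4-point Likert ratings (one row per item, one column per
# expert); values are invented for illustration only.
ratings = [
    [4, 4, 3, 4, 3, 4],  # item A: all six experts rate it relevant
    [3, 4, 4, 2, 4, 3],  # item B: five of six rate it relevant
    [2, 3, 2, 3, 2, 2],  # item C: only two of six rate it relevant
]

def item_cvi(item_ratings: list[int]) -> float:
    """Item-level CVI: proportion of experts rating the item 3 or 4."""
    relevant = sum(1 for r in item_ratings if r >= 3)
    return relevant / len(item_ratings)

i_cvis = [item_cvi(row) for row in ratings]
# Scale-level CVI (averaging method): mean of the item-level indices.
s_cvi = sum(i_cvis) / len(i_cvis)

for label, cvi in zip("ABC", i_cvis):
    verdict = "meets 0.78 cut-off" if cvi >= 0.78 else "below 0.78 cut-off"
    print(f"Item {label}: I-CVI = {cvi:.2f} ({verdict})")
print(f"S-CVI = {s_cvi:.2f} (excellent if >= 0.90)")
```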

Results

The group of content experts comprised one associate professor and five professors of paediatrics, with an average of 25 ± 10 years of experience in clinical paediatric cardiology. Each content expert completed the assessment and provided subsequent feedback on items in accordance with our instructions. The mean content expert score on the assessment was 92% (range 88–97%). Eight questions had at least one incorrect content expert answer, and for two questions ≤50% of the content expert answers agreed with the answers we had originally established as correct. Calculation of the item-level content validity index for each question yielded three items with scores below the pre-determined cut-off of 0.78. The scale-level content validity index for the assessment tool was 0.92. Content expert ratings, percentages of correct answers, and item-level content validity index calculations for select items are provided in Table 2. Items with ≤50% content expert agreement with the answers we had originally established as correct, and items with item-level content validity indices <0.78, were removed, yielding a 27-question assessment tool.

Table 2. Content expert (CE) ratings, percentage of questions answered correctly for each item, and calculated item-level content validity index (I-CVI) for each item.

CE ratings range from 1 to 4 according to how relevant each item was judged to be to the stated learning objectives, where 1 = Not relevant, 2 = Somewhat relevant, 3 = Quite relevant, and 4 = Highly relevant. Items with ≤50% correct answers and items with I-CVI scores <0.78 are shown in bold. Items 1, 4, 7, 21, and 31 were removed from the final version of the assessment tool.
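Because six content experts rated each item, the item-level index could only take values of the form k/6; the 0.78 cut-off was therefore met only when at least five of the six experts rated an item as “quite” or “highly” relevant:

$$\mathrm{I\text{-}CVI} = \tfrac{5}{6} \approx 0.83 \ge 0.78, \qquad \mathrm{I\text{-}CVI} = \tfrac{4}{6} \approx 0.67 < 0.78.$$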

Each content expert rated item relevance and provided narrative feedback on the items. The content expert ratings and comments characterised the majority of items as relevant to learning objectives. Content experts also suggested several avenues by which the tool may be improved in future iterations, summarised in the following themes:

  1. Questions in future versions of the tool may benefit from better distribution among the six tested lesions.

  2. Content experts characterised certain questions, particularly those with echocardiogram clips, as relevant but potentially too advanced for paediatric residents. One question that received an item-level content validity index score below our pre-determined standard contained an echocardiogram still-frame image; this question was removed from the final version of the assessment tool.

  3. Content experts suggested that future versions of the assessment tool may benefit from the addition of questions targeting knowledge of additional congenital cardiovascular lesions.

Our process for the creation and validation of the assessment tool is shown in Figure 1, and the assessment tool in its final form is included as a data supplement (Data Supplement 1). A more detailed analysis of trainee responses to each question, following implementation of the assessment tool to evaluate a novel virtual reality curriculum developed by our group at several institutions, is included in a manuscript that has been submitted for publication.

Figure 1. To generate the assessment tool, educational domains were identified, content experts (CEs) reviewed and scored the tool, and poor questions were eliminated.

Discussion

In this report, we describe the collaborative development of a validated assessment tool which may be implemented as part of curricula for medical students and paediatric residents rotating through paediatric cardiology. During the content validation portion of this project, content expert review provided useful insight into question structure and content and allowed us to identify poorly constructed questions. There were several questions for which content expert answers disagreed with the answers we had designated as correct, but two questions were particularly notable, as ≤50% of content experts received credit for them. Each of these questions required selection of multiple answers for credit to be given. The discrepant answers provided by experts in our discipline suggested that these items suffered from poor wording or structural deficits and led to our decision to remove them from the final version of the assessment tool. We did not require complete agreement from every content expert on the answer to every question, as we felt that this requirement would be too stringent and would result in elimination of questions that were otherwise rated as appropriate for our educational domains.

In the second part of the validation process, content experts were asked to rate questions individually so that the content validity of the assessment tool could be assessed. In the medical literature, derivation of content validity in keeping with professional standards lends rigour to educational tools designed to evaluate trainee performance [13,14]. By seeking content expert feedback, we received both quantitative and qualitative input on the items comprising our assessment tool, which provided valuable insight into the adequacy of questions as written and suggested avenues for future improvement. Review of our assessment tool revealed three questions with item-level content validity index scores below our designated cut-off of 0.78. Evaluation of the corroborating commentary for each of these items showed that one item was felt to be controversial, with content that did not clearly fit within our learning objectives; the remaining two questions were felt to be beyond the scope of expected knowledge for a paediatric resident. Given the unsatisfactory ratings, these items were removed from the final version of the assessment tool, which, after additional removal of the questions with ≤50% correct content expert answers, yielded a 27-question assessment tool. Of note, there was no overlap between the questions with item-level content validity index scores <0.78 and those with ≤50% correct initial content expert answers. Importantly, the scale-level content validity index of the assessment tool was 0.92 even before removal of the items of concern, exceeding the 0.90 threshold recognised as indicating excellent content validity [12].

Our assessment tool offers a valuable means of assessing the knowledge of medical students and residents in our institutions who are participating in clinical experiences within paediatric cardiology. Increasingly, new technologies, such as 3-dimensional printing, simulation, and virtual reality, offer opportunities to enhance educational experiences [7,15–17]. Our assessment tool was developed to evaluate the efficacy of a novel 3-dimensional curriculum our group created for medical students and residents undergoing rotations in paediatric cardiology, and it has been used successfully to this effect. This curriculum is based upon the Stanford Virtual Heart, a programme that allows users to virtually explore 3-dimensional cardiovascular anatomy [18], and it included guided narratives through which learners explored the six lesions used in our assessment tool to evaluate the identified educational domains. Results of this effort have been submitted for publication. Although the assessment tool was developed to evaluate our group’s virtual reality curriculum, its use is not limited to this specific intervention: it may be of interest to other centres as a means to evaluate baseline trainee knowledge and guide curricular development, to assess trainee knowledge as an adjunct method of rating a trainee’s performance on a rotation, or as a platform for developing content that could be individualised according to different centres’ goals. As programmes use different methodologies to evaluate trainee performance, incorporation of a tool such as this would be in accordance with individual centre needs and the assessment products already in use.

The strengths of this tool lie in part in its ability to target understanding of the anatomic determinants of pathophysiology: for example, why left-sided chambers dilate in patients with large ventricular septal defects. We feel that understanding such concepts is important in our field, where the physiologic consequences of structural defects often underpin clinical sequelae and ultimately inform management strategies. Our assessment tool has the added benefit of having undergone validation through expert review, unlike many pre- and post-interventional assessments used to evaluate various curricula. The methodologies applied in our effort also demonstrate a means by which content validity may be evaluated in a more quantitative fashion for other medical education tools and should be of interest to clinicians developing educational curricula for trainees of all levels.

Our work has several limitations. First, the items in our assessment tool focus on six specific cardiovascular lesions and therefore do not assess the full breadth of congenital heart disease. Second, four items in the final version of the assessment include echocardiography media, which content experts felt may require knowledge beyond that expected of residents. However, in at least one of our centres, medical students and residents are exposed to didactic echocardiography curricula and clinical echocardiography. Accordingly, we felt that a few questions including basic echocardiography media, particularly as relevant to our educational domains, were not outside the realm of expectation for learners passing through a paediatric cardiology rotation. We also felt that a basic understanding of ultrasound would be reasonable to convey given the increasing use of point-of-care ultrasound in the broader clinical landscape. We therefore retained questions with echocardiogram clips that otherwise satisfied our criteria for inclusion in the final version of the assessment tool; programmes without basic echocardiogram exposure may consider withholding these questions if the tool is implemented at their centre. Another limitation is that, as a consequence of the iterative development process, questions are not equally distributed among the anatomic lesions; however, we feel that the tool in its entirety provides a reasonable assessment of the educational domains we sought to address. Finally, we did not solicit feedback from learners on question clarity. Solicitation of feedback from end-users is not typically part of deriving content validity using our methodology, but it could provide another facet of insight into the adequacy of question structure and could be used in future endeavours aimed at generating medical education assessment tools, related to this effort or otherwise.

In conclusion, we report the creation of a validated, 27-question multimedia assessment tool that may be used to assess knowledge of key cardiovascular concepts among medical students and residents participating in paediatric cardiology rotations. Our assessment tool has been successfully implemented as part of a multi-centre effort to evaluate a novel virtual reality curriculum, the results of which have been submitted for publication. Our assessment tool and the methodologies used for the derivation of content validity described herein should be of interest to clinicians who work with trainees in paediatric cardiology. Ongoing and future efforts will refine and improve questions and may expand the tool to include additional educational domains within paediatric cardiology.

Supplementary material

To view supplementary material for this article, please visit https://doi.org/10.1017/S1047951122001123

Acknowledgements

We thank Scott Ceresnak, Carlen Fifer, Leo Lopez, James Perry, Beth Printz, and Albert Rocchini for their participation as content experts and Brian Fagan for his assistance with question development.

Financial support

This work was supported in part by a grant from the Gilbert Whitaker Fund for the Improvement of Teaching awarded through the University of Michigan. The sponsor had no role in the study design, collection, analysis, or interpretation of data, in the writing of this report, or in the decision to submit this manuscript for publication.

Conflicts of interest

DMA currently serves as lead medical advisor and a shareholder at Lighthaus Inc., the company that created The Stanford Virtual Heart. The remaining authors disclose no conflicts of interest.

References

1. Fernandes SM, Sanders LM. Patient-centered medical home for patients with complex congenital heart disease. Curr Opin Pediatr 2015; 27: 581–586.
2. Mandalenakis Z, Rosengren A, Skoglund K, Lappas G, Eriksson P, Dellborg M. Survivorship in children and young adults with congenital heart disease in Sweden. JAMA Intern Med 2017; 177: 224–230.
3. Best KE, Rankin J. Long-term survival of individuals born with congenital heart disease: a systematic review and meta-analysis. J Am Heart Assoc 2016; 5: e002846.
4. Jones TW, Seckeler MD. Use of 3D models of vascular rings and slings to improve resident education. Congenit Heart Dis 2017; 12: 578–582.
5. Harahsheh AS, Ottolini M, Lewis K, Blatt B, Mitchell S, Greenberg L. An innovative pilot curriculum training pediatric residents in referral and communication skills on a cardiology rotation. Acad Pediatr 2016; 16: 700–702.
6. Mohan S, Follansbee C, Nwankwo U, Hofkosh D, Sherman FS, Hamilton MF. Embedding patient simulation in a pediatric cardiology rotation: a unique opportunity for improving resident education. Congenit Heart Dis 2015; 10: 88–94.
7. Harris TH, Adler M, Unti SM, McBride ME. Pediatric heart disease simulation curriculum: educating the pediatrician. Congenit Heart Dis 2017; 12: 546–553.
8. Ceresnak SR, Axelrod DM, Motonaga KS, Johnson ER, Krawczeski CD. Pediatric cardiology boot camp: description and evaluation of a novel intensive training program for pediatric cardiology trainees. Pediatr Cardiol 2016; 37: 834–844.
9. Harris TH, Adler M, Unti SM, McBride ME. Pediatric heart disease simulation curriculum: educating the pediatrician. Congenit Heart Dis 2017; 12: 546–553.
10. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap) – a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009; 42: 377–381.
11. Harris PA, Taylor R, Minor BL, et al. The REDCap consortium: building an international community of software platform partners. J Biomed Inform 2019; 95: 103208.
12. Polit DF, Beck CT. The content validity index: are you sure you know what’s being reported? Critique and recommendations. Res Nurs Health 2006; 29: 489–497.
13. Tausch TJ, Kowalewski TM, White LW, McDonough PS, Brand TC, Lendvay TS. Content and construct validation of a robotic surgery curriculum using an electromagnetic instrument tracker. J Urol 2012; 188: 919–923.
14. Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ 2003; 37: 830–837.
15. Costello JP, Olivieri LJ, Su L, et al. Incorporating three-dimensional printing into a simulation-based congenital heart disease and critical care training curriculum for resident physicians. Congenit Heart Dis 2015; 10: 185–190.
16. Silva JNA, Southworth M, Raptis C, Silva J. Emerging applications of virtual reality in cardiovascular medicine. JACC Basic Transl Sci 2018; 3: 420–430.
17. Rogers LS, Cohen MS. Medical education in pediatric and congenital heart disease: a focus on generational learning and technology in education. Prog Pediatr Cardiol 2020; 59: 101305.
18. Stanford Children’s Health, Stanford Lucile Packard Children’s Hospital. Stanford pioneers use of VR for patient care, education and experience. Retrieved November 11, 2021, from http://www.stanfordchildrens.org/en/about/news/releases/2017/virtual-reality-program