Paediatric cardiology is a discipline with unique anatomic and physiologic intersections. Clinical care depends on understanding how cardiovascular anatomic aberrations lead to altered physiologies, which in turn guide management strategies, all within a broader landscape of constant growth and development in children. Imparting an understanding of basic concepts in paediatric cardiology is an important directive for paediatric residency training programmes. Despite the availability of specialised cardiovascular care, patients with congenital heart disease (CHD) still require care coordination and a medical home, anchored by clinicians with a firm understanding of basic concepts in cardiovascular anatomy and physiology.[1] Furthermore, survivorship among patients with CHD is increasing, and primary care providers will care for a growing number of patients with CHD.[2,3] Paediatric resident and medical student exposure to paediatric cardiology during training is limited and must be used efficiently. Accordingly, several recent efforts have sought to enhance trainee education via curricular adjuncts to clinical training, including utilisation of 3-dimensional models for teaching, patient simulation, and creation of immersive training experiences for residents transitioning to cardiology fellowship.[4–9] However, despite these educational advancements, few tools are available with which to assess trainee knowledge of core educational concepts and guide curricular development.
In this study, we describe our effort to develop a robust assessment tool for medical students and paediatric residents participating in a paediatric cardiology rotation during their clinical training. We also report our derivation of content validity for the assessment tool using structured content expert feedback. Our objective was to create an assessment tool that tested trainee knowledge of core concepts in paediatric cardiology and could be used as a standard with which to assess educational achievements and evaluate efficacy of educational curricular interventions; specifically, this tool was created in tandem with and in support of a 3-dimensional virtual reality curriculum intended for medical students and residents completing paediatric cardiology rotations. We have successfully used this educational tool to evaluate efficacy of this curriculum; results of this effort are currently submitted for publication.
Materials and methods
A group of paediatric cardiologists and fellows from four institutions collaborated to develop base content for the assessment tool, with the intent of developing material appropriate for medical students and paediatric residents participating in paediatric cardiology rotations. Concepts were iteratively discussed among group members to determine the overarching domains to assess that we felt were most relevant to our discipline and to our defined group of learners. We also aimed to generate material that would evaluate knowledge and visuospatial concepts gained from a 3-dimensional virtual reality curriculum developed concomitantly by our group. Ultimately, cardiovascular anatomy, physiology, and clinical applications were identified as core educational domains. To assess these domains, six congenital cardiovascular lesions were identified, and questions were created to evaluate understanding of material using the core educational domains to guide question development (Table 1). These six simple lesions were chosen because they were felt to be conceptually within the scope of medical students and paediatric residents yet also offered significant potential for assessment of important concepts encompassed by our educational domains. Video-based and graphic questions were created to test anatomic and physiologic concepts. After the initial iterative editing and review process, a 32-item assessment tool was generated. Many questions were lesion-specific, but some questions relied upon understanding of multiple lesions and core visuospatial concepts. The assessment tool was hosted within Research Electronic Data Capture (REDCap) at the University of Michigan, which allowed for electronic completion of the assessment tool and visualisation of video clips associated with certain questions.[10,11]
Following development of the initial assessment tool, we sought to assess the product’s content validity. Content validity is defined as the ability of an assessment item or tool as a whole to adequately measure the educational domains it is designed to assess and may be derived by soliciting structured feedback from content experts.[12] Accordingly, six content experts from three different institutions were identified to review the product. Qualifications for each content expert included current practice as a paediatric cardiologist and academic rank of associate professor or professor. No content expert provided any prior input during creation of the assessment tool. Each content expert completed the assessment tool without designated correct answers available. Content experts were then provided with an answer key and instructed to rate the relevance of each item to educational intent using a 4-point Likert scale, where 1 = Not relevant, 2 = Somewhat relevant, 3 = Quite relevant, and 4 = Highly relevant. Space for qualitative feedback was provided. Feedback was requested for items rated “1” or “2,” but was otherwise optional.
Content expert performances on the assessment tool were then graded against correct answers that had been established a priori during the assessment tool’s creation. Each question was then specifically evaluated to determine the percentage of content experts who answered the question incorrectly. Content expert ratings for each question were then tabulated. An item-level content validity index was calculated for each item by dividing the number of experts who rated the item as “3” (quite relevant) or “4” (highly relevant) by the total number of content experts who reviewed the item, in accordance with methods described by Polit and Beck.[12] A scale-level content validity index was then calculated by summing the item-level content validity indices and dividing the sum by the total number of items in the assessment tool. We defined a priori individual item-level content validity index scores of ≥0.78 and a scale-level content validity index score of ≥0.90 as the criteria by which each item and the scale as a whole, respectively, would be judged to have excellent content validity.[12] As this manuscript describes development of the assessment tool, reproducibility of the assessment tool – that is, the ability of different raters to score learners’ assessments similarly – was not assessed but would likely be high, as correct answers for the assessment tool are predominantly multiple choice and were determined through the development process. Consequently, minimal subjective rater input is required for scoring of the assessment tool.
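The item-level and scale-level calculations described above reduce to simple proportions and their average. The following sketch illustrates the arithmetic with hypothetical ratings (the ratings shown are invented for illustration and are not our study data):

```python
# Illustrative sketch of the I-CVI and S-CVI calculations described above.
# The rating data below are hypothetical, not the ratings from our study.

def item_cvi(ratings):
    """Item-level CVI: fraction of experts rating the item 3 (quite
    relevant) or 4 (highly relevant) on the 4-point Likert scale."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def scale_cvi(all_ratings):
    """Scale-level CVI: average of the item-level CVIs across all items."""
    item_scores = [item_cvi(r) for r in all_ratings]
    return sum(item_scores) / len(item_scores)

# Hypothetical ratings from six content experts for three items
ratings = [
    [4, 4, 3, 4, 3, 4],  # I-CVI = 6/6 = 1.00 -> retained (>= 0.78)
    [3, 4, 4, 2, 3, 4],  # I-CVI = 5/6 = 0.83 -> retained
    [2, 3, 1, 2, 4, 2],  # I-CVI = 2/6 = 0.33 -> removed (< 0.78)
]

print([round(item_cvi(r), 2) for r in ratings])  # [1.0, 0.83, 0.33]
print(round(scale_cvi(ratings), 2))              # 0.72
```

With six raters, an item-level cut-off of ≥0.78 corresponds to requiring at least five of the six experts to rate the item “3” or “4” (5/6 ≈ 0.83).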
Results
The group of content experts comprised one associate professor and five professors of paediatrics. The group averaged 25 ± 10 years of experience in clinical paediatric cardiology. Each content expert completed the assessment and provided subsequent feedback on items in accordance with our instructions. The average content expert score on the assessment was 92% (range 88–97%). There were eight questions with at least one incorrect content expert answer and two questions for which ≤50% of the content expert answers agreed with the answers we had originally established as correct. Calculation of the item-level content validity index for each question yielded three items with scores below the pre-determined cut-off of 0.78. The scale-level content validity index for the assessment tool was 0.92. Content expert ratings, percentages of correct answers, and item-level content validity index calculations for select items included in the assessment tool are provided in Table 2. Items with ≤50% content expert agreement with the answers we had originally established as correct and those with item-level content validity indices <0.78 were removed, yielding a 27-question assessment tool.
Content expert (CE) ratings range from 1 to 4 according to how relevant each item was to stated learning objectives, where 1 = Not relevant, 2 = Somewhat relevant, 3 = Quite relevant, and 4 = Highly relevant. Items with ≤50% correct answers and item-level content validity index (I-CVI) scores of <0.78 are bold. Items 1, 4, 7, 21, and 31 were removed from the final version of the assessment tool.
Each content expert rated item relevance and provided narrative feedback on the items. The content expert ratings and comments characterised the majority of items as relevant to learning objectives. Content experts also suggested several avenues by which the tool may be improved in future iterations, summarised in the following themes:
1. Questions in future versions of the tool may benefit from better distribution among the six tested lesions.
2. Content experts characterised certain questions, particularly those with echocardiogram clips, as relevant but potentially too advanced for paediatric residents. One of the questions which received an item-level content validity score below our pre-determined standard did contain an echocardiogram still frame image; this question was removed from the final version of the assessment tool.
3. Content experts suggested that future versions of the assessment tool may benefit from addition of questions targeting knowledge of additional congenital cardiovascular lesions.
Our process for creation and validation of our assessment tool is shown in Figure 1. The assessment tool in its final form is included as a data supplement (Data Supplement 1). A more detailed analysis of trainee responses for each question following implementation of the assessment tool for evaluation of a novel virtual reality curriculum developed by our group at several institutions is included in a manuscript which is currently submitted for publication.
Discussion
In this report, we describe the collaborative development of a validated assessment tool that may be implemented as part of curricula for medical students and paediatric residents rotating through paediatric cardiology. During the content validation portion of this project, content expert review provided useful insight into question structure and content that allowed us to identify poorly constructed questions. There were several questions with answers from content experts that disagreed with answers we had designated as correct, but two questions in particular were notable because ≤50% of content experts received credit for them. Each of these questions required selection of multiple answers for credit to be given. The discrepant answers provided by experts in our discipline suggested that these items suffered from poor wording or structural deficits and led to our decision to remove them from the final version of the assessment tool. We did not require complete agreement on answers from each content expert for every question, as we felt that this requirement would be too stringent and would result in elimination of questions that were otherwise rated as appropriate for our educational domains.
In the second part of the validation process, content experts were asked to rate questions individually so that the content validity of the assessment tool could be assessed. In the medical literature, derivation of content validity in keeping with professional standards for the validity of assessments lends rigour to educational tools designed to evaluate trainee performance.[13,14] By seeking content expert feedback, we received both quantitative and qualitative input on the items that comprised our assessment tool, which provided valuable insight into the adequacy of questions as currently written and suggestions for future avenues of improvement. Review of our assessment tool revealed three questions with item-level content validity index scores below our designated cut-off of 0.78. Evaluation of corroborating commentary for each of these items showed that one item was felt to be controversial and that its content did not clearly fit within our learning objectives. The remaining two questions were felt to be beyond the scope of expected knowledge for a paediatric resident. Given unsatisfactory ratings on each of these items, they were ultimately removed from the final version of the assessment tool, which, after additional removal of the questions with ≤50% correct content expert answers, yielded a 27-question assessment tool. Of note, there was no overlap between questions with item-level content validity index scores of <0.78 and those with ≤50% correct initial content expert answers. Importantly, the scale-level content validity index of the assessment tool was 0.92 even before removal of the items of concern, exceeding the 0.90 threshold recognised as indicating excellent content validity.
Our assessment tool offers a valuable means with which to assess the knowledge of medical students and residents in our institutions who are participating in clinical experiences within paediatric cardiology. Increasingly, new technologies, such as 3-dimensional printing, simulation, and virtual reality, offer the opportunity to enhance educational experiences.[7,15–17] Our assessment tool was developed as a means with which to evaluate the efficacy of a novel 3-dimensional curriculum our group has developed for medical students and residents undergoing rotations in paediatric cardiology and has been successfully used to this effect. This curriculum is based upon the Stanford Virtual Heart, a programme that allows users to virtually explore 3-dimensional cardiovascular anatomy.[18] The curriculum included guided narratives for learners to explore the six lesions we used in our assessment tool to evaluate the identified educational domains. Results of this effort have been submitted for publication. Although our assessment tool was developed to evaluate the novel virtual reality curriculum developed by our group, its use is not limited to this specific intervention; it may be of interest to other centres as a means with which to evaluate baseline trainee knowledge to guide curricular development, to evaluate trainee knowledge as an adjunct method of rating a trainee’s performance on a rotation, or as a platform for development of content that could be individualised according to different centres’ goals. As programmes use different methodologies to evaluate trainee performance, incorporation of a tool such as this would be in accordance with individual centre needs and extant assessment products already in use.
The strengths of this tool lie in part in its ability to target understanding of anatomic determinants of pathophysiology; for example, why left-sided chambers dilate in patients with large ventricular septal defects. We feel that understanding of concepts such as these is important in our field, where often the physiologic consequences of structural defects are the underpinnings of clinical sequelae and ultimately inform management strategies. Our assessment tool has the added benefit of undergoing validation through expert review, unlike many pre- and post-interventional assessments used for evaluation of various curricula. The methodologies applied in our effort also demonstrate a means with which content validity may be evaluated in a more quantitative fashion for other tools developed for medical education and should be of interest to clinicians interested in developing educational curricula for trainees of all levels.
Our work has several limitations. First, the items used in our assessment tool focus on six specific cardiovascular lesions. Second, four items in the final version of the assessment contain echocardiography media, which content experts felt may require knowledge beyond that expected of residents. However, in at least one of our centres, medical students and residents are exposed to didactic echocardiography curricula and clinical echocardiography. Accordingly, we felt that a few questions including basic echocardiography media, particularly as relevant to our educational domains, were not outside the realm of expectation for learners passing through a paediatric cardiology rotation. We also felt that basic ultrasound understanding would be reasonable to convey given the increasing use of point-of-care ultrasound in the broader landscape. We therefore retained questions with echocardiogram clips that otherwise satisfied our criteria for inclusion in the final version of the assessment tool. Programmes without basic echocardiogram exposure may consider withholding these questions if the assessment tool is implemented at their centre. Another limitation is that, as part of the iterative development process for this assessment tool, questions are not equally distributed among all anatomic lesions; however, we feel that the tool in its entirety provides a reasonable assessment of the educational domains we sought to address. Finally, we did not solicit feedback from learners on question clarity. Solicitation of feedback from end-users is not typically used to derive content validity using our methodology but could provide another facet of insight into the adequacy of question structure and could be used in future endeavours aimed at generation of medical educational assessment tools, related to this effort or otherwise.
In conclusion, we report creation of a validated multimedia 27-question assessment tool that may be used to assess knowledge of key cardiovascular concepts among medical students and residents participating in paediatric cardiology rotations. Our assessment tool has been successfully implemented as part of a multi-centre effort to evaluate a novel virtual reality curriculum, the results of which have been submitted for publication. Our assessment tool and the methodologies used for derivation of content validity described herein should be of interest to clinicians who work with trainees in paediatric cardiology. Ongoing and future efforts will refine and improve questions and may expand the tool to include additional educational domains within paediatric cardiology.
Supplementary material
To view supplementary material for this article, please visit https://doi.org/10.1017/S1047951122001123
Acknowledgements
We thank Scott Ceresnak, Carlen Fifer, Leo Lopez, James Perry, Beth Printz, and Albert Rocchini for their participation as content experts and Brian Fagan for his assistance with question development.
Financial support
This work was supported in part by a grant from the Gilbert Whitaker Fund for the Improvement of Teaching awarded through the University of Michigan. The sponsor had no role in the study design, collection, analysis, or interpretation of data, in the writing of this report, or in the decision to submit this manuscript for publication.
Conflicts of interest
DMA currently serves as lead medical advisor and a shareholder at Lighthaus Inc., the company that created The Stanford Virtual Heart. The remaining authors disclose no conflicts of interest.