
Clinical research coordinators’ instructional preferences for competency content delivery

Published online by Cambridge University Press:  31 October 2018

H. Robert Kolb*
Affiliation:
Regulatory Knowledge, Research Support and Service Center, JHMHC, Gainesville, FL, USA
Huan Kuang
Affiliation:
School of Human Development and Organizational Studies in Education, College of Education, CTSI, University of Florida, Gainesville, FL, USA
Linda S. Behar-Horenstein
Affiliation:
Colleges of Dentistry, Education, & Pharmacy, CTSI Educational Development & Evaluation, HRSA Faculty Development in Dentistry, Gainesville, FL, USA
*
*Address for correspondence: H. R. Kolb, RN, MS, CCRC, Assistant Director Clinical Research, Translational Workforce Directorate, Research Participant Advocate/Consultant, Regulatory Knowledge, Research Support and Service Center, JHMHC, P.O. Box 100322, Gainesville, FL 32610-0219, USA. (Email: kolbhr@ufl.edu)

Abstract

Introduction

A lack of standardized clinical research coordinator (CRC) training programs requires determining appropriate approaches for content delivery. The purpose of this study was to assess CRCs’ preferred training delivery methods related to the 8 designated Joint Task Force Clinical Trial Competency domains.

Methods

Repeated measures analysis of variance and split-plot analysis of variance were adopted to compare the group means among 5 training delivery methods by 8 competency content domains and to examine whether demographic variables were associated with different preference patterns across the training delivery methods.

Results

Participants reported a preference for online video; mentoring/coaching was the least preferred. Significant training delivery method preferences were reported for 3 content domains: participant safety considerations, medicines development and regulation, and clinical trials operations.

Discussion

Observed statistical differences in the training delivery methods by the content domains provide guidance for program development. Ensuring that standardized educational training is aligned with the needs of adult learners may help ensure that CRCs are appropriately prepared for the workforce.

Type
Education
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike licence (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the same Creative Commons licence is included and the original work is properly cited. The written permission of Cambridge University Press must be obtained for commercial re-use.
Copyright
© The Association for Clinical and Translational Science 2018

Introduction

The ongoing management of clinical research, from start-up to close-out, is generally delegated to a clinical research coordinator (CRC). The CRC is a highly specialized professional working in a research team whose responsibilities are critical to trial success [Reference Woodin1]. It is the CRC who ensures that criteria are met and that complications are recognized and resolved directly. Clinical research translation requires a trained and well-prepared workforce of CRCs who can effectively and efficiently conduct critical testing in clinical trials [Reference Zerhouni2]. Moreover, the most recent version of the Declaration of Helsinki points out that, “Medical research must be conducted by individuals with appropriate training and qualifications in clinical research” [3]. CRCs are thus essential to the success of the clinical research enterprise.

The current state of industry and federally funded clinical trials has been criticized for variable and inconsistent quality in the design, execution, analysis, and reporting of clinical trial activity [Reference Rosenberg4]. This dilemma is further exacerbated because the development of new drugs, devices, and behavioral interventions continues to be one of the most highly regulated endeavors in the United States [Reference Sparrow5]. At the same time, the intricacy of clinical trial protocols and the guidelines required to manage clinical trial activity have also increased in scope and complexity. One barrier to completion of effective, efficient, and rigorously conducted clinical trials is varying or missing competency-based training for study staff involved in clinical trials [Reference Sung6].

The Clinical and Translational Science Award (CTSA) Research Coordinator Taskforce recognized the need for improved training of CRCs when they reported that the “provision of adequate training and support…is critical to the overall goal of human subject protection” [Reference Speicher7]. Although emphasizing the need for appropriately trained CRCs, the Task Force concluded that current training programs must be improved. The absence of standardized requirements for providing and ensuring appropriate levels of qualification or professional standards compounds this dilemma.

To address this need, a national movement of professionals has been dedicated to ensuring that there is a set of common core competencies from which to build a standardized didactic curriculum. Following the work of the Joint Task Force (JTF) for Clinical Trial Competency, Sonstein, Seltzer, Li, Silva, Jones, and Daemen [Reference Sonstein8] provided a core competency framework for the clinical research professional. This work resulted in the development of a single, high-level set of 8 standards to be adopted globally and to serve as a framework of defined professional competencies for the clinical research enterprise [Reference Sonstein8]. The domains include:

  1. Scientific Concepts and Research Design: Knowledge of scientific concepts related to the design and analysis of clinical trials.

  2. Ethical and Participant Safety Considerations: Care of patients, aspects of human subject protection, and safety in the conduct of a clinical trial.

  3. Medicines Development and Regulation: Knowledge of how drugs, devices, and biologicals are developed and regulated.

  4. Clinical Trials Operations, Good Clinical Practices (GCPs): Study management, GCP compliance; safety management (adverse event identification, reporting, post market surveillance, pharmacovigilance), and handling of investigational products.

  5. Study and Site Management: Content required at the site level to run a study (financial and personnel aspects). Includes site and study operations (excluding regulatory and GCPs).

  6. Data Management and Informatics: Data acquisition and management during a clinical trial, including source data, data entry, queries, quality control, correction, and the concept of a locked database.

  7. Leadership and Professionalism: Principles and practice of leadership and professionalism in clinical research.

  8. Communication and Teamwork: Communication practices within the site and between the site and sponsor, Clinical Research Organization, and regulators, and teamwork skills necessary for conducting a clinical trial.

Goldstein [Reference Goldstein9] advised that the development of any educational program be “undertaken by individuals skilled in instructional design and curriculum development [and be built] upon the principles of adult learning” [Reference Calvin-Naylor10]. In addition, the CTSA Coordinator Taskforce recommended that “institutions conduct a gap analysis to determine areas of weakness or additional needs in CRC training. This effort should include a focus on CRC core competencies” [Reference Speicher7]. Few studies have examined the teaching strategies or training methods for delivering JTF Clinical Trial Competency content [Reference Speicher7, Reference Jones11]. The purpose of this study was to assess CRCs’ preferences for receiving JTF competency content through training methods, grounded in a learning theory lens, to better inform the subsequent design of competency-driven CRC training.

This learning lens refers to Malcolm Knowles’ adult learning theory, andragogy, introduced in the early 1970s. This theory provided advancements in the field of adult education [Reference Lomax and Hahs-Vaughn12]. According to Knowles et al., “andragogy is a core set of adult learning principles. The six principles of andragogy are: the learner’s need to know; self-concept of the learner; prior experience of the learner; readiness to learn; orientation to learning; and motivation to learn… [and] andragogy is preferred in practice when it is adapted to fit the uniqueness of the learners and the learning environment” [Reference Lomax and Hahs-Vaughn12, pp. 4–5].

Methods

This study utilized quantitative methodology. The authors inductively adapted the classification of training delivery methods derived from Jones et al.’s [Reference Jones11] and Speicher et al.’s [Reference Speicher7] studies based on field experience at our site in tandem with the competency domains. For our study, we decided to include 5 categories: (1) mentoring or coaching, (2) online text-based training, (3) online video-based training, (4) live lecture, and (5) flipped classroom model. The 8 competency domains reflected the JTF framework referenced above.

The researcher-constructed survey comprised 45 items. Five items measured participant demographic background, including age, gender, highest degree, years of being a CRC, and department affiliation. The remaining 40 questions addressed the 8 domains of CRC core competencies crossed with the 5 training delivery methods. For each training method, participants were asked to indicate how likely they would be to enroll in a training course that employed the given training method to facilitate their mastery of a competency domain, repeating the process for all 8 domains of the JTF core competencies. For example, one question asked: “Please indicate the extent to which you like or unlike the Mentoring or Coaching to deliver knowledge of scientific concepts related to the design and analysis of clinical trials?” These items were scored using a 7-point Likert scale (7=extremely like, 1=extremely unlike) (see Table 1).

Purposeful, non-probability sampling was used. Individuals (n=160) who worked as CRCs at a single research-intensive university in the southeastern United States were invited to participate in this study. Data were collected online via Qualtrics between November and December 2016. The university’s Institutional Review Board approved this study (IRB201601579). The data were analyzed using SPSS (version 24). Repeated measures analysis of variance (ANOVA) and split-plot ANOVA were adopted to compare the group means among the 5 training delivery methods. The repeated measures ANOVA examined which training delivery methods CRCs preferred within each of the aforementioned 8 domains. Because the variances of the differences between all combinations of related groups were unequal, the sphericity assumption was violated. Therefore, the lower-bound correction (the lowest possible theoretical value of epsilon) was adopted to produce a more valid critical F-value and to reduce the potential inflation of the type I error rate [Reference Lomax and Hahs-Vaughn12]. The split-plot ANOVA examined whether demographic variables produced different preference patterns across the training methods [13]. The statistical null hypothesis was assumed for all research questions, and the significance level was set at α=0.05 for all analyses. Pairwise deletion was used to handle missing data.
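For illustration only (the authors conducted the analysis in SPSS), the repeated measures ANOVA with the lower-bound sphericity correction described above can be sketched in Python. The rating matrix, sample sizes, and function name below are hypothetical:

```python
import numpy as np
from scipy import stats

def rm_anova_lower_bound(ratings):
    """One-way repeated measures ANOVA with the lower-bound
    sphericity correction (epsilon = 1 / (k - 1)).

    ratings: (n_subjects, k_conditions) array of Likert scores.
    Returns the F statistic and the lower-bound-corrected p-value.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    # Partition the total sum of squares.
    ss_cond = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
    ss_subj = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_error = ss_total - ss_cond - ss_subj
    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    f_stat = (ss_cond / df_cond) / (ss_error / df_error)
    # Lower-bound correction: scale both df by eps = 1/(k-1), yielding
    # the most conservative test, with df = (1, n - 1).
    eps = 1.0 / (k - 1)
    p_corrected = stats.f.sf(f_stat, df_cond * eps, df_error * eps)
    return f_stat, p_corrected

# Hypothetical ratings: 4 coordinators x 5 delivery methods (1-7 scale).
ratings = np.array([
    [4, 6, 7, 5, 5],
    [3, 5, 6, 6, 4],
    [2, 5, 7, 5, 5],
    [4, 4, 6, 6, 5],
], dtype=float)
f_stat, p = rm_anova_lower_bound(ratings)
print(f"F = {f_stat:.3f}, lower-bound p = {p:.3f}")
```

Because the lower-bound epsilon is the smallest value sphericity-corrected epsilon can take, a result that remains significant under this correction is robust to any degree of sphericity violation.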

Table 1 Questionnaire

Results

In total, 160 active CRCs were invited to participate in this study. Of those, 87 responded, for a response rate of 54.4%. Demographic information, including gender, highest degree, age, years of being a CRC, and department affiliation, is shown in Table 2. On average, coordinators reported that they slightly or moderately liked the 5 selected training delivery methods for conveying requisite information related to each of the 8 competency content domains (see Table 3). Overall, participants reported a preference for online video-based training (mean=45.41, SD=9.55) and least preferred mentoring or coaching (mean=40.89, SD=12.69). For knowledge of scientific concepts related to the design and analysis of clinical trials (mean=5.84, SD=1.34), study and site management (mean=5.71, SD=1.48), and leadership and professionalism (mean=5.68, SD=1.45), the coordinators preferred live lecture. For ethical and participant safety considerations (mean=5.77, SD=1.35), medicines development and regulation (mean=5.83, SD=1.32), clinical trials operations (mean=5.86, SD=1.17), data management and informatics (mean=5.69, SD=1.44), and communication and teamwork (mean=5.52, SD=1.49), the coordinators preferred online video-based training.

Table 2 Overview of demographics (n=87)

CRC, clinical research coordinator.

Table 3 Mean and SD of platform by domain (n=87)

* The overall score is the sum across the 8 domains (range 8–56). A higher overall score indicates that the coordinators believe the specific platform works better in general.

Results of the 1-way repeated measures ANOVA showed statistically significant differences (see Table 4) in CRCs’ preferences with respect to ethical and participant safety considerations (ηp²=0.065, F(1, 85)=5.917, p=0.017), medicines development and regulation (ηp²=0.091, F(1, 85)=8.509, p=0.005), and clinical trials operations (ηp²=0.059, F(1, 86)=5.375, p=0.023). For the remaining competency domains (scientific concepts and research design, study and site management, data management and informatics, leadership and professionalism, and communication and teamwork), there were no statistically significant preference differences among the 5 training delivery methods.

Table 4 Selected repeated measures analysis of variance and posthoc result of training method comparison on each competency domain (n=87)

* 1, Mentoring or coaching; 2, online text-based training; 3, online video-based training; 4, live lecture; 5, flipped classroom.

The posthoc analysis of ethical and participant safety considerations (Table 4) indicated statistically significant differences in the CRCs’ preferences between mentoring or coaching and online video-based training (mean difference [MD]=1.023, SE=0.260, p=0.002), between mentoring or coaching and live lecture (MD=0.895, SE=0.249, p=0.005), and between mentoring or coaching and flipped classroom (MD=0.581, SE=0.197, p=0.040). There were no statistically significant differences among the other pairs. In other words, for learning ethical and participant safety considerations, the coordinators preferred mentoring or coaching least relative to online video-based training, live lecture, and flipped classroom; apart from mentoring or coaching, the remaining 4 training delivery methods were equally preferable for this domain.

The posthoc analysis of medicines development and regulation (Table 4) indicated statistically significant differences in the coordinators’ preferences between mentoring or coaching and online video-based training (MD=1.198, SE=0.227, p<0.001), between mentoring or coaching and live lecture (MD=0.872, SE=0.217, p=0.001), between mentoring or coaching and flipped classroom (MD=0.593, SE=0.202, p=0.043), between online text-based training and online video-based training (MD=0.523, SE=0.171, p=0.029), and between online video-based training and flipped classroom (MD=0.581, SE=0.197, p=0.040). There were no statistically significant differences among the other pairs. In other words, for the medicines development and regulation domain, the coordinators preferred online video-based training and live lecture over mentoring or coaching, and preferred online video-based training over online text-based training and the flipped classroom. The posthoc analysis of clinical trials operations (Table 4) indicated statistically significant differences in the coordinators’ preferences between mentoring or coaching and online video-based training (MD=1.011, SE=0.246, p=0.001) and between online text-based training and online video-based training (MD=0.540, SE=0.154, p=0.007). There were no statistically significant differences among the other pairs. In particular, when studying clinical trials operations, the coordinators preferred the mentoring or coaching and online text-based training delivery methods less than online video-based training. The split-plot ANOVA suggested no differing preference patterns on the training delivery methods across the demographic variables, including gender, highest degree, age, years of being a CRC, and department affiliation.
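As an illustrative sketch (not the authors’ SPSS procedure), posthoc pairwise comparisons of this kind can be expressed as paired t tests over all method pairs with a familywise adjustment; the data, labels, and function name below are hypothetical, and a Bonferroni adjustment is shown as one common choice:

```python
from itertools import combinations
import numpy as np
from scipy import stats

def pairwise_posthoc(ratings, labels):
    """Bonferroni-adjusted paired t tests between all method pairs.

    ratings: (n_subjects, k_methods) array; labels: k method names.
    Returns a list of (pair, mean_difference, adjusted_p) tuples.
    """
    k = ratings.shape[1]
    n_pairs = k * (k - 1) // 2          # number of comparisons in the family
    results = []
    for i, j in combinations(range(k), 2):
        t, p = stats.ttest_rel(ratings[:, i], ratings[:, j])
        md = float(ratings[:, i].mean() - ratings[:, j].mean())
        p_adj = min(p * n_pairs, 1.0)   # Bonferroni adjustment
        results.append(((labels[i], labels[j]), md, p_adj))
    return results

methods = ["mentoring", "online text", "online video", "lecture", "flipped"]
rng = np.random.default_rng(0)
# Hypothetical 1-7 ratings for 20 coordinators across the 5 methods.
ratings = rng.integers(1, 8, size=(20, 5)).astype(float)
for pair, md, p_adj in pairwise_posthoc(ratings, methods):
    print(f"{pair[0]} vs {pair[1]}: MD={md:+.2f}, adjusted p={p_adj:.3f}")
```

With 5 delivery methods there are 10 pairwise comparisons, which is why an adjustment for multiple testing is needed before declaring any single pair significant.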

Discussion

The researchers explored CRC participant preferences for training delivery methods across 8 standard content domains and examined the degree to which demographic variables were related to those preferences. Overall, participants reported a preference for online video, whereas mentoring or coaching was least preferred. There were statistical differences in the delivery methods by selected content domains, including Ethical and Participant Safety Considerations, Medicines Development and Regulation, and Clinical Trials Operations. No significant differences across delivery methods and content domains by demographic variables were observed. This is the first study that we are aware of that has quantitatively assessed participant preferences for content delivery methods across the 8 JTF competency domains.

Limitations of this study include the use of a convenience sample and the inherent potential for social desirability bias in self-report surveys. The generalizability of the findings is limited to the participants in this study.

Other researchers have studied CRC preferences for training delivery methods. Speicher et al. [Reference Speicher7] asked participants to indicate which types of training delivery they wanted to provide for newly hired CRCs and found that mentorship, online training modules, orientation courses, conferences, and book trainings predominated. Jones et al. [Reference Jones11] asked CRCs to rate their preferences for teaching strategies including distance education (i.e., online, email); experiential learning opportunities; portfolio development; virtual clinical trial practicum; simulation, mock patients, and case studies in clinical research; opportunities to interact with international coordinators (via email); and a traditional classroom setting. Unlike the Jones and Speicher studies, we quantified participant preferences. In another study, when asked to indicate a preference for online or classroom learning, both novice and experienced CRCs preferred classroom learning [13]; notably, the options for teaching strategies in that study were more limited. Findings from the present study provide insight into participant preferences for training delivery methods related to clinical research competency domains.

Participant preferences may be viewed through a learning theory lens to shed light on the preferred design of competency-driven CRC training. The data obtained in this study can be used to inform future instructional design of CRC competency training and professional development programs. For example, asking participants to rate the training delivery methods is compatible with Malcolm Knowles’ adult learning theory, andragogy [14]. The CRCs’ ratings of the possible training delivery methods in this study can be viewed as a form of practicing andragogy.

It was perhaps somewhat surprising that the mentoring or coaching training method was less preferred, given the myriad benefits attributed to mentoring [15]. However, the CRC field lacks a history and culture of formal mentoring. Developing a peer-to-peer support network, like the Mentor Academy programs [16] that exist in some CTSAs, may therefore be advisable. At this institution, programmatic efforts have been undertaken to develop a peer-to-peer support network [17] and to use hybridized content delivery. Drawing upon these experiences, the researchers have found that an approach that combines classroom and online learning embedded within a community network may hold the most promise for standardizing CRC training [14, 17, 18, 19].

The Enhancing Clinical Research Professionals’ Training and Qualifications (ECRPTQ) National Center for Advancing Translational Sciences’ supplement project [Reference Shanely, Masour and Baron20] identified at least 334 training courses in various formats across emerging training platforms. However, it is critical to understand what it means to provide essential training and to ensure that, regardless of the platform, model, or delivery method used, trainings are firmly linked to a meaningful integration of established core competencies. Other CTSA institutions should look to the JTF for Clinical Trial Competency conceptual framework [Reference Sonstein8] to import competency language into local educational and training initiatives and then establish a common competency framework across the consortium. Notably, this framework has been used to define professional competency across the clinical research enterprise. Subsequently, CTSA investigators in ECRPTQ established and vetted a set of standards. Although the consortium of CTSA sites impacts diverse audiences, they are linked together by a common focus: excellence in clinical research. Findings from our study offer guidance to those charged with developing training for CRCs.

Conclusion

The results of our survey reveal participants’ desire for online video offerings for some competency content. This observation has been influential in promoting the expansion of our training delivery portfolio. Currently, we are developing more online video content and peer-to-peer support networks, and we are collaborating with instructional design experts in our training and development office to provide hybrid certification classes. Using the findings from this study as a guide, we plan to develop a suite of online training videos that will eventually span the 8 JTF domains while locally contextualizing their application. The goal is to convey that the role of a research coordinator is grounded in a facile grasp of what it means to conduct clinical research in a safe, competent, and compliant manner, situated in the framework of the JTF Clinical Research Competencies. These online videos will enrich our hybrid classroom experiences across our peer-to-peer mentoring and support networks as we continue to combine classroom and online learning while remaining cognizant of the cultural and workplace-specific needs embedded within our community network.

Financial Support

Research reported in this publication was supported by the University of Florida Clinical and Translational Science Institute, which is supported in part by the NIH National Center for Advancing Translational Sciences under award number UL1TR001427. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Acknowledgments

The authors would like to acknowledge the support of the University of Florida Clinical and Translational Science Institute (CTSI).

Disclosures

The authors have no conflicts of interest to declare.

References

1. Woodin, K. The CRC’s Guide to Coordinating Clinical Research, 3rd edition. Boston, MA: Thomson CenterWatch, 2016.
2. Zerhouni, EA. Translational and clinical science—time for a new vision. New England Journal of Medicine 2005; 353: 1621–1623.
3. World Medical Association. World Medical Association Declaration of Helsinki: ethical principles for medical research involving human subjects. JAMA 2013; 310: 2191–2194.
4. Rosenberg, RN. Translating biomedical research to the bedside: a national crisis and a call to action. JAMA 2003; 289: 1305–1306.
5. Sparrow, MK. The Regulatory Craft: Controlling Risks, Solving Problems, and Managing Compliance. Washington, DC: Brookings Institution Press, 2011.
6. Sung, NS, et al. Central challenges facing the national clinical research enterprise. JAMA 2003; 289: 1278–1287.
7. Speicher, LA, et al. The critical need for academic health centers to assess the training, support, and career development requirements of clinical research coordinators: recommendations from the clinical and translational science award research coordinator taskforce. Clinical and Translational Science 2012; 5: 470–475.
8. Sonstein, SA, et al. Moving from compliance to competency: a harmonized core competency framework for the clinical research professional. Clinical Research 2014; 28: 17–23.
9. Goldstein, IL. Training in Organizations: Needs Assessment, Development, and Evaluation. Pacific Grove, CA: Thomson Brooks/Cole Publishing Co, 1993.
10. Calvin-Naylor, NA, et al. Education and training of clinical and translational study investigators and research coordinators: a competency-based approach. Journal of Clinical and Translational Science 2017; 1: 110.
11. Jones, CT, et al. Education and training preferences of clinical research managers. Research Practitioner 2008; 9: 202–214.
12. Lomax, RG, Hahs-Vaughn, DL. An Introduction to Statistical Concepts, 3rd edition. New York, NY: Taylor & Francis Group, 2012, p. 503.
13. Behar-Horenstein, LS, Potter, J, Prikhidko, A, Swords, S, Sonstein, S, Kolb, HR. Training impact on novice and experienced research coordinators. The Qualitative Report 2017; 22: 3118–3138.
14. Knowles, MS, Holton, EF, Swanson, RA. The Adult Learner: The Definitive Classic in Adult Education and Human Resource Development. New York, NY: Routledge, 2014.
15. Behar-Horenstein, LS, Prikhidko, A. Exploring mentoring in the context of team science. Mentoring & Tutoring 2017; 25: 430–454.
16. Behar-Horenstein, LS, Feng, X, Prikhidko, A, Su, Y, Kuang, H, Roger, B, Fillingim, RB. Assessing mentor academy program effectiveness using mixed methods. Under review.
17. Solberg, L, Kolb, HR, Prikhidko, A, Behar-Horenstein, LS. Ensuring representativeness in competencies for research coordinators. Clinical Researcher 2018; 32.
18. Behar-Horenstein, LS, Prikhidko, A, Kolb, HR. Advancing the practice of CRCs: why professional development matters. Therapeutic Innovation and Regulatory Science 2018: 1–10.
19. Behar-Horenstein, LS, Baiwa, W, Kolb, HR, Prikhidko, A. A mixed method approach to assessing online dominate GCP training platforms. The Clinical Researcher 2017; 31: 38–42.
20. Shanely, T, Masour, G, Baron, R. Enhancing clinical research professionals’ training and qualifications (ECRPTQ) competency assessments [Internet], 2015 [cited Apr 7, 2016]. (http://www.ctsa-gcp.org/uploads/3/9/2/5/39256889/ecrptq_assessments_103015_a2.pdf)