
Updating and evaluating a research best practices training course for social and behavioral research professionals

Published online by Cambridge University Press:  27 December 2023

Elias Samuels*
Affiliation:
Michigan Institute of Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA
Mary R. Janevic
Affiliation:
Department of Health Behavior and Health Education, School of Public Health, University of Michigan, Ann Arbor, MI, USA; Department of Physical Medicine and Rehabilitation, University of Michigan, Ann Arbor, MI, USA
Alexandra E. Harper
Affiliation:
Department of Physical Medicine and Rehabilitation, University of Michigan, Ann Arbor, MI, USA
Angela K. Lyden
Affiliation:
Michigan Institute of Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA; Clinical Trials Support Office, University of Michigan, Ann Arbor, MI, USA
Gina M. Jay
Affiliation:
Department of Physical Medicine and Rehabilitation, University of Michigan, Ann Arbor, MI, USA
Ellen Champagne
Affiliation:
Michigan Institute of Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA
Susan L. Murphy
Affiliation:
Michigan Institute of Clinical and Health Research, University of Michigan, Ann Arbor, MI, USA; Department of Health Behavior and Health Education, School of Public Health, University of Michigan, Ann Arbor, MI, USA
*
Corresponding author: E. Samuels, PhD; Email: eliasms@umich.edu

Abstract

Introduction:

The clinical and translational research workforce involved in social and behavioral research (SBR) needs to keep pace with clinical research guidance and regulations. Updated information and a new module on community and stakeholder engagement were added to an existing SBR training course. This article presents evaluation findings of the updated course for the Social and Behavioral Workforce.

Methods and Materials:

Participants working across one university were recruited. Course completers were sent an online survey to evaluate the training. Some participants were invited to join a focus group to discuss the application of the training to their work. We performed descriptive statistics and conducted a qualitative analysis of the focus group data.

Results:

There were 99 participants from diverse backgrounds who completed the survey. Most reported the training was relevant to their work or that of the study teams they worked with. Almost half (46%) indicated they would work differently after participating. Respondents with community or stakeholder engaged research experience vs. those without were more likely to report that the new module was relevant to study teams they worked with (t = 5.61, p = 0.001), and that they would work differently following the training (t = 2.63, p = 0.01). Open-ended survey responses (n = 99) and focus group (n = 12) data showed how participants felt their work would be affected by the training.

Conclusion:

The updated course was rated highly, particularly by those whose work was related to the new course content. This course provides an up-to-date resource for the training and development of the Social and Behavioral Workforce.

Type
Research Article
Creative Commons
CC BY
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of The Association for Clinical and Translational Science

Introduction

The training needs of the clinical and translational research workforce are evolving to include new scientific areas and good clinical practices (GCP). More specialized training is particularly needed for the Social and Behavioral Workforce working under the clinical and translational research umbrella [1–5]. Progress has been made defining the important elements of health research training and using robust evaluation methods to assess the impact of training programs on the work of scholars, trainees, and research staff [6–10]. Recent empirical research also demonstrates an emerging need to integrate new course content on community and stakeholder engagement into existing training programs [11–13]. Equipping the Social and Behavioral Workforce with knowledge about community and stakeholder engagement can facilitate collaborations with clinical and basic scientists, community partners, patients, and healthcare providers [14]. The aim of this study was to detail the updates made to a social and behavioral research training course and to evaluate the participant experience and impact of completing the course.

Background

In 2016, the National Institutes of Health (NIH) issued a policy requiring NIH-funded researchers involved in clinical trials to complete GCP training and identified various training opportunities for the workforce to meet the new requirement [15]. The original Social and Behavioral Research Best Practices course was developed in 2016 by a team at the University of Michigan (U-M) to educate clinical and translational researchers to apply GCP principles to social and behavioral research [16]. This training course, which took a median of 3.2 hours to complete, contained modules on key topics including research protocols, participant recruitment and retention, informed consent communication, confidentiality and privacy, participant safety and adverse event reporting, quality control and assurance, and research misconduct. A description of the development and evaluation of the original course is available elsewhere [16]. The NIH Office of Behavioral and Social Sciences Research included this course in online training resources offered to the health research workforce [17]. The Collaborative Institutional Training Initiative (CITI) program also offered the training course for health researchers needing an advanced refresher course in GCP [18]. At U-M, the original course was offered through the university’s learning management system (LMS) from 2016 through 2020, during which time 793 individuals completed the course.

In 2021, researchers at U-M revised the GCP course to provide updated information on research regulations, practice, and community and stakeholder engagement. This update was also made in response to the need to train the Social and Behavioral Workforce in community and stakeholder engagement methods [14]. The team working on this update included U-M faculty and research staff, including those with expertise in community-engaged research, as well as subject matter experts working at the NIH. The team revised the course between 2021 and 2022 and evaluated it from 2022 to 2023.

This paper describes the updates to the course and its evaluation, as illustrated by the timeline shown in Fig. 1. The course was updated to: 1) add training on regulatory changes and reporting requirements, including guidance for clinical trials, 2) incorporate features enhancing the accessibility of the course, and 3) develop a new module on Community and Stakeholder Engagement (CASE). We use the term “community and stakeholder engagement” to refer broadly to well-established methods of community engagement as well as engagement with other stakeholder groups, including participants in clinical and translational research studies. Focus groups were conducted to inform the course updates. Our team then evaluated learners’ experiences using data extracted from the U-M LMS and from participant surveys and focus groups about the impact of the course on their work. We also examined whether responses varied by learner characteristics, including past research experience, professional credentials, and demographic backgrounds (participants’ age, sex, race, and ethnicity), to test our hypothesis that completing the updated training course would have a positive impact on all members of the Social and Behavioral Workforce, including those with and without community and stakeholder engagement experience.

Figure 1. Timeline of updates made to the SBR course and of the course evaluation.

Materials and methods

Updating the training course

Content

The U-M study team met bi-weekly or monthly starting in 2021. The first eight months of the project were dedicated to reviewing the course to identify necessary updates, drafting new course materials, and redesigning the functionality of the training modules. The team went module by module to determine the updates needed to the course content. In general, the team wanted to ensure the recommended practices reflected updated regulatory guidance, appropriate terminology, and new developments in technology. The content changes to specific modules are outlined in Table 1. These updates included improving the quality of the knowledge checks and the exam embedded in several modules of the course. For the knowledge checks and exam questions, we examined data from learners who completed the original course to determine whether any questions appeared confusing or difficult to answer based on their responses. We reworded questions that appeared ambiguous and ensured that questions reflected appropriate terminology and scenarios.
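
To illustrate this item-review step, the following is a minimal sketch, not the authors' actual procedure or data, of how per-question percent-correct rates from prior course attempts could be computed to flag items for possible rewording. The file name, column names, and the 70% threshold are hypothetical.

```python
# Hypothetical sketch of flagging knowledge-check and exam questions that
# learners found difficult, using per-item percent-correct from prior attempts.
# File and column names are assumptions, not the study's actual data.
import pandas as pd

# Expected columns: item_id (question identifier), correct (1 if answered correctly, else 0)
responses = pd.read_csv("original_course_item_responses.csv")

item_stats = (
    responses.groupby("item_id")["correct"]
    .agg(pct_correct="mean", n="count")
)

# Items answered correctly by fewer than ~70% of learners become candidates for
# review for ambiguous wording or outdated terminology and scenarios.
flagged = item_stats[item_stats["pct_correct"] < 0.70].sort_values("pct_correct")
print(flagged)
```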

Table 1. Updates for the social and behavioral research training course by module

Presentation and functionality

An instructional design company helped redesign the existing training modules and develop the new CASE module. This company redesigned videos, graphics, text, and course functionality. The course was made more accessible to learners with visual impairments. We also changed how the course manual was used: in the original course, this manual appeared in each module; in the updated course, we kept the revised manual in the resources section but no longer referenced it as an interactive part of each module. We developed “key principles” to remember at the end of each module and added the anticipated time to complete each module. The structure of the modules, including the CASE module, was changed to accommodate the additional information. An example of the structure is provided in Table 2.

Table 2. Community and stakeholder engagement module structure

Developing the CASE module

The new CASE module was designed to be relevant to members of the Social and Behavioral Research Workforce, whether or not they are directly involved in community-engaged or patient-centered research. However, we hypothesized that participants involved in research most relevant to the content of the CASE module would find it more impactful than those not involved in this type of research. This hypothesis is based on well-established theories of adult learning and organizational change. Sensemaking theory holds that how people understand new or different information is greatly affected by the information with which they are already familiar [19–23]. For this reason, we hypothesized that participants with experience conducting community and stakeholder engaged research would report the course to be more relevant and impactful than participants without such experience.

Subject matter expert participation

Focus group participants were selected by the authors based on their subject matter expertise and were invited to share feedback on the development of the CASE module through virtually conducted focus groups. Participants were individuals employed as faculty or researchers in eight states, including Michigan, all of whom had considerable experience conducting and teaching community-engaged research. They had decades of expertise in academic-community partnerships, community-engaged research, health disparities, institutional review boards, research coordination, research regulation, patient engagement, patient-centered research, public health, workforce development, and underserved populations in health care.

Two focus groups were conducted in May 2022, with seven and five individuals participating, respectively. Participants were first prompted to describe the benefits of and challenges to engaging communities in social and behavioral research. This was followed by a brief presentation about the background of the new CASE training module developed by the team, the structure of which is shown in Table 2, and a review of its seven learning objectives: (1) describing how research can engage communities to help reduce inequities, (2) explaining the benefits of community and stakeholder participation in research, (3) showing how the history of mistreatment of research participants can affect today’s community-engaged partnerships, (4) presenting various approaches for effectively engaging communities and other stakeholders in research, (5) naming common challenges of engaging community partners and other stakeholders in research, and possible solutions, (6) describing the role and functions of a Community and Stakeholder Advisory Board across study phases, and (7) summarizing best practices that can be used to engage community partners and stakeholders in research.

Focus group participants were then asked to join in an open discussion about three key topics: 1) what participants liked about the topics addressed in the community engagement module; 2) the aspects or sections of the training course that were most important; and 3) what key information or resources were missing from the module.

Transcripts of the focus groups were reviewed by the team and used to redesign the community engagement module by reframing key information and adding information and resources about community engagement to the course. For example, additional resources about community advisory boards were included, and figures depicting models of community engagement were restyled to make them easier to understand. The redesigned course was then made available to U-M employees through the university’s LMS in August 2022.

Evaluating the updated training course

Participants

Data were collected from participating social and behavioral research professionals working in health research at U-M’s campuses in Ann Arbor, Dearborn, and Flint, Michigan. Multiple strategies were used to recruit U-M faculty and staff on all three campuses to participate in the redesigned course. Participants included individuals who had completed the original iteration of the course as well as individuals who had never taken any iteration of the course. All U-M employees who had completed the original course through the U-M LMS before 2018 were identified (N = 793); of these, 534 were still employed at U-M at the time of this study and received email invitations to participate in the course and an evaluation survey administered online using Qualtrics. The course and survey invitations were personalized. In parallel, the course and survey were advertised to department chairs at U-M schools and colleges on all three campuses. Promotions were included in newsletters, social media, and special communications sent to researchers across the university from November 2022 through April 2023 by the Michigan Institute for Clinical and Health Research (MICHR).

Low participation rates in the first weeks of the study period motivated the team to use financial incentives for participant recruitment. To promote participation in the course evaluation, a $50 incentive was provided to all course completers who finished the survey. To ensure equity of treatment, participants who finished the survey before the incentive was offered were also contacted and offered $50 for their involvement.

Methods

Another focus group was conducted in December 2022 to discuss participants’ understanding of how they could utilize the training course in practice, particularly within study teams planning or implementing Community-Based Participatory Research (CBPR) approaches, patient-oriented research, and other community-engaged research practices [24]. Participants were recruited from U-M staff who volunteered for a standing working group at MICHR to advance workforce development initiatives. This working group consisted of 12 health research staff employed in a variety of schools, colleges, and administrative units on U-M’s Ann Arbor campus, all of whom had taken the updated training and participated in the evaluation survey. We asked a series of questions about the participants’ experiences of attending the training course as well as the potential of applying the lessons learned to advance the professional development of their study teams conducting social and behavioral research. A recording of the focus group was transcribed and analyzed.

Evaluation outcome measures

Training outcomes were measured via the participants’ experiences, context-based learning, and behaviors, as well as their perceptions of the impact of the training overall [25,26]. These outcomes included measures of: 1) the relevance of the overall training, 2) the relevance of the CASE module specifically, 3) whether participants’ colleagues care about the issues the training addresses, 4) the need for this training among the health research workforce, and 5) the intent of participants to work differently as a result of the training course. As a measure of their satisfaction with the course, we asked whether participants would recommend the course to their colleagues and/or to partnering organizations and groups. All training outcomes were measured on the same Likert scale of agreement, in which 1 = Strongly disagree, 2 = Somewhat disagree, 3 = Neither disagree nor agree, 4 = Somewhat agree, and 5 = Strongly agree. Participants were also asked in open-ended survey questions to describe ways in which the course was useful and how they could use the resources provided in the course individually and within their study teams.
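
As a minimal illustration of how responses on this scale could be coded and summarized, the sketch below maps the agreement labels to the 1-5 values above and computes a mean, standard deviation, and the share of respondents who strongly agree for one outcome item. The file and column names are hypothetical, not the study's instrument.

```python
# Hypothetical sketch: numerically coding the five-point Likert agreement scale
# and summarizing one training-outcome item. Not the authors' analysis code.
import pandas as pd

LIKERT = {
    "Strongly disagree": 1,
    "Somewhat disagree": 2,
    "Neither disagree nor agree": 3,
    "Somewhat agree": 4,
    "Strongly agree": 5,
}

survey = pd.read_csv("course_evaluation_survey.csv")  # hypothetical survey export

# Code the label responses for one outcome item (e.g., overall relevance) as 1-5.
survey["relevance_overall"] = survey["relevance_overall"].map(LIKERT)

mean = survey["relevance_overall"].mean()
sd = survey["relevance_overall"].std()
pct_strongly_agree = (survey["relevance_overall"] == 5).mean()

print(f"Mean = {mean:.1f}, SD = {sd:.1f}, strongly agree = {pct_strongly_agree:.0%}")
```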

Survey data analysis

Participant data indicating course completion, the amount of time taken to complete each module, and performance on the embedded exam were extracted from U-M’s LMS. These data were aggregated and compared between participants who completed the survey and those who did not. Descriptive statistics were used to compare different participant groups’ experiences of the updated course. Responses to open-ended questions were analyzed qualitatively to identify representative testimonials of impact.

Survey data were collected on participants’ demographic characteristics and professional backgrounds, including their roles, certifications, degrees, workplace settings, experience with social and behavioral research, and work with community-engaged or patient-centered research. We examined how participants’ perceptions of the relevance of the entire course to their work varied between participants with and without experience conducting community-engaged research. Participants with research experience relevant to community and stakeholder engagement were identified as those individuals who indicated either that (a) they work on a study team conducting community-engaged or patient-centered research or (b) they work on research studies at a community site as a paid worker or volunteer. ANOVAs, t-tests, and ranked correlations were used to identify statistically significant differences between participant groups based on their past experience conducting community and stakeholder-engaged research.
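
The sketch below shows, under the same hypothetical column names as above, how the group comparison and ranked correlation described here could be run: an independent-samples t-test comparing outcome ratings between participants with and without community and stakeholder engaged research experience, and a Spearman rank correlation between years of social and behavioral research experience and perceived relevance. It is an assumed illustration, not the authors' code.

```python
# Hypothetical sketch of the group comparison and ranked correlation described
# above, using scipy. Column names are assumptions; outcome items are assumed
# to be numerically coded 1-5 as in the earlier sketch.
import pandas as pd
from scipy import stats

survey = pd.read_csv("course_evaluation_survey.csv")  # hypothetical survey export

# Flag for community/stakeholder-engaged research experience (1 = yes, 0 = no).
has_cse = survey["cse_experience"] == 1

# t-test: relevance of the CASE module, with vs. without relevant experience.
case_relevance = survey["case_module_relevance"]
t, p = stats.ttest_ind(case_relevance[has_cse].dropna(),
                       case_relevance[~has_cse].dropna())
print(f"t = {t:.2f}, p = {p:.3f}")

# Spearman rank correlation: years of SBR experience vs. perceived relevance
# of the course to participants' study teams.
rho, p_rho = stats.spearmanr(survey["years_sbr_experience"],
                             survey["team_relevance"],
                             nan_policy="omit")
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.3f}")
```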

Focus group data analysis

A semi-structured interview protocol was used for the 1-hour focus group, followed by analysis of the focus group recording and notes. Grounded theory was used to analyze the focus group results for themes concerning the use of the training resources within study teams [27]. The first author generated the resultant codes, which were reviewed by the study team; the team identified the main themes and determined that saturation was reached. The results were used to inform the conclusions about the evaluation of the course.

Results

A total of 187 individuals completed the course between August 2022 and April 2023. Most individuals completed the training course in approximately four hours (Mean = 4.1 hours, SD = 3.4 hours, Median = 3.5 hours). On average, participants answered 89% of the exam questions correctly (SD = 3.4%). Every individual who completed the course was sent an invitation to take the survey, of whom 99 (53%) answered at least one question. The vast majority of survey participants (N = 93, 93.9%) completed over 90% of the survey form. The survey participants completed the course in roughly the same amount of time and with similar exam scores as the course participants considered as a whole (Mean course duration = 4.1 hours, SD = 3.2 hours, Median = 3.5 hours; Mean exam performance = 88.9% correct, SD = 3.4%).

Survey results

Participant characteristics

Of the survey participants, most identified as female (79%) and White (69%). On average, the respondents were 43 years of age (SD = 13.0). As shown in Table 3, a substantial proportion self-identified as members of underrepresented minority groups.

Table 3. Survey Participant Demographics

Notes:

  • Respondents were allowed to choose as many race categories as applicable. 90 individuals selected 95 categories, including the other racial groups they specified for themselves. Respondents who selected more than one race, including ‘Other’ selections, were recoded as “More than one race.”

  • 90 individuals identified themselves as one of two ethnic categories.

  • 92 individuals identified themselves as one of two sex categories.

  • 92 individuals identified themselves as one of three gender categories.

  • Respondents did not choose three additional categories: 1) Transgender man/trans man/female-to-male (FTM), 2) Transgender woman/trans woman/male-to-female (MTF), or 3) Additional gender category.

  • Respondents were allowed to choose as many underrepresented minority categories as applicable. 41 individuals selected 59 categories. The definitions of each category are detailed in the NIH’s notice of institutes’ interest in diversity [28].

Of note, 41% of participants considered themselves to be underrepresented in the extramural scientific workforce, as defined by the NIH [28]. Some identified as belonging to racial and ethnic groups that have been shown to be underrepresented in health-related sciences on a national basis (9%) or reported being from disadvantaged backgrounds (11%). A notable proportion identified as having a disability (23%). Some (16%) identified as women from the above backgrounds working at the graduate level and beyond in doctorate-granting research institutions or working at senior and other faculty levels in biomedical-relevant disciplines.

Participants’ professional experiences were also diverse (Table 4). While two-thirds were research staff and administrators (66%), a substantial proportion identified as postsecondary students (16%). A smaller proportion of respondents were faculty (9% university faculty and 4% clinicians). Most (91%) had earned postsecondary degrees, with many possessing a Master’s degree (41%) or Doctorate (16%). Some (12%) reported certification by the Society of Clinical Research Associates (SOCRA) (5%), the Association of Clinical Research Professionals (ACRP) (2%), or another professional certification (5%). Of the 99 individuals who responded to the evaluation survey, 16 (16%) had completed the original version of the course in the past as well as the updated version of the course evaluated here.

Table 4. Survey participant professional experiences

Notes:

  • Respondents were allowed to choose as many roles as applicable. 93 individuals selected 123 roles, including other self-specified roles.

  • Respondents were allowed to choose as many certification categories as applicable. 12 individuals selected 12 categories. Other specified options all included references to professional credentials or training certifications.

  • Respondents were allowed to choose as many credential categories as applicable. 91 individuals selected 117 categories. Other Professional degrees include: 1) Current MA student, 2) Current OTD student, 3) Graduate certificate.

  • Respondents were allowed to choose as many work location categories as applicable. 93 individuals selected 128 categories. Other specified locations included a reference to outpatient clinics.

  • Respondents were allowed to choose as many current work categories as applicable. 91 individuals selected 128 categories. Other specified options include references to different types of research work, including bioinformatics research, cancer research, qualitative research, and outcomes research.

Participants were asked to identify their work settings and the types of research in which they are engaged. Most reported that their work settings included a research university (68%) or an academic medical center (42%). Also, 12% reported working in community-based organizations or community settings. Few reported working at governmental agencies (3%), and 2% also reported “other” work settings. Notably, 43% of participants reported having worked on a study team conducting community-engaged or patient-centered research. Most (69%) worked on clinical, translational, or social and behavioral research studies. On average, the participants had worked on social and behavioral research studies for over 5 years (Mean = 5.9, SD = 7.3).

Survey evaluation outcomes

Most participants strongly agreed that the course was relevant to their own work (69%) or that it was relevant to the study teams with whom they work (63%). On average, participants strongly agreed with both outcome statements (Mean = 4.6, SD = 0.8 and Mean = 4.6, SD = 0.6, respectively). Participants also agreed that the CASE module was relevant to the study teams with whom they work (Mean = 4.2, SD = 0.9).

The participants who indicated that the CASE module was relevant to their work were asked to explain their reasoning in an open-ended question. They responded with examples regarding their past and future work, including applications to their understanding of CBPR, the formation of community advisory boards, participant recruitment, and participant interactions involving vulnerable populations. Examples of respondents’ comments on the relevance of the training course to their past and future work follow, respectively:

“We interact and collaborate with community members and organizations to help guide and support our research study. In order to utilize the best practices available, we really count on our community advisory board members and focus group participants being as engaged in the research as possible. So, the strategies for community and stakeholder engagement were very useful for me.”

“I find that [the CASE module] is particularly relevant for the consideration of brainstorming and devising new, relevant, and interesting research questions by utilizing the resources and knowledgebase of the community. Additionally, the module provided useful ideas for how to engage and involve the community/stakeholders to enrich various aspects of research projects I may be working on in the future.”

The survey participants also strongly agreed that all members of the clinical and translational research workforce need this course to conduct social and behavioral research (Mean = 4.6, SD = 0.7). Only three respondents (3%) disagreed with this statement and one neither agreed nor disagreed. When asked if they would work differently as a result of having received the training, only 7 individuals disagreed, 42 respondents neither agreed nor disagreed, and just under half (46%) agreed that the course would cause them to work differently in the future (Mean = 3.5, SD = 0.8).

Those who indicated that they would work differently as a result of the training course were asked to describe how they expected to do so in an open-ended question. They responded with examples regarding their current and future behavior, including their adherence to GCP, use of best practices for patient interactions, increased awareness of the proper use of research protocols, and ability to generate new health research questions. Quotations of respondents’ examples of the application of the course to their current and future work follow, respectively:

“Many of the issues discussed in the training are relevant to the work of our research team, and we often discuss these issues. The training made certain aspects of our work more salient for me and had already led me to produce and share materials related specifically to our data security.”

“I will be even more conscious that the communities I interact with are experts in their own experience and [will] keep in mind that I remain a neutral party in conversation and emphasize the participants autonomy.”

The course was also evaluated using two measures of participants’ intention to recommend the training to their colleagues and to the organizations or groups with whom they work [29,30]. Substantial majorities agreed that they would recommend this course to their colleagues (80%) and to the organizations and groups with whom they work (76%). On average, they agreed that they would recommend their training experience both to individual colleagues and to the organizations and groups with whom they work (Mean = 4.4, SD = 0.8; Mean = 4.2, SD = 0.9, respectively).

Those who agreed that they would recommend the course to others were asked why they expected to do so in an open-ended question. Responses typically referred to the fundamental value of the training course, both as an introduction for those new to research and as a refresher for those with considerable experience. Examples of respondents’ reasoning behind their intention to recommend the training to individuals and to organizations or groups they work with follow, respectively:

“I will recommend this training to every new person working in research. This training is fundamental and every person who works on research needs to be trained on these topics. … There are new employees who do not have a research background and need the basic/fundamental training in order to perform well in their job.”

“I think this training would be really helpful for community orgs that are embarking on training with [the University]. Too often they really aren't aware of all that goes into research and making sure someone is giving full consent.”

Association between participant backgrounds and evaluation outcomes

Individuals with research experience relevant to community and stakeholder engagement were more likely to report that the CASE module was relevant to the study teams with whom they work than those participants without such research experience (t = 5.6, p = 0.001; Mean = 4.50 with relevant research experience vs. Mean = 3.89 without). These individuals were also more likely to indicate that they would work differently as a result of the training course considered overall (t = 2.6, p = 0.01; Mean = 3.73 with relevant research experience vs. Mean = 3.29 without).

A small positive ranked correlation was found between the number of years of experience participants reported working on social or behavioral research and the likelihood that they would agree the training course was relevant to the study teams with whom they work (N = 91, Spearman’s rho = 0.23, p = 0.03). No other significant differences were found based on participant demographics. The demographic categories tested included participants’ race, ethnicity, sex, gender, and underrepresented status in the clinical and translational research workforce. We did not hypothesize that there would be significant differences between participant demographic groups; instead, these analyses reflect the importance of acknowledging the inherent diversity of participants’ learning experiences [26,31].

Focus group results

The focus group participants reached three primary conclusions about the application of the course to the work of the clinical and translational research workforce. First, participants noted that the impact of the course could be enhanced by focusing on the content most relevant to particular participants or teams; for example, they discussed the possibility of using the CASE module content and resources as a training tool with study teams or other individuals conducting community-engaged research. Second, they articulated their belief that the training course can provide social and behavioral researchers with a common understanding of their work. Third, they concluded that the course would be most relevant to those who regularly engage in social and behavioral research.

Table 5 provides the counts of the most frequently used codes applied to the focus group transcript. The representative quotes in this table also demonstrate how participants imagined that colleagues’ shared participation in the course could foster a common community of practice and similar reactions to the training experience. This result is best illustrated by the following representative quote:

Table 5. Top codes for focus group on participants’ application of the course to practice

“Going back to that community of practice type thing… Having a [training] resource for other study team members that you can call and be like, “hey you want to come [and] watch our study, visit, or observe, or walk through our process with me?” versus having to ask a PI to say, “Hey, can you contact so and so? [Or] do you know anybody that might be able to do that?” But just having that built-in community and resource, I think, would be really, really beneficial.”

The representative quotes in Table 5 also show how participants understood the alignment of their training experience with their work and how the training reinforced their prior beliefs about that work. The quotes below respectively characterize these two aspects of participants’ training experience:

“One of the things that I appreciate about this entire [training] is that it is not something that is overly focused on FDA compliant trials. The fact that this is really social behavioral research and is focused on the kind of work that [study] teams generally do already makes it far more relevant and impactful …. I mean that’s what really makes it salient is the fact that it is for the kind of work we do.”

“I think [the training] would support what the faculty and what I have been trying to standardize across the center in terms of protocols. And it would give from project managers down to data collection staff or phlebotomist, or whoever’s on our team, an appreciation for the role. … I think that that’s really important to have that additional support.”

Discussion

This work details (1) the approach our team took to update a training course on best practices in social and behavioral research and (2) the results of an evaluation of participants’ experiences and of the impact of the acquired knowledge on their work. This approach enabled us to illuminate connections between the professional development of the Social and Behavioral Workforce and the contributions of this workforce to the advancement of clinical and translational research [32–35]. The results of our study suggest that participants outside our university system may also find the new training content relevant to their work. The feasibility and effectiveness of the process used to update this course suggest that incorporating best practices for community and stakeholder engagement into existing educational opportunities may be one strategy to prepare the workforce to conduct social and behavioral research in partnership with communities.

The results of this study suggest that this training module is broadly applicable to this research workforce. Community and stakeholder engagement in social and behavioral research advances translational science by involving typically underrepresented populations in research studies [36–38]. As such, standardizing participation in this course can contribute to the ability of the workforce to accelerate the translation of discoveries into interventions and policies that improve the health of all people. Future research should focus on the facilitators of and barriers to the long-term impact of this and similar trainings designed for the clinical and translational research workforce. Improving the quality of the training creates more opportunity to professionalize the clinical and translational research workforce [39,40]. Meeting the need for more impactful community-engaged health research depends on the capacity of these interdependent workforces to be efficiently trained and adequately prepared to conduct community-engaged research that is meaningful to researchers and communities [41–43].

The results of the study further suggest that this updated training course was acceptable to and valued by members of the Social and Behavioral Workforce, including those with professional experience in community and stakeholder engagement. In this respect, the updated course has the potential to contribute to the engagement of underrepresented minorities across this workforce. Future research may consider evaluating the impact of such training on study teams’ engagement with and inclusion of minority groups in their research studies [41–43]. Moreover, the presence and support of underrepresented minorities working in academic medical centers helps to ensure diverse research and mentorship experiences for junior investigators, helping to enhance critical representation in the workforce [44,45].

Our process and evaluation also demonstrate the feasibility of updating critical training resources for the Social and Behavioral Workforce. Although this work occurred only at the University of Michigan, experience in updating and improving training opportunities for researchers can enhance a university’s capacity to anticipate and adapt to advances in clinical and translational research and to sudden environmental changes, such as the onset of the COVID-19 pandemic [46,47]. The improvement of this training course also promotes institutional buy-in to research topics of importance to the broader public, notably including communities impacted by health disparities [48,49].

Limitations

This work has several limitations that should be kept in mind. We updated this training course to be relevant to the health research workforce but administered it within only one university system. The response rate to the survey was low but comparable to that of similar online surveys administered via email [50]. The use of a financial incentive may have had a disproportionate impact on the willingness to participate, although $50 to complete a 4-hour training course and evaluation seems appropriate; the incentive may also have systematically biased responses. The generalizability of this study is limited to one large research university, and the participating research professionals may differ from those working in other settings in ways that might affect the impact of the training experience. Moreover, the scope of this study could not be extended to include an evaluation of the long-term impact of the social and behavioral research training course on the work of the research workforce. Therefore, our conclusions are limited to short-term impacts.

Conclusion

This work presents evidence that this revised social and behavioral research training course is highly relevant to the growing proportion of the clinical and translational research workforce involved in community and stakeholder engagement. More broadly, this course provides a standard training that applies to all members of the clinical and translational research workforce engaged in social and behavioral research. The approach used to update and evaluate this training was effective and is reproducible.

Acknowledgments

We thank Michelle Culp, MPH; Melissa Riddle, PhD; Maria Brunette, PhD; Arijit Bhaumik, BA, CCRP; Reema Kadri, MLIS; Calia Morais, PhD; and Stacey L. Schepens Niemiec, PhD.

Funding statement

This project was supported by Clinical Translational Science Awards from the Michigan Institute of Clinical and Health Research (MICHR) (UL1TR002240 and UM1TR004404), with co-funding from the NIH Office of Behavioral and Social Sciences Research. During the conduct of this project, Dr Harper was supported by a postdoctoral fellowship award funded by the University of Michigan’s Advanced Rehabilitation Research Training Program in Community Living and Participation from NIDILRR, Administration for Community Living (grant # 90ARCP0003; PIs Murphy/Kratz).

Competing interests

The authors have no conflicts of interest.

References

1. Shanley, TP, Calvin-Naylor, NA, Divecha, R, et al. Enhancing clinical research professionals’ training and qualifications (ECRPTQ): recommendations for good clinical practice (GCP) training for investigators and study coordinators. J Clin Transl Sci. 2017;1(1):8–15. doi: 10.1017/cts.2016.1.
2. Murphy, SL, Byks-Jazayeri, C, Calvin-Naylor, N, et al. Best practices in social and behavioral research: report from Enhancing Clinical Research Professional’s Training and Qualifications Project. J Clin Transl Sci. 2017;1(1):26–32. doi: 10.1017/cts.2016.3.
3. Blavos, A, Kerr, D, Hancher-Rauch, H, Brookins-Fisher, J, Thompson, A. Faculty perceptions of certifications in health education and public health: implications for professional preparation. Pedag Health Promot. 2020;8(1):49–58. doi: 10.1177/2373379920938823.
4. Jones, CT, Jester, P, Croker, JA, et al. Creating and testing a GCP game in an asynchronous course environment: the game and future plans. J Clin Transl Sci. 2019;4(1):36–42. doi: 10.1017/cts.2019.423.
5. Reis, SE, Berglund, L, Bernard, GR, Califf, RM, FitzGerald, GA, Johnson, PC. Reengineering the National Clinical and Translational Research Enterprise: the strategic plan of the national clinical and translational science awards consortium. Acad Med. 2010;85(3):463–469. doi: 10.1097/acm.0b013e3181ccc877.
6. Rubio, DM, Schoenbaum, EE, Lee, LS, et al. Defining translational research: implications for training. Acad Med. 2010;85(3):470–475. doi: 10.1097/acm.0b013e3181ccd618.
7. Comeau, DL, Escoffery, C, Freedman, A, Ziegler, TR, Blumberg, HM. Improving clinical and translational research training: a qualitative evaluation of the Atlanta Clinical and Translational Science Institute KL2-Mentored Research Scholars Program. J Investig Med. 2017;65(1):23–31. doi: 10.1136/jim-2016-000143.
8. Samuels, EM, Ianni, PA, Eakin, B, Champagne, E, Ellingrod, V. A quasiexperimental evaluation of a clinical research training program. Perform Improv Q. 2023;36(1):4–13. doi: 10.56811/piq-20-0059.
9. Eakin, BL, Ianni, PA, Byks-Jazayeri, C, Ellingrod, VL, Woolford, SJ. Reimagining a summer research program during COVID: strategies for enhancing research workforce diversity. J Clin Transl Sci. 2022;6(1):1–7. doi: 10.1017/cts.2022.371.
10. Hornung, CA, Ianni, PA, Jones, CT, Samuels, EM, Ellingrod, VL. Indices of clinical research coordinators’ competence. J Clin Transl Sci. 2019;3(2-3):75–81. doi: 10.1017/cts.2019.381.
11. Passmore, SR, Farrar Edwards, D, Sorkness, CA, Esmond, S, Brasier, AR. Training needs of investigators and research team members to improve inclusivity in clinical and translational research participation. J Clin Transl Sci. 2020;5(1):1–5. doi: 10.1017/cts.2020.554.
12. Estapé-Garrastazu, ES, Noboa-Ramos, C, Jesús-Ojeda, L, Pedro-Serbiá, Z, Acosta-Pérez, E, Camacho-Feliciano, DM. Clinical and translational research capacity building needs in Minority Medical and Health Science Hispanic Institutions. Clin Transl Sci. 2014;7(5):406–412. doi: 10.1111/cts.12165.
13. DiGirolamo, A, Geller, AC, Tendulkar, SA, Patil, P, Hacker, K. Community-based participatory research skills and training needs in a sample of academic researchers from a clinical and Translational Science Center in the Northeast. Clin Transl Sci. 2012;5(3):301–305. doi: 10.1111/j.1752-8062.2012.00406.x.
14. Hörig, H, Marincola, E, Marincola, FM. Obstacles and opportunities in translational research. Nature Med. 2005;11(7):705–708. doi: 10.1038/nm0705-705.
15. NOT-OD-16-148: Policy on Good Clinical Practice Training for NIH Awardees Involved in NIH-funded Clinical Trials. National Institutes of Health, 2016. https://grants.nih.gov/grants/guide/notice-files/NOT-OD-16-148.html. Accessed June 27, 2023.
16. Murphy, SL, Samuels, EM, Kolb, HR, et al. Best practices in social and behavioral research: a multisite pilot evaluation of the good clinical practice online training course. J Clin Transl Sci. 2018;2(2):95–102. doi: 10.1017/cts.2018.27.
17. The Collaborative Institutional Training Initiative (CITI). GCP SBR Advanced Refresher. CITI Program, 2016. https://about.citiprogram.org/course/gcp-sbr-advanced-refresher/. Accessed June 27, 2023.
18. Office of Behavioral and Social Sciences Research. Download Good Clinical Practice for Social and Behavioral Research Elearning Course. National Institutes of Health, 2018. https://obssr.od.nih.gov/training/download-good-clinical-practice-social-and-behavioral-research-elearning-course. Accessed June 27, 2023.
19. Weick, KE. Sensemaking in Organizations. Thousand Oaks, CA: Sage Publications, 1995.
20. Weick, KE, Sutcliffe, KM, Obstfeld, D. Organizing and the process of sensemaking. Organ Sci. 2005;16(4):409–421. doi: 10.1287/orsc.1050.0133.
21. Maitlis, S, Sonenshein, S. Sensemaking in crisis and change: inspiration and insights from Weick, 1988. J Manage Stud. 2010;47(3):551–580. doi: 10.1111/j.1467-6486.2010.00908.x.
22. Cooper, CL, Argyris, C, Starbuck, WH, et al. The Blackwell Encyclopedia of Management. Organizational Behavior. Malden, MA: Blackwell Publication, 2005.
23. Samuels, EM. Evaluating the implementation and impact of a cluster-hiring initiative at a research university: how causal mechanisms link programmatic activities and outcomes. Perform Improv Q. 2019;32(4):401–426. doi: 10.1002/piq.21304.
24. Michener, L, Scutchfield, FD, Aguilar-Gaxiola, S, et al. Clinical and translational science awards and community engagement. Am J Prevent Med. 2009;37(5):464–467. doi: 10.1016/j.amepre.2009.06.018.
25. Kirkpatrick, JD, Kirkpatrick, WK. Kirkpatrick’s Four Levels of Training Evaluation. Alexandria, VA: ATD Press, 2016.
26. Hansman, CA. Context-based adult learning. New Direct Adult Cont Educ. 2001;89(89):43–52.
27. Walker, D, Myrick, F. Grounded theory: an exploration of process and procedure. Qual Health Res. 2006;16(4):547–559. doi: 10.1177/1049732305285972.
28. NOT-OD-20-031: Notice of NIH’s Interest in Diversity. National Institutes of Health, 2019. https://grants.nih.gov/grants/guide/notice-files/NOT-OD-20-031.html. Accessed June 27, 2023.
29. Reichheld, FF. The one number you need to grow. Harv Bus Rev. 2003;81(12):46–124.
30. Keiningham, TL, Cooil, B, Aksoy, L, Andreassen, TW, Weiner, J. The value of different customer satisfaction and loyalty metrics in predicting customer retention, recommendation, and share-of-wallet. Manag Serv Qual Int J. 2007;17(4):361–384. doi: 10.1108/09604520710760526.
31. Boulware, LE, Corbie, G, Aguilar-Gaxiola, S, et al. Combating structural inequities—diversity, equity, and inclusion in clinical and translational research. N Engl J Med. 2022;386(3):201–203. doi: 10.1056/NEJMp2112233.
32. Boateng, GO, Neilands, TB, Frongillo, EA, Melgar-Quiñonez, HR, Young, SL. Best practices for developing and validating scales for health, social, and behavioral research: a primer. Front Publ Health. 2018;6:149. doi: 10.3389/fpubh.2018.00149.
33. Tashakkori, A, Teddlie, C. Sage Handbook of Mixed Methods in Social & Behavioral Research. Thousand Oaks, CA: SAGE Publications, 2021.
34. Kraiger, K, Ford, JK, Salas, E. Application of cognitive, skill-based, and affective theories of learning outcomes to new methods of training evaluation. J Appl Psychol. 1993;78(2):311–328. doi: 10.1037/0021-9010.78.2.311.
35. Huang, MH, Chang, YW. Characteristics of research output in social sciences and humanities: from a research evaluation perspective. J Am Soc Inf Sci Technol. 2008;59(11):1819–1828. doi: 10.1002/asi.20885.
36. Sheffet, AJ, Howard, G, Sam, A, et al. Challenge and yield of enrolling racially and ethnically diverse patient populations in low event rate clinical trials. Stroke. 2018;49(1):84–89. doi: 10.1161/strokeaha.117.018063.
37. Rochon, PA, Mashari, A, Cohen, A, et al. The inclusion of minority groups in clinical trials: problems of under representation and under reporting of data. Account Res. 2004;11(3-4):215–223. doi: 10.1080/08989620490891412.
38. Murthy, VH, Krumholz, HM, Gross, CP. Participation in cancer clinical trials. JAMA. 2004;291(22):2720. doi: 10.1001/jama.291.22.2720.
39. Blavos, A, Kerr, D, Hancher-Rauch, H, Brookins-Fisher, J, Thompson, A. Faculty perceptions of certifications in health education and public health: implications for professional preparation. Pedag Health Promot. 2022;8(1):49–58. doi: 10.1177/2373379920938823.
40. Grumbach, K, Cottler, LB, Brown, J, et al. It should not require a pandemic to make community engagement in research leadership essential, not optional. J Clin Transl Sci. 2021;5(e95):1–7. doi: 10.1017/cts.2021.8.
41. Holsti, M, Hawkins, S, Bloom, K, White, R, Clark, EB, Byington, CL. Increasing diversity of the biomedical workforce through community engagement: The University of Utah Native American Summer Research Internship. Clin Transl Sci. 2015;8(2):87–90. doi: 10.1111/cts.12258.
42. Gulaid, LA, Kiragu, K. Lessons learnt from promising practices in community engagement for the elimination of new HIV infections in children by 2015 and keeping their mothers alive: summary of a desk review. J Int Aids Soc. 2012;15(S2):17390. doi: 10.7448/IAS.15.4.17390.
43. Reopell, L, Nolan, TS, Gray, DM, et al. Community engagement and clinical trial diversity: navigating barriers and co-designing solutions—a report from the “Health Equity through Diversity” seminar series. PloS One. 2023;18(2):e0281940. doi: 10.1371/journal.pone.0281940.
44. Zambrana, RE, Carvajal, D, Townsend, J. Institutional penalty: mentoring, service, perceived discrimination and its impacts on the health and academic careers of Latino faculty. Ethnic Racial Stud. 2023;46(6):1132–1157. doi: 10.1080/01419870.2022.2160651.
45. Samuels, EM. Evaluating the implementation and impact of a cluster-hiring initiative at a research university: how causal mechanisms link programmatic activities and outcomes. Perform Improv Q. 2020;32(4):401–426. doi: 10.1002/piq.21304.
46. Bredella, MA, Volkov, BB, Doyle, JM. Training and cultivating the translational science workforce: responses of clinical and translational science awards program hubs to the COVID-19 pandemic. Clin Transl Sci. 2023;16(1):43–49. doi: 10.1111/cts.13437.
47. Volkov, BB, Ragon, B, Holmes, K, Samuels, E, Walden, A, Herzog, K. Leadership and administration to advance translational science: environmental scan of adaptive capacity and preparedness of Clinical and Translational Science Award Program hubs. J Clin Transl Sci. 2023;7(e6):1–9. doi: 10.1017/cts.2022.409.
48. Nooraie, RY, Kwan, BM, Cohn, E, et al. Advancing health equity through CTSA programs: opportunities for interaction between health equity, dissemination and implementation, and translational science. J Clin Transl Sci. 2020;4(3):168–175. doi: 10.1017/cts.2020.10.
49. Shay, LA, Schmidt, S, Thurston, AJ, et al. Advancing diversity, equity, and inclusion within clinical and translational science training programs: a qualitative content analysis of the training breakout session at the national CTSA program meeting. J Clin Transl Sci. 2022;6(e110):1–6. doi: 10.1017/cts.2022.442.
50. Wu, J, Zhao, K, Fils-Aime, F. Response rates of online surveys in published research: a meta-analysis. Comput Human Behav Rep. 2022;1(7):1–11. doi: 10.1016/j.chbr.2022.100206.