
Measuring institutional community engagement: Adding value to academic health systems

Published online by Cambridge University Press:  28 May 2019

Syed M. Ahmed*
Affiliation:
Office of the Senior Associate Dean and Associate Provost for Community Engagement, Medical College of Wisconsin, Milwaukee, WI, USA
Sharon Neu Young
Affiliation:
Office of the Senior Associate Dean and Associate Provost for Community Engagement, Medical College of Wisconsin, Milwaukee, WI, USA
Mia C. DeFino
Affiliation:
DeFino Consulting, LLC, Chicago, IL, USA
Joseph E. Kerschner
Affiliation:
Office of the Dean and Provost, School of Medicine, Medical College of Wisconsin, Milwaukee, WI, USA
Address for correspondence: Dr. S.M. Ahmed, MD, DrPH, Department of Family and Community Medicine, Medical College of Wisconsin, 8701 Watertown Plank Road, Suite H2500, PO Box 26509, Milwaukee, WI 53226, USA. Email: sahmed@mcw.edu

Abstract

Beyond medical schools’ historical focus on pillar missions including clinical care, education, and research, several medical schools now include community engagement (CE) as a mission. However, most academic health systems (AHSs) lack the tools to provide metrics, evaluation, and standardization for quantifying progress and contributions of the CE mission. Several nationwide initiatives, such as the Institute of Medicine’s recommendation to advance CE metrics at institutions receiving Clinical and Translational Science Awards, have encouraged the research and development of systematic metrics for CE, but more progress is needed. The CE components practical model provides a foundation for analyzing and evaluating different types of CE activities at AHSs through five components: research, education, community outreach and community service, policy and advocacy, and clinical care. At the Medical College of Wisconsin (MCW), an annual survey administered to faculty and staff assessed the types and number of CE activities from the prior year. Survey results were combined to create a CE report for departments across the institution and to inform MCW leadership. Insights gathered from the survey have contributed to next steps in CE tracking and evaluation, including the development of a CE dashboard to track CE activities in real time. The dashboard will provide resources showing how individuals can advance the CE mission through their work and will help guide CE at the institutional level.

Type
Special Communications
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Association for Clinical and Translational Science 2019

For many years, community engagement (CE), defined as collaboration between institutions of higher education and their larger communities (local, regional, state, national, and global) for the mutually beneficial exchange of knowledge and resources in a context of partnership and reciprocity, has been included in the missions of several land-grant universities [Reference Ahmed and Palermo1,Reference McCloskey2]. However, to date, CE is included as a distinct mission in only a few medical schools [Reference Goldstein and Bearman3]. As CE is acknowledged as an important means of improving the health of the nation through partnerships that reduce health disparities in communities, including CE as a distinct School of Medicine (SOM) mission has become increasingly important [Reference Michener4–9].

In relation to academic health systems (AHSs), most academic institutions have three missions: clinical care, education, and research [Reference Wilkins and Alberti10]. CE as a mission touches all of the others [Reference Wilkins and Alberti10,Reference Ahmed11]. CE in research (CEnR) is a research paradigm that involves communities as partners rather than subjects [Reference Ahmed and Palermo1,Reference Wilkins and Alberti10]. Many institutions do not connect CEnR with the research mission, yet CEnR is a research process like any other (e.g., basic or clinical research) and should be treated in a similar way [Reference Ahmed and Palermo1]. Moreover, translational science, which spans T1 to T5, requires the involvement of a larger set of stakeholders that includes communities, and CE is a recognized process for making community inclusion happen [Reference Wilkins and Alberti10,Reference Selker and Wilkins12–Reference Khodyakov14]. Integrating CE with the AHS missions of clinical care, research, and education is essential to realize the benefits of CE and its ability to improve health [Reference Ahmed and Palermo1,Reference Wilkins and Alberti10,Reference Selker and Wilkins12,Reference Graham13].

In translational science research covering phases T1–T5, the critical aspect is moving research bi-directionally from bench to bedside to curbside [Reference Graham13,Reference Zerhouni15]. Involving the community (curbside) demands that communities become partners in all stages of research [Reference Wilkins and Alberti10,Reference Selker and Wilkins12,Reference Graham13]. CEnR makes this process effective because the collaboration and engagement of stakeholders are fundamentally built on the principles of CE [Reference Ahmed and Palermo1,Reference Selker and Wilkins12,Reference Graham13]. For translational research to succeed, it must draw on the skills of CE practitioners in involving relevant stakeholders [Reference Ahmed and Palermo1,Reference Wilkins and Alberti10,Reference Selker and Wilkins12,Reference Graham13].

CE moves beyond community outreach and community service, the traditional concepts of AHS civic responsibility, which focus primarily on volunteer activities that are either related (outreach) or unrelated (service) to professional roles [Reference Wilkins and Alberti10,Reference Ahmed11]. Community outreach and community service are primarily unidirectional and sit at one end of a spectrum of CE activities that extends to activities based on principles of bi-directionality between community-academic partners [Reference Wilkins and Alberti10,Reference Ahmed11]. CEnR is a critical component of CE’s larger platform. It is time for AHSs to go beyond community outreach and community service; although valuable, these activities do not necessarily bring community voices into programming or into developing research that affects communities [Reference Wilkins and Alberti10,Reference Ahmed11]. The confluence of the art and science of CE is actualized through CEnR processes [Reference Ahmed and Palermo1]. If AHSs are to advance science while working effectively with communities, they must follow the principles of CE [Reference Ahmed and Palermo1,Reference Wilkins and Alberti10,Reference Selker and Wilkins12].

In addition to integrating CE activities, CE stakeholders need to understand how CE affects academic institutions and what value CE brings to the institution and the community [Reference Goldstein and Bearman3,Reference Chung16–Reference Bloodworth18]. The imperative for institutional metrics and valuation stems not only from institutional interests but also from emerging requirements for CE valuation from external stakeholders (e.g., grant funding agencies and translational science partners). Within the United States, measuring CE activity and its effects is a national directive from the NIH, the Clinical and Translational Science Awards (CTSAs), the Patient-Centered Outcomes Research Institute, and the Carnegie Foundation [19–Reference O’Meara and Jaeger21]. For example, previous Carnegie Foundation applications have required evidence of infrastructural support for CE programs and activities, including documented policies, financial investment, and marketing and promotion of CE programs and activities.

Medical schools have developed systems primarily for tracking the contributions of the clinical care, education, and research missions, as well as for tracking faculty contributions to evaluate promotion and tenure status [Reference Michener4,Reference Olds and Barton22–Reference Calleson, Jordan and Seifer25]. A mission may or may not generate a positive financial margin for the medical school or AHS, but each medical school mission contributes positively to society [Reference Kerschner26], and its value should be considered on multiple levels. For CE, measurements have been described on a project basis, and the factors that contribute to partnership success (context, trust, reciprocity, power dynamics, bi-directional communication, and others) have been established [Reference Wallerstein and Duran5,Reference Lucero27–Reference Wallerstein30]. Although some institutions have taken the initiative to document and catalog the extent of their CE activities [Reference Chung16,Reference Bloodworth18,Reference Fitzgerald, Bargerstock and Van Egeren31], it is uncommon for AHSs or medical schools to have a deep understanding of the types and number of CE activities that occur within their institution [Reference Michener4,Reference Chung16,Reference Calleson, Jordan and Seifer25,Reference Nokes32].

Most institutional processes developed to track, measure, and value CE activities lack robust, comprehensive, and detailed data comparable to those available for other AHS missions, which limits an institution’s ability to assess accurately the contributions of CE activities. It is therefore critical for AHSs to take the next steps in developing systems and processes that integrate CE tracking and metrics in the same way as other missions. CE activities related to research, education, funding, publications, and community change also need to be accounted for in the promotion and tenure process for faculty who focus primarily on CEnR and community-academic partnership projects for health. Developing new institutional systems and processes can create administrative burden and requires that staff and faculty have a vested interest in the outcomes. This article describes one approach to measuring institutional CE and provides recommendations for future CE metrics and tracking.

The purpose of this paper is to describe the Medical College of Wisconsin’s (MCW’s) approach to measuring institutional CE as part of MCW’s mission; the limitations of and challenges with an annual CE survey; and the creation of a dashboard to enhance tracking, measuring, and valuing CE.

To advance CE as a mission in the MCW SOM and to further define institutional CE, the authors developed the CE components practical model.

The CE Components Practical Model

The CE components practical model (Fig. 1) provides a foundation for analyzing different types of CE activities through its five components: research, education, community outreach and community service, policy and advocacy, and clinical care [Reference Ahmed11]. Overall, the model outlines criteria for which activities relate to each component and proposes that CE activities need not fall under a single component; an activity may span overlapping components. The model also includes the institutional infrastructure and administrative functions that support and sustain CE activities (such as promotion and tenure, tracking, and evaluation).

Fig. 1. The community engagement components practical model [Reference Ahmed11].
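As an illustration of how the model’s overlapping components could be represented in a tracking system (a hypothetical sketch; the class and field names below are illustrative, not part of MCW’s systems), each activity can be tagged with one or more of the five components:

```python
from dataclasses import dataclass, field

# The five components of the CE components practical model [Reference Ahmed11].
COMPONENTS = {
    "research",
    "education",
    "outreach_and_service",
    "policy_and_advocacy",
    "clinical_care",
}

@dataclass
class CEActivity:
    """A single CE activity, tagged with one or more model components."""
    title: str
    department: str
    components: set = field(default_factory=set)

    def __post_init__(self):
        # Reject tags outside the five-component model.
        unknown = self.components - COMPONENTS
        if unknown:
            raise ValueError(f"Unknown components: {unknown}")

# Example: a school-based health program that is both research and outreach,
# illustrating the model's overlapping-component idea.
activity = CEActivity(
    title="School-based asthma intervention",
    department="Family and Community Medicine",
    components={"research", "outreach_and_service"},
)
```

Tagging activities with a set of components, rather than a single category, lets later reporting count an activity under every component it touches.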

Measuring the Baseline of Institutional CE

At MCW, the Office of the Senior Associate Dean (SAD) for Community Engagement (OCE) led the development of survey questions to assess baseline institutional CE activity. An important part of establishing the institutional imperative and importance of the CE mission at MCW was the creation of an office and a SAD role on the medical school leadership team, on par with the other recognized missions of research, education, and clinical care.

Survey Questions

The survey was developed based on: (1) the background and expertise of the researchers associated with the OCE, (2) the Carnegie Classification Framework and literature, (3) specific definitions of and references on CE activities from the literature, (4) defining questions around the CE mission, (5) strategic institutional interests, (6) feedback from the general community at MCW and collaborating institutions, and (7) evolution of the questions from the previous survey administration (i.e., feedback changed some of the questions for the next survey). Overall, the authors’ collective experience in developing and implementing CE programs informed the development of a survey to track the CE activities that faculty and staff carry out in communities. Because tracking institutional CE is in a nascent state, the survey tool is a first step toward quantifying and tracking CE activities by creating administrative data on what is done in the real world.

Survey Administration

The survey contained a total of five questions when administered to faculty (2014) and staff (2015) and 10 questions in the years that followed (2016, faculty; 2017, staff). Administration of the survey alternated between faculty and staff in successive years. The survey was open for ≅30 days for individuals to submit responses about CE activities from the previous 12 months. The online survey was distributed via a unique survey link sent directly to each individual’s MCW e-mail address. In 2016/2017, individuals who had submitted responses in 2014/2015 had the opportunity to review the data entered in their previous submission, approve data that were still relevant, or delete data that were no longer relevant; they then entered any new data for the year. Data were collected between 2014 and 2017 and analyzed in 2018. This study was approved by the IRB at MCW under protocol PRO00027876.
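A minimal sketch of the 2016/2017 review-and-update step described above, assuming a simple record structure (the field names are hypothetical; the article does not specify the survey platform’s data model):

```python
# Hypothetical sketch of the review step: respondents see their prior
# submissions, keep those still relevant, drop the rest, then add new entries.
def reconcile_responses(prior, approved_ids, new_entries):
    """Carry forward approved prior entries and append this year's new ones.

    prior: list of dicts with an 'id' key from the previous administration.
    approved_ids: set of ids the respondent marked as still relevant.
    new_entries: list of dicts describing activities from the past 12 months.
    """
    kept = [entry for entry in prior if entry["id"] in approved_ids]
    return kept + new_entries

prior = [{"id": 1, "activity": "Health fair volunteering"},
         {"id": 2, "activity": "Community advisory board"}]
current = reconcile_responses(prior, approved_ids={2},
                              new_entries=[{"id": 3, "activity": "School health talk"}])
# current now holds the approved earlier entry plus the newly reported activity.
```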

Survey Results

In 2014, 656 of 1584 faculty (41.4%) responded; in 2016, 896 of 1688 (53.1%) responded. The total number of CE activities reported increased 3.2-fold between administrations (478 vs. 1507). In 2014, 282 faculty reported 478 CE activities, or ≅1.7 activities per reporting faculty member (Table 1), whereas in 2016, 381 faculty reported 1507 CE activities, or ≅4 activities per reporting faculty member, roughly a 2.3-fold increase. The faculty reporting CE activities came from 23 of 26 departments in 2014 and 33 of 35 departments in 2016 (see Footnote a), representing involvement from across the institution.

Table 1. Number of activities reported by faculty and staff

* In 2016/2017, service activities were asked about in a separate question from outreach activities.

NA, not applicable; these activities were not asked about in 2014/2015.

For staff, the response rate decreased between administrations (2015: 56% [1690/3026] vs. 2017: 34% [1318/3894]), while the total number of CE activities reported increased 1.9-fold (686 vs. 1330). In 2015, 321 staff reported 686 CE activities, or ≅2 activities per reporting staff member. In 2017, 421 staff reported 1330 CE activities, or ≅3.2 activities per reporting staff member, roughly a 1.5-fold increase. The staff reporting CE activities came from 51 of 54 departments in 2015 and 53 of 56 departments in 2017, representing involvement from across the institution.
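As a check on the figures above, the short calculation below reproduces the response rates, per-respondent averages, and fold changes directly from the reported counts (each tuple holds respondents, surveyed population, number reporting activities, and activities reported):

```python
# Reported counts from the faculty (2014, 2016) and staff (2015, 2017) surveys.
surveys = {
    "faculty": {"2014": (656, 1584, 282, 478), "2016": (896, 1688, 381, 1507)},
    "staff":   {"2015": (1690, 3026, 321, 686), "2017": (1318, 3894, 421, 1330)},
}

for group, years in surveys.items():
    y1, y2 = sorted(years)
    r1, n1, rep1, act1 = years[y1]
    r2, n2, rep2, act2 = years[y2]
    print(f"{group}: response rate {r1/n1:.1%} -> {r2/n2:.1%}")
    print(f"  activities per reporter {act1/rep1:.1f} -> {act2/rep2:.1f} "
          f"({(act2/rep2)/(act1/rep1):.1f}-fold)")
    print(f"  total activities {act1} -> {act2} ({act2/act1:.1f}-fold)")
```

Running this confirms, for example, the faculty response rates of 41.4% and 53.1% and the 3.2-fold growth in total faculty-reported activities.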

Types of CE Activities Reported

The most frequently reported activities were in the outreach and community service component, followed by education and by awards and events for faculty and staff; increases were observed in each administration. Table 1 shows the number of specific activities reported each year. For activities related to policy and advocacy, clinical care, and service, there were no comparison data from the 2014/2015 surveys because these questions were added in the 2016/2017 surveys.

Summary

Although the survey tool is limited in the clarity of the construct being measured (i.e., CE as a survey construct has not been thoroughly defined), it is continually being improved, and the data gathered thus far have provided unprecedented insight into both the extent and the nature of CE activities throughout the institution. Faculty and staff throughout MCW reported a broad distribution of CE activities. This was contrary to the OCE’s expectation that the results would show CE activities concentrated within specific departments and programs already known to practice CE and that strongly identify their work as related to CE.

Although an increase in activities may be evidence of a positive trend in CE and its integration into the MCW community, additional factors affecting the survey population and tool should be considered when interpreting the results. One notable limitation is the gap between administration periods: respondents report on CE activities from the previous 12 months, but each group is surveyed only every 2 years, so the survey may not capture CE activities that occur during the intervening year. In addition, the survey results show marked increases primarily in outreach and service types of CE activities. Multiple factors could contribute to this increase, including: (1) language added to a survey question in 2016/2017 to explicitly include community service activities, which was absent from the 2014/2015 surveys; (2) an increase in survey participation; (3) increased awareness of the survey; (4) increased education about and understanding of CE and the types of activities it includes; and (5) the addition of two new regional campuses and the School of Pharmacy, which increased the size of the MCW population surveyed.

In addition to capturing CE activities, the survey included feedback questions that were modified each year to gather input relevant to specific MCW CE initiatives and strategic plans, including the CE mission and the survey itself. The following sampling of staff responses speaks to the value of having CE activities visibly tracked and measured: (1) “I endorse MCW’s efforts to quantify the levels of community engagement that it’s faculty, staff and students engage in. It helps MCW send a positive message to the public and it recognizes and credits MCW members”; (2) “I am very happy to see Community Engagement becoming a bigger priority for MCW, and I am eager for additional opportunities to serve both MCW and our communities. Thank you for moving this mission forward!”; (3) “Excellent survey, it is good to show our community support”; and (4) “Thanks. Hope that more people know the efforts of the MCW involvement of community engagement.”

The survey results were shared with institutional leadership and department chairs to convey where CE is most prevalent in the institution. Leadership and chairs were also provided with a supporting document offering guidance on how to interpret the results and how to use them to support their departments’ CE staff and mission goals.

Challenges

Respondents also provided comments addressing institutional challenges related to CE and to CE tracking and measurement, including the limitations of a periodic survey, perceived lack of career growth, lack of buy-in and prioritization for the mission, and the need for infrastructural support comparable to the tools used for other missions and priorities (staff responses): (1) “This survey tool is much too long for someone as active as I am”; (2) “I feel that I spend so much time at work that I don’t know how I would fit in Community work into my schedule. As this is part of the MCW mission, I would love to see it be made a priority”; (3) “Provide more infrastructure that actually supports research and program implementation”; (4) “This seems like an unimportant strategic focus”; and (5) “Many of my colleagues and I are interested in community engagement. However, many of us also feel that it is under-valued (i.e. when it comes to promotions to associate professor or professor) when compared to research, clinical or teaching efforts. Whether or not this is true, I don’t know (maybe there just needs to be some clarification here?).”

The MCW CE survey administration has highlighted several challenges to fully accounting for institutional CE. First, the results are periodic and available only every other year, making it difficult to assess interventions and progress throughout the year. More timely reports of CE activities could be valuable when department chairs meet with faculty to assess CE activities or when institutional leadership needs to consider investment strategies or acknowledge high-performing individuals and departments. Second, individuals within the institution often have difficulty perceiving how their participation in or prioritization of CE activities is valued, what effect it has on their opportunities for promotion and recognition, or how they are accountable for supporting the growth of this mission in their faculty or staff roles. Third, static timepoints of evaluation fail to show how engagement and involvement are evolving, and individuals and departments cannot intervene or change their levels of CE based on prior results. Fourth, failure to monitor the consistency and quality of CE efforts can lead not only to poor and inconsistent results but also to damaged relationships and decreased trust from community partners. Finally, while increasing the frequency of data collection, such as moving to an annual survey, may have the positive effect of raising the visibility of CE as a purposeful and valued activity, even that frequency would remain in stark contrast to the robust mechanisms used to track and measure other critical operations and missions within an AHS.

Future Directions

MCW is taking steps to develop infrastructure that will address the challenges identified by the survey. The CE dashboard, which is in development, will be a central online repository where individuals can access tools that provide ongoing support for CE work. While many elements will take extensive work and are long-term goals, the envisioned scope of the CE dashboard includes:

  1. Tools for real-time entry of CE activities to make results timelier.

     a. Leaders can use the dashboard to review each faculty member’s CE activity at any time, making the tracking of CE activities more flexible and accessible (i.e., the timing is not dependent on when a survey is administered).

     b. Individual activities can be connected directly to the professional development system to assess progress toward CE-related goals, creating better alignment with job responsibilities and goals in supporting the CE mission.

  2. Access to software specific to CE work, such as partnership management and networking tools.

  3. Resource links, including:

     a. forms, best practices for establishing partnerships with the community [Reference McCloskey2], related policies, training materials, toolkits, and related templates to facilitate CE work;

     b. frequently asked questions; and

     c. access to MCW’s CE Learning Repository Resource to assist with sharing CE products across institutions and communities.

By creating a dashboard with these tools, CE practitioners will have a resource to incorporate into their daily practice of CE activities, including real-time tracking, that holds them accountable, facilitates their work, and provides access to resources. A sketch of what real-time activity entry could look like appears below.
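As an illustration only (MCW’s actual dashboard implementation is not described in this article, and all names below are hypothetical), a real-time entry backend could timestamp each activity and make it immediately queryable by department leaders:

```python
from datetime import datetime, timezone

# Hypothetical in-memory store standing in for the dashboard's backend.
ACTIVITY_LOG = []

def log_activity(person, department, component, description):
    """Record a CE activity as it happens, rather than in a biennial survey."""
    ACTIVITY_LOG.append({
        "person": person,
        "department": department,
        "component": component,   # one of the five model components
        "description": description,
        "timestamp": datetime.now(timezone.utc),
    })

def department_activity(department, since):
    """Let a chair review a department's CE activity for any time window."""
    return [a for a in ACTIVITY_LOG
            if a["department"] == department and a["timestamp"] >= since]

log_activity("J. Doe", "Pediatrics", "outreach_and_service",
             "Asthma screening at a community health fair")
recent = department_activity("Pediatrics",
                             since=datetime(2019, 1, 1, tzinfo=timezone.utc))
```

Compared with a biennial survey, entries logged this way can be aggregated on demand for any reporting window, department, or component.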

We have also identified areas in need of development, which would further enhance institutional CE infrastructure:

  1. Assessing the downstream effects of CE activities remains a longstanding challenge for institutions, one for which there is no established best practice (i.e., no established guideline for measuring the effects of CE activities). One approach has been to apply a common volunteer hourly rate to the number of hours of community outreach and service, which yields a financial figure for the time donated to the community [Reference Lucero27]; a sketch of this calculation follows the list. In the era of health economics and health outcomes research, future models may be developed to account accurately for the effect and value of CE research and clinical activities. For example, policy and advocacy activities often carry a value-added statement showing how much a policy decision will influence individuals and the community downstream.

  2. Scorecard metrics are used at several institutions and could be applied to CE activities. Improving the clarity of CE as a construct will help strengthen the measurement of CE through a scorecard. This tool is already used in other applications and could benefit AHSs if repurposed specifically for CE to help individual departments measure their level of CE activity and set goals to advance CE. Each department’s scorecard could then be used by institutional leadership to review which departments are involved, making improvements, and engaging their faculty and staff in supporting the CE mission.

  3. Return on investment (ROI) metrics and measures presented to board representatives do not currently include CE. Either developing a separate ROI metric for CE or revamping existing formulas to include CE in performance and financial measures would be a powerful step in affirming CE as a priority mission, helping leaders understand how to value their efforts in supporting CE and to present that value on par with the institution’s other missions.

  4. Institutions need to develop strategies and processes for bi-directional communication on CE, both institution-wide and with external stakeholders and community partners. Systematic, accessible feedback mechanisms that garner timely and relevant community input are a critical step in holding institutions accountable and responsive to community partners. Communicating CE performance to the entire institution, not just to CE-focused practitioners, is another step in elevating the CE mission to parity with other AHS missions.
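A minimal sketch of the volunteer-rate valuation approach mentioned in item 1 (the hourly rate and department figures below are placeholders, not MCW data; in practice an institution would substitute a published volunteer-time rate and its own reported hours):

```python
# Hypothetical valuation: hours of outreach/service times a standard
# volunteer hourly rate yields a dollar figure for time donated to the community.
VOLUNTEER_RATE = 25.43  # placeholder USD/hour; use a published rate in practice

outreach_hours = {  # placeholder per-department hours for illustration
    "Family and Community Medicine": 1200,
    "Pediatrics": 850,
}

donated_value = {dept: hours * VOLUNTEER_RATE
                 for dept, hours in outreach_hours.items()}
total = sum(donated_value.values())
# e.g., 2050 hours -> about $52,131 of donated time across the two departments
```

The same aggregation could feed a department scorecard (item 2) by reporting hours, dollar equivalents, and activity counts side by side.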

How CE is implemented is unique to every institution, and any portfolio of tools will need to be modified to fit the requirements, goals, and circumstances of the institution. However, the current landscape for AHSs indicates that institutions need to approach CE with greater rigor in order to compete and excel. Beyond the tools listed for the CE dashboard, institutions can look for further innovations, similar to those used for other AHS missions, that support, catalyze, and improve CE quality and outcomes, as much for the institution’s benefit as for that of community partners.

Acknowledgements

Funding for this work was provided in part by National CTSAs 5UL1TR000055-05 and UL1TR001436 (NIH), the MCW Community Engagement Core Implementation Initiative FP00009037, and the Advancing a Healthier Wisconsin Research and Education Program.

Disclosures

The authors declare no conflicts of interest.

Footnotes

a The number of departments varies between faculty and staff because of HR classifications and assignments; the true values are presented for transparency and to show that the majority of departments and centers responded to the survey each year, whether faculty or staff were surveyed.

References

1. Ahmed SM, Palermo AG. Community engagement in research: frameworks for education and peer review. American Journal of Public Health 2010; 100(8): 1380–1387.
2. McCloskey DJ, et al. Principles of Community Engagement. 2nd ed. NIH Publication No. 11-7782. Washington, DC: Department of Health and Human Services, 2011.
3. Goldstein AO, Bearman RS. Community engagement in US and Canadian medical schools. Advances in Medical Education and Practice 2011; 2: 43–49.
4. Michener L, et al. Aligning the goals of community-engaged research: why and how academic health centers can successfully engage with communities to improve health. Academic Medicine 2012; 87(3): 285–291.
5. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: the intersection of science and practice to improve health equity. American Journal of Public Health 2010; 100(Suppl 1): S40–S46.
6. Woollard R. Caring for a common future: medical schools' social accountability. Medical Education 2006; 40(4): 301–313.
7. McLachlan CS, et al. Towards a science of community engagement. Lancet 2006; 367: 302.
8. Aguilar-Gaxiola S, et al. Towards a unified taxonomy of health indicators: academic health centers and communities working together to improve population health. Academic Medicine 2014; 89(4): 564–572.
9. The best research is produced when researchers and communities work together. Nature 2018; 562(7725): 7.
10. Wilkins CH, Alberti PM. Shifting academic health centers from a culture of community service to community engagement and integration [published ahead of print, March 2019]. Academic Medicine 2019.
11. Ahmed SM, et al. Towards a practical model for community engagement: advancing the art and science in academic health centers. Journal of Clinical and Translational Science 2017; 1(5): 310–315.
12. Selker HP, Wilkins CH. From community engagement, to community-engaged research, to broadly engaged team science. Journal of Clinical and Translational Science 2017; 1: 5–6.
13. Graham PW, et al. What is the role of culture, diversity, and community engagement in transdisciplinary translational science? Translational Behavioral Medicine 2016; 6(1): 115–124.
14. Khodyakov D, et al. On using ethical principles of community-engaged research in translational science. Translational Research 2016; 171: 52–62.e1.
15. Zerhouni EA. Translational and clinical science – time for a new vision. The New England Journal of Medicine 2005; 353(15): 1621–1623.
16. Chung B, et al. Faculty participation in and needs around community engagement within a large multiinstitutional clinical and translational science awardee. Clinical and Translational Science 2015; 8(5): 506–512.
17. Chung B, et al. Implementing community engagement as a mission at the David Geffen School of Medicine at the University of California, Los Angeles. Journal of Health Care for the Poor and Underserved 2016; 27(1): 8–21.
18. Bloodworth LS, et al. Considerations for embracing and expanding community engaged scholarship in academic pharmacy: report of the 2013–2014 research and graduate affairs committee. American Journal of Pharmaceutical Education 2014; 78(8): S8.
19. Institute of Medicine. Review of Clinical and Translational Science Award Program at the National Center for Advancing Translational Science. Washington, DC, June 25, 2014.
20. Frank L, Basch E, Selby JV; Patient-Centered Outcomes Research Institute. The PCORI perspective on patient-centered outcomes research. JAMA 2014; 312(15): 1513–1514.
21. O'Meara KA, Jaeger AJ. Preparing future faculty for community engagement: barriers, facilitators, models and recommendations. Journal of Higher Education Outreach and Engagement 2006; 11(4): 3.
22. Olds GR, Barton KA. Building medical schools around social missions: the case of the University of California, Riverside. Health Systems & Reform 2015; 1(3): 200–206.
23. Mallon WT, Jones RF. How do medical schools use measurement systems to track faculty activity and productivity in teaching? Academic Medicine 2002; 77(2): 115–123.
24. Nutter DO, et al. Measuring faculty effort and contributions in medical education. Academic Medicine 2000; 75(2): 199–207.
25. Calleson DC, Jordan C, Seifer SD. Community-engaged scholarship: is faculty work in communities a true academic enterprise? Academic Medicine 2005; 80(4): 317–321.
26. Kerschner JE, et al. Recommendations to sustain the academic mission ecosystem at U.S. medical schools. Academic Medicine 2018; 93(7): 985–989.
27. Lucero J, et al. Development of a mixed methods investigation of process and outcomes of community-based participatory research. Journal of Mixed Methods Research 2018; 12(1): 55–74.
28. Eder MM, et al. A logic model for community engagement within the Clinical and Translational Science Awards consortium: can we measure what we model? Academic Medicine 2013; 88(10): 1430–1436.
29. Oetzel JG, et al. Enhancing stewardship of community-engaged research through governance. American Journal of Public Health 2015; 105(6): 1161–1167.
30. Wallerstein N. Commentary: challenges for the field in overcoming disparities through a CBPR approach. Ethnicity & Disease 2006; 16(1 Suppl 1): S146–S148.
31. Fitzgerald HE, Bargerstock BA, Van Egeren LA. The Outreach Engagement Measurement Instrument (OEMI): A Review. East Lansing, MI: Michigan State University, 2010: 1–24.
32. Nokes KM, et al. Faculty perceptions of how community-engaged research is valued in tenure, promotion, and retention decisions. Clinical and Translational Science 2013; 6(4): 259–266.