Clinical research is critical for healthcare advancement, but participant recruitment remains challenging. Clinical research professionals (CRPs; e.g., clinical research coordinators, research assistants) perform eligibility prescreening, ensuring adherence to study criteria while upholding scientific and ethical standards. This study investigates the key information CRPs prioritize during eligibility prescreening, providing insights to optimize data standardization and recruitment approaches.
Methods:
We conducted a freelisting survey targeting 150 CRPs from diverse domains (i.e., neurological disorders, rare diseases, and other diseases) where they listed essential information they look for from medical records, participant/caregiver inquiries, and discussions with principal investigators to determine a potential participant’s research eligibility. We calculated the salience scores of listed items using Anthropac, followed by a two-level analytic procedure to classify and thematically categorize the data.
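The salience measure produced by freelist tools such as Anthropac is commonly Smith's S, which weights each listed item by its rank position within each respondent's list and averages over all respondents. A minimal sketch, assuming Smith's S and made-up freelist data (not the study's actual responses):

```python
# Hypothetical sketch of a freelist salience score (Smith's S): an item
# mentioned earlier in a list scores higher, and scores are averaged over
# all respondents. The example lists below are illustrative only.

def smiths_salience(freelists):
    """freelists: list of respondents' ordered item lists."""
    totals = {}
    n_respondents = len(freelists)
    for items in freelists:
        n = len(items)
        for rank, item in enumerate(items, start=1):
            # Positional weight: first item -> 1.0, last item -> 1/n
            totals[item] = totals.get(item, 0.0) + (n - rank + 1) / n
    return {item: s / n_respondents for item, s in totals.items()}

lists = [
    ["age", "medication list", "medical history"],
    ["medical history", "age"],
    ["age", "diagnosis"],
]
print(smiths_salience(lists)["age"])  # averaged over all three respondents
```

Items such as age, which appear early and often across respondents, end up with the highest salience, mirroring the first-level findings reported below.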
Results:
The majority of participants were female (81%), identified as White (44%) and as non-Hispanic (64.5%). The first-level analysis universally emphasized age, medication list, and medical history across all domains. The second-level analysis illuminated domain-specific approaches in information retrieval: for instance, history of present illness was notably significant in neurological disorders during participant and principal investigator inquiries, while research participation was distinctly salient in potential participant inquiries within the rare disease domain.
Conclusion:
This study unveils the intricacies of eligibility prescreening, with both universal and domain-specific methods observed. Variations in data use across domains suggest the need for tailored prescreening in clinical research. Incorporating these insights into CRP training and refining prescreening tools, combined with an ethical, participant-focused approach, can advance eligibility prescreening practices.
Social and environmental determinants of health (SEDoH) are crucial for achieving a holistic understanding of patient health. In fact, geographic factors may have more influence on health outcomes than patients’ genetics. Integrating SEDoH into the electronic health record (EHR), however, poses notable technical and compliance-related challenges. We evaluated barriers to the integration of SEDoH in the EHR and developed a privacy-preserving strategy to mitigate risk of protected health information exposure. Using coded identifiers for patient addresses, the strategy evaluates an alternative approach to ensure efficient, secure geocoding of data while preserving privacy throughout the data enrichment processes from numerous SEDoH data sources.
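The coded-identifier idea described above can be sketched as a keyed hash of the normalized address, kept in a private crosswalk to a geocoded area, so that downstream SEDoH joins never touch the raw address. All names, the key, and the tract values here are hypothetical, not the paper's implementation:

```python
# Hedged sketch: replace raw patient addresses with keyed hashes before
# geocoding/enrichment, so SEDoH joins use only coded identifiers.
import hmac, hashlib

SECRET_KEY = b"institutional-secret"  # hypothetical; held only by an honest broker

def code_address(address: str) -> str:
    # Normalize case and whitespace so trivially different spellings match
    normalized = " ".join(address.upper().split())
    return hmac.new(SECRET_KEY, normalized.encode(), hashlib.sha256).hexdigest()

# Honest broker's private crosswalk: coded ID -> geocoded area (e.g., census tract)
crosswalk = {code_address("123 Main St, Anytown NY"): "36061021600"}

# Analysts join SEDoH variables (e.g., an area deprivation index) by coded ID only
sedoh_by_tract = {"36061021600": {"adi_percentile": 72}}
record = {"patient_code": code_address("123 Main  St, Anytown NY")}
tract = crosswalk[record["patient_code"]]
print(sedoh_by_tract[tract])  # enrichment without exposing the raw address
```

A keyed hash (rather than a plain hash) matters here: without the secret key, an attacker cannot confirm a guessed address by hashing it themselves.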
Archaeologists seek to improve our understanding of the past by studying, preserving, protecting, and sharing nonreplaceable archaeological resources. Archaeological collections hold information that can assist these aims as long as they are properly cared for, identified, and accessible. One of the most serious barriers is the lack of large-scale coordinated efforts to make archaeological collections findable and accessible. This article suggests that developing and implementing the use of a standardized set of attributes regarding collections provides solutions and strategies to find collections. These attributes can connect and standardize existing archaeological collections from a variety of sources (federal and state agencies, CRM firms, Indigenous and descendant communities, and academic departments), serving the profession in multiple ways. Most critically, the baseline data can be synthesized to inform and direct priorities for future fieldwork, thereby decreasing redundancy in archaeological collections and improving curation efforts nationwide. Such efforts would also provide a resource to students and researchers looking to understand and interpret the past at multiple scales by encouraging more collections-based research and less archaeological site destruction. Access for descendant communities will also be improved with information about their cultural heritage. This, in turn, encourages transparency and collaboration between those communities and archaeologists.
Integrating social and environmental determinants of health (SEDoH) into enterprise-wide clinical workflows and decision-making is one of the most important and challenging aspects of improving health equity. We engaged domain experts to develop a SEDoH informatics maturity model (SIMM) to help guide organizations to address technical, operational, and policy gaps.
Methods:
We established a core expert group consisting of developers, informaticists, and subject matter experts to identify different SIMM domains and define maturity levels. The candidate model (v0.9) was evaluated by 15 informaticists at a Center for Data to Health community meeting. After incorporating feedback, a second evaluation round for v1.0 collected feedback and self-assessments from 35 respondents from the National COVID Cohort Collaborative, the Center for Leading Innovation and Collaboration’s Informatics Enterprise Committee, and a publicly available online self-assessment tool.
Results:
We developed a SIMM comprising seven maturity levels across five domains: data collection policies, data collection methods and technologies, technology platforms for analysis and visualization, analytics capacity, and operational and strategic impact. The evaluation demonstrated relatively high maturity in analytics and technological capacity, but more moderate maturity in operational and strategic impact among academic medical centers. Changes made to the tool in between rounds improved its ability to discriminate between intermediate maturity levels.
Conclusion:
The SIMM can help organizations identify current gaps and next steps in improving SEDoH informatics. Improving the collection and use of SEDoH data is one important component of addressing health inequities.
During the COVID-19 pandemic, research organizations accelerated adoption of technologies that enable remote participation. There is now a pressing need to evaluate current decentralization practices and develop appropriate research, education, and operations infrastructure. The purpose of this study was to examine current adoption of decentralization technologies in a sample of clinical research studies conducted by academic research organizations (AROs).
Methods:
The setting was three data coordinating centers in the U.S. These centers initiated coordination of 44 clinical research studies during or after 2020, with national recruitment and enrollment, and entailing coordination between one and one hundred sites. We determined the decentralization technologies used in these studies.
Results:
We obtained data for 44/44 (100%) trials coordinated by the three centers. Three technologies were adopted across nearly all studies (98–100%): eIRB, eSource, and Clinical Trial Management Systems. Commonly used technologies included e-Signature (32/44, 73%), Online Payments Portals (26/44, 59%), ePROs (23/44, 53%), Interactive Response Technology (22/44, 50%), Telemedicine (19/44, 43%), and eConsent (18/44, 41%). Wearables (7/44, 16%) and Online Recruitment Portals (5/44, 11%) were less common. Rarely utilized technologies included Direct-to-Patient Portals (1/44, 2%) and Home Health Nurse Portals (1/44, 2%).
Conclusions:
All studies incorporated some type of decentralization technology, with more extensive adoption than found in previous research. However, adoption may be strongly influenced by institution-specific IT and informatics infrastructure and support. There are inherent needs, responsibilities, and challenges when incorporating decentralization technology into a research study, and AROs must ensure that infrastructure and informatics staff are adequate.
Randomized clinical trials (RCT) are the foundation for medical advances, but participant recruitment remains a persistent barrier to their success. This retrospective data analysis aims to (1) identify clinical trial features associated with successful participant recruitment measured by accrual percentage and (2) compare the characteristics of the RCTs by assessing the most and least successful recruitment, which are indicated by varying thresholds of accrual percentage such as ≥ 90% vs ≤ 10%, ≥ 80% vs ≤ 20%, and ≥ 70% vs ≤ 30%.
Methods:
Data from the internal research registry at Columbia University Irving Medical Center and Aggregated Analysis of ClinicalTrials.gov were collected for 393 randomized interventional treatment studies closed to further enrollment. We compared two regularized linear regression and six tree-based machine learning models for accrual percentage (i.e., reported accrual to date divided by the target accrual) prediction. The outperforming model and Tree SHapley Additive exPlanations were used for feature importance analysis for participant recruitment. The identified features were compared between the two subgroups.
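The outcome defined above (reported accrual to date divided by target accrual) and the symmetric threshold subgroups from the study aims can be sketched directly; the helper names and toy numbers below are illustrative, not the study's code:

```python
# Minimal sketch of the accrual-percentage outcome and the threshold-based
# most/least-successful recruitment subgroups described in the text.

def accrual_percentage(reported: int, target: int) -> float:
    """Accrual percentage: reported accrual to date / target accrual."""
    return 100.0 * reported / target

def recruitment_group(pct: float, hi: float = 90, lo: float = 10):
    """Classify a trial at one symmetric threshold pair (e.g., >=90% vs <=10%)."""
    if pct >= hi:
        return "most successful"
    if pct <= lo:
        return "least successful"
    return None  # excluded from this particular threshold comparison

print(recruitment_group(accrual_percentage(95, 100)))  # most successful
print(recruitment_group(accrual_percentage(8, 100)))   # least successful
print(recruitment_group(accrual_percentage(50, 100), hi=70, lo=30))  # None
```

Loosening the thresholds (90/10 to 80/20 to 70/30) grows both subgroups, which is why the study reports comparisons at all three pairs.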
Results:
CatBoost regressor outperformed the others. Key features positively associated with recruitment success, as measured by accrual percentage, include government funding and compensation. Meanwhile, cancer research and non-conventional recruitment methods (e.g., websites) are negatively associated with recruitment success. Statistically significant subgroup differences (corrected p-value < .05) were found in 15 of the top 30 most important features.
Conclusion:
This multi-source retrospective study highlighted key features influencing RCT participant recruitment, offering actionable steps for improvement, including flexible recruitment infrastructure and appropriate participant compensation.
It is important for SARS-CoV-2 vaccine providers, vaccine recipients, and those not yet vaccinated to be well informed about vaccine side effects. We sought to estimate the risk of post-vaccination venous thromboembolism (VTE) to meet this need.
Methods
We conducted a retrospective cohort study to quantify the excess VTE risk associated with SARS-CoV-2 vaccination in US veterans aged 45 and older using data from the Department of Veterans Affairs (VA) National Surveillance Tool. The vaccinated cohort received at least one dose of a SARS-CoV-2 vaccine at least 60 days prior to 3/06/22 (N = 855,686). The control group comprised those not vaccinated (N = 321,676). All patients had at least one negative COVID-19 test before vaccination. The main outcome was VTE documented by ICD-10-CM codes.
Results
Vaccinated persons had a VTE rate of 1.3755 (CI: 1.3752–1.3758) per thousand, which was 0.1 percent over the baseline rate of 1.3741 (CI: 1.3738–1.3744) per thousand in the unvaccinated patients, or 1.4 excess cases per 1,000,000. All vaccine types showed a minimal increased rate of VTE (rate of VTE per 1000 was 1.3761 (CI: 1.3754–1.3768) for Janssen; 1.3757 (CI: 1.3754–1.3761) for Pfizer, and for Moderna, the rate was 1.3757 (CI: 1.3748–1.3877)). The tiny differences in rates comparing either Janssen or Pfizer vaccine to Moderna were statistically significant (p < 0.001). Adjusting for age, sex, BMI, 2-year Elixhauser score, and race, the vaccinated group had a minimally higher relative risk of VTE as compared to controls (1.0009927 CI: 1.007673–1.0012181; p < 0.001).
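The headline numbers above follow from simple arithmetic on the two rates; a quick check of the 0.1 percent relative excess and the 1.4 excess cases per million:

```python
# Arithmetic behind the reported VTE rates: the absolute excess per million
# and the relative excess follow directly from the two per-1,000 rates.

vax_rate_per_1000 = 1.3755    # vaccinated cohort
unvax_rate_per_1000 = 1.3741  # unvaccinated cohort

excess_per_1000 = vax_rate_per_1000 - unvax_rate_per_1000
excess_per_million = excess_per_1000 * 1000              # 1.4 excess cases
relative_excess = excess_per_1000 / unvax_rate_per_1000  # ~0.1 percent

print(round(excess_per_million, 1))      # 1.4
print(round(100 * relative_excess, 2))   # 0.1
```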
Conclusion
The results provide reassurance that there is only a trivial increased risk of VTE with the current US SARS-CoV-2 vaccines used in veterans older than age 45. This risk is significantly less than VTE risk among hospitalized COVID-19 patients. The risk-benefit ratio favors vaccination, given the VTE rate, mortality, and morbidity associated with COVID-19 infection.
To identify the informatics educational needs of clinical and translational research professionals whose primary focus is not informatics.
Introduction:
Informatics and data science skills are essential for the full spectrum of translational research, and an increased understanding of informatics issues on the part of translational researchers can alleviate the demand for informaticians and enable more productive collaborations when informaticians are involved. Identifying the level of interest in different topics among various types of translational researchers will help set priorities for development and dissemination of informatics education.
Methods:
We surveyed clinical and translational science researchers in Clinical and Translational Science Award (CTSA) programs about their educational needs and preferences.
Results:
Researchers from 23 of the 62 CTSA hubs responded to the survey. Overall, 67% of respondents across roles and topics expressed interest in learning about informatics topics. There was high interest in all 30 topics included in the survey, with some variation in interest depending on the role of the respondent.
Discussion:
Our data support the need to advance training in clinical and biomedical informatics. As the complexity and use of information technology and data science in research studies grows, informaticians will continue to be a limited resource for research collaboration, education, and training. An increased understanding of informatics issues across translational research teams can alleviate this burden and allow for more productive collaborations. To inform a roadmap for informatics education for research professionals, we suggest strategies to use the results of this needs assessment to develop future informatics education.
A national electronic health record is being procured for Health Service Executive hospitals in Ireland. A number of hospitals have implemented an electronic document management system. This study aimed to investigate the efficiency and safety of the electronic document management system in our centre.
Methods
A retrospective audit was performed of patients operated on at Galway University Hospital. The availability and location of patients’ admission data on the electronic document management system were recorded. These data were analysed using Microsoft Excel software, version 16.45.
Results
The records of 100 patients were analysed. The main findings were that 5 per cent of operation notes were missing, 80 per cent were filed in an incorrect section, and 15 per cent were in the correct ‘procedure’ section of the electronic document management system.
Conclusion
This study shows there is potential for error with ‘paper-light’ solutions, whereby delayed scanning, misfiling of scanned records and missing records may lead to significant delays in treatment and potential patient safety issues.
As the USA and the rest of the world raced to fight the COVID-19 pandemic, years of investments from the National Center for Advancing Translational Sciences allowed for informatics services and resources at CTSA hubs to play a significant role in addressing the crisis. CTSA hubs partnered with local and regional partners to collect data on the pandemic, provide access to relevant patient data, and produce data dashboards to support decision-making. Coordinated efforts, like the National COVID Cohort Collaborative (N3C), helped to aggregate and harmonize clinical data nationwide. Even with significant informatics investments, some CTSA hubs felt unprepared in their ability to respond to the fast-moving public health crisis. Many hubs were forced to quickly evolve to meet local needs. Informatics teams expanded critical support at their institutions which included an engagement platform for clinical research, COVID-19 awareness and education activities in the community, and COVID-19 data dashboards. Continued investments in informatics resources will aid in ensuring that tools, resources, practices, and policies are aligned to meet local and national public health needs.
COVID-19 is a major health threat around the world causing hundreds of millions of infections and millions of deaths. There is a pressing global need for effective therapies. We hypothesized that leukotriene inhibitors (LTIs), that have been shown to lower IL6 and IL8 levels, may have a protective effect in patients with COVID-19.
Methods:
In this retrospective controlled cohort study, we compared death rates in COVID-19 patients who were taking a LTI with those who were not taking an LTI. We used the Department of Veterans Affairs (VA) Corporate Data Warehouse (CDW) to create a cohort of COVID-19-positive patients and tracked their use of LTIs between November 1, 2019 and November 11, 2021.
Results:
Of the 1,677,595 patients tested for COVID-19, 189,195 tested positive. Of these, 40,701 were admitted; 38,184 had an oxygen requirement, and 1,214 were taking an LTI. In-hospital use of dexamethasone plus an LTI showed a survival advantage of 13.5% (CI: 0.23%–26.7%; p < 0.01) in patients presenting with a minimal O2Sat of 50% or less. For patients with an O2Sat of <60% and <50% who were taking an LTI as outpatients, continuing the medication as inpatients led to survival advantages of 14.4% and 22.25%, respectively.
Conclusions:
When combined, dexamethasone and LTIs provided a mortality benefit in COVID-19 patients presenting with an O2 saturation <50%. The LTI cohort had lower markers of inflammation and cytokine storm.
The coronavirus crisis is causing considerable disruption and anguish. However, the COVID-19 pandemic and consequent explosion of telehealth services also provide an unparalleled opportunity to consider ethical, legal, and social issues (ELSI) beyond immediate needs. Ethicists, informaticians, and others can learn from experience; evaluate information technology practices and evidence on which to base policy and standards; identify significant values and issues; and revise ethical guidelines. This paper builds on professional organizations’ guidelines and ELSI scholarship to develop emerging concerns illuminated by current experience. Four ethical themes characterized previous literature: quality of care and the doctor–patient relationship, access, consent, and privacy. More attention is needed to these and to expanding the scope of ethical analysis to include health information technologies. An applied ethics approach to ELSI would address context-specific issues and the relationships between people and technologies, and facilitate effective and ethical institutionalization of telehealth and other health information technologies.
Despite significant advancements in healthcare technology, digital health solutions – especially those for serious mental illnesses – continue to fall short of their potential across both clinical practice and efficacy. The utility and impact of medicine, including digital medicine, hinges on relationships, trust, and engagement, particularly in the field of mental health. This paper details results from Phase 1 of a two-part study that seeks to engage people with schizophrenia, their family members, and clinicians in co-designing a digital mental health platform for use across different cultures and contexts in the United States and India.
Methods
Each site interviewed a mix of clinicians, patients, and their family members in focus groups (n = 20) of two to six participants. Open-ended questions and discussions inquired about their own smartphone use and, after a demonstration of the mindLAMP platform, specific feedback on the app's utility, design, and functionality.
Results
Our results based on thematic analysis indicate three common themes: increased use and interest in technology during coronavirus disease 2019 (COVID-19), concerns over how data are used and shared, and a desire for concurrent human interaction to support app engagement.
Conclusion
People with schizophrenia, their family members, and clinicians are open to integrating technology into treatment to better understand their condition and help inform treatment. However, app engagement is dependent on technology that is complementary – not substitutive – of therapeutic care from a clinician.
The recipients of NIH’s Clinical and Translational Science Awards (CTSA) have worked for over a decade to build informatics infrastructure in support of clinical and translational research. This infrastructure has proved invaluable for supporting responses to the current COVID-19 pandemic through direct patient care, clinical decision support, training researchers and practitioners, as well as public health surveillance and clinical research to levels that could not have been accomplished without the years of ground-laying work by the CTSAs. In this paper, we provide a perspective on our COVID-19 work and present relevant results of a survey of CTSA sites to broaden our understanding of the key features of their informatics programs, the informatics-related challenges they have experienced under COVID-19, and some of the innovations and solutions they developed in response to the pandemic. Responses demonstrated increased reliance by healthcare providers and researchers on access to electronic health record (EHR) data, both for local needs and for sharing with other institutions and national consortia. The initial work of the CTSAs on data capture, standards, interchange, and sharing policies all contributed to solutions, best illustrated by the creation, in record time, of a national clinical data repository in the National COVID-19 Cohort Collaborative (N3C). The survey data support seven recommendations for areas of informatics and public health investment and further study to support clinical and translational research in the post-COVID-19 era.
The research footprint of information technology (IT) in legal systems has not grown at the same pace as its penetration of other domains. In developing countries such as India, where the digitalization revolution is underway, legal informatics (LI) remains immature, and very limited traces of IT can be observed assisting a legal system that still functions in a largely traditional way. Rapid population growth, a diminishing proportion of judicial officers, a deteriorating law-and-order situation, and declining human rights all demand that LI evolve rapidly toward maturity. Harassment of citizens is prevalent across the nation, but its intensity increases manifold when it involves the law-enforcement agencies tasked with responsible policing, particularly the state police, which often operate with compromised work ethics. The situation is worse still for vulnerable populations, especially women. As a result, many victims do not muster the courage to go to a police station to file a complaint, despite acute mental and emotional pain, in order to avoid the further trauma of police harassment, and a large number of cases consequently go unreported. An underprivileged rape victim who tries to file a report at a police station is a classic example: she may not only be turned away but also harassed by insensitive police officials, and as a result many such victims never go, and their cases are never reported.
In this research work, we developed a computational framework called eLegalls, an LI-enabled innovation, as an effective solution to the issues stated above. eLegalls lets users file reports with the police in their geographic jurisdiction through an efficient, secure interface, without any in-person visit, helping vulnerable populations avoid unwarranted denial and potential harassment by officials at the police station. The system is also equipped with secure, pertinent features that allow lawyers and attorneys to advocate efficiently in assigned cases. eLegalls is envisioned to eventually become a successful legal-tech solution, effectively serving the community.
Introduction: There is ongoing concern about the burden placed on healthcare systems by lab tests. Although these concerns are widespread, it is difficult to quantify the extent of the problem. One approach involves use of a metric known as the Mean Abnormal Response Rate (MARR), which is the proportion of tests ordered that return an abnormal result; a higher MARR value indicates higher yield. The primary objective of this study was to calculate MARRs for tests ordered between April 2014 and March 2019 at the four adult emergency departments (EDs) covering a metropolitan population of 1.3 million. Secondary objectives included identifying tests with highest and lowest MARRs; comparison of MARRs for nurse- and physician-initiated orders; correlation of the number of tests per order requisition to MARR; and correlation of physician experience to MARR. Methods: In total, 40 laboratory tests met inclusion criteria for this study. Administrative data on these tests as ordered at the four EDs were obtained and analyzed. Multi-component test results, such as from CBC, were consolidated such that an abnormal result for any component was coded as an abnormal result for the entire test. Repeat tests ordered within a single patient visit were excluded. Physician experience was quantified for 209 ED physicians as number of years since licensure. Analyses were descriptive where appropriate for whole-population data. Risk of bias was attenuated by the focus on administrative data. Results: The population dataset comprised 33,757,004 test results on 415,665 unique patients. Of these results, 30.3% were the outcomes of nurse-initiated orders. The 5-year MARRs for the four hospitals were 38.3%, 40.0%, 40.7% and 40.9%. The highest per-test MARRs were for BNP (80.5%) and CBC (62.6%), while the lowest were for glucose (7.9%) and sodium (11.6%). MARRs were higher for nurse-initiated orders than for physician-initiated orders (44.7% vs. 
38.1%), likely due to the greater order frequency of high-yield CBC in nurse-initiated orders (38.6% vs. 18.1%). The number of tests per order requisition was inversely associated with MARR (r = -0.90, p < 0.001). Finally, the number of years since licensure was modestly but significantly associated with MARR (r = 0.28, p < 0.001). Conclusion: This is the first and largest study to apply the MARR in an ED setting. As a metric, MARR effectively identifies differences in test ordering practices on per-test and per-hospital bases, which could be useful for data-informed practice optimization.
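The MARR itself is straightforward to compute once multi-component results are consolidated as described (any abnormal component marks the whole test abnormal). A toy sketch, with hypothetical records rather than the study's data:

```python
# Sketch of the Mean Abnormal Response Rate (MARR): the proportion of ordered
# tests returning an abnormal result, with multi-component tests (e.g., CBC)
# counted abnormal if any component is abnormal.

def consolidate(components):
    """A multi-component test is abnormal if any component is abnormal."""
    return any(components)

def marr(test_results):
    """test_results: list of per-test lists of component abnormal flags."""
    abnormal = sum(consolidate(r) for r in test_results)
    return abnormal / len(test_results)

results = [
    [False, True, False],  # CBC with one abnormal component -> abnormal
    [False],               # normal glucose
    [True],                # abnormal BNP
    [False],               # normal sodium
]
print(marr(results))  # 0.5 -> MARR of 50% for this toy sample
```

Note how the consolidation rule inflates MARR for panels like CBC relative to single-analyte tests, consistent with CBC's high per-test MARR reported above.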
Worldwide, early intervention services for young people with recent-onset psychosis have been associated with improvements in outcomes, including reductions in hospitalization, symptoms, and improvements in treatment engagement and work/school participation. States have received federal mental health block grant funding to implement team-based, multi-element, evidence-based early intervention services, now called coordinated specialty care (CSC) in the USA. New York State’s CSC program, OnTrackNY, has grown into a 23-site, statewide network, serving over 1800 individuals since its 2013 inception. A state-supported intermediary organization, OnTrackCentral, has overseen the growth of OnTrackNY. OnTrackNY has been committed to quality improvement since its inception. In 2019, OnTrackNY was awarded a regional hub within the National Institute of Mental Health-sponsored Early Psychosis Intervention Network (EPINET). The participation in the national EPINET initiative reframes and expands OnTrackNY’s quality improvement activities. The national EPINET initiative aims to develop a learning healthcare system (LHS); OnTrackNY’s participation will facilitate the development of infrastructure, including a systematic approach to facilitating stakeholder input and enhancing the data and informatics infrastructure to promote quality improvement. Additionally, this infrastructure will support practice-based research to improve care. The investment of the EPINET network to build regional and national LHSs will accelerate innovations to improve quality of care.
This chapter describes the National Institute of Mental Health (NIMH) Research Domain Criteria (RDoC) initiative, illustrating how elements from different “levels” or “units” of analysis are represented as domains, constructs, and subconstructs to form the RDoC “matrix.” The example of “working memory” is used to show that the matrix possesses conceptual elements drawn from diverse theories and experiments about working memory, but lacks a controlled vocabulary and specification of putative relations among the elements, that would serve as an organizing framework. An ontology (in the informatics sense) can be developed using existing tools to represent and analyze relations between most adjacent levels, with a notable exception: the link between cellular/network systems and observable behavior, which directly confronts the mind-body problem. Finally, the chapter considers how the field may best wrangle with the RDoC matrix using bottom-up and top-down strategies, while leveraging advantages and avoiding pitfalls of computational models and artificial intelligence.
Since the launch of the Materials Genome Initiative (MGI), the field of materials informatics (MI) has emerged to remove the bottlenecks limiting the pathway towards rapid materials discovery. Although the machine learning (ML) and optimization techniques underlying MI were developed well over a decade ago, programs such as the MGI encouraged researchers to make the technical advancements that render these tools suitable for the unique challenges of materials science and engineering. Overall, MI has seen a remarkable rate of adoption over the past decade. However, for the continued growth of MI, the educational challenges associated with applying data science techniques to materials science and engineering problems must be addressed. In this paper, we discuss the growing use of materials informatics in academia and industry, highlight the need for educational advances in materials informatics, and describe the implementation of a materials informatics course into the curriculum to jump-start interested students with the skills required to succeed in materials informatics projects.
In addition to student assessment, curriculum assessment is a critical element to any pedagogy. It helps the educator assess the teaching of concepts, determine what may be lacking, and make changes for continual improvement. Meaningful assessment can be complicated when disciplines converge or when new approaches are implemented. To facilitate this, we present a network-based visualization schema to represent a materials informatics curriculum that combines materials science and data science concepts. We analyze the curriculum using network representations and relevant concepts from graph theory. This reveals established connections, linkages between materials science and data science, and the extent to which different concepts are connected. We also describe how some materials science topics are introduced from a data perspective, and present an illustrative case study from the curriculum.