Text is a major medium of contemporary interpersonal communication but is difficult for social scientists to study unless they have significant resources or the skills to build their own research platform. In this paper, we introduce a cloud-based software solution to this problem: ReChat, an online research platform for conducting experimental and observational studies of live text conversations. We demonstrate ReChat by applying it to a specific phenomenon of interest to political scientists: conversations among co-partisans. We present results from two studies, focusing on (1) self-selection factors that make chat participants systematically unrepresentative and (2) a pre-registered analysis of loquaciousness that finds a significant association between speakers’ ideological extremity and the amount they write in the chat. We conclude by discussing practical implications and advice for future practitioners of chat studies.
One pedagogical finding that has gained recent attention is the utility of active, effortful retrieval practice in effective learning. Essentially, humans learn best when they are asked to actively generate/recall knowledge for themselves, rather than receiving knowledge passively. In this paper, we (a) provide a framework for both practice and assessment within which students can organically develop active study habits, (b) share resources we have built to help implement such a framework in the linguistics classroom, and (c) provide some examples and evaluation of their success in the context of an introductory phonetics/phonology course.
Chapter 7 shows how, in the 1980s, patent law came to view computer-related subject matter through the lens of ‘abstractness’, and the role that materiality played in determining the fate of that subject matter. The chapter also looks at how, as a result of changes in technology, patent law gradually shifted its focus from the materiality of the subject matter to its specificity, and how, in so doing, the subject matter was dematerialised.
After looking at how software was created and consumed in the 1960s and, as this changed, how it gave rise to questions about the role intellectual property might play in the emerging software industry, Chapter 5 looks at the contrasting ways that patentable subject matter was seen within the information technology industry and how these views were received within the law.
Chapter 6 looks at the problems patent law experienced in the 1960s and 1970s in attempting to reconcile the conflicting views of the industry about what the subject matter was and how it should be interpreted.
A Microsoft® Visual Basic program, WinClbclas, has been developed to calculate the chemical formulae of columbite-supergroup minerals from wet-chemical and electron-microprobe analyses, using the current nomenclature scheme adopted by the Commission on New Minerals, Nomenclature and Classification (CNMNC) of the International Mineralogical Association (IMA) for columbite-supergroup minerals. The program evaluates 36 IMA-approved species, three species questionable in terms of their unit-cell parameters, four insufficiently studied questionable species and one ungrouped species, all according to the dominant valence and constituent status in five mineral groups: ixiolite (MO2), wolframite (M1M2O4), samarskite (ABM2O8), columbite (M1M2O6) and wodginite (M1M2M32O8). Mineral compositions of the columbite supergroup are calculated on the basis of 24 oxygen atoms per formula unit. Alternatively, the program can estimate the formulae of the five groups, from ixiolite to wodginite, on the basis of the cation and anion totals in their typical mineral formulae (e.g. 4 cations and 8 oxygens for the wodginite group) with normalisation procedures. Fe3+ and Fe2+ contents are estimated from microprobe-derived total FeO (wt.%) by stoichiometric constraints. WinClbclas allows users to: (1) enter up to 47 input variables for mineral compositions; (2) type and load multiple columbite-supergroup mineral compositions in the data entry section; (3) edit and load the Microsoft® Excel files used in calculating, classifying and naming columbite-supergroup minerals, together with the total monovalent to hexavalent ion contents; and (4) store all calculated parameters in a Microsoft® Excel output file for further data evaluation. The program is distributed as a self-extracting setup file that includes the support files used by the program, a help file and representative sample data files.
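The fixed-oxygen normalisation mentioned above (24 oxygen atoms per formula unit) is a standard step in mineral formula calculation. The sketch below is a minimal illustration of that step only, not WinClbclas's actual algorithm; the oxide list and the example composition are simplified and made up.

```python
# Minimal sketch of the standard normalisation step in formula
# calculation: convert oxide wt.% to cations per formula unit on a
# fixed-oxygen basis (24 O for the columbite supergroup).
# Oxide selection and the sample composition are illustrative only.

OXIDES = {
    # oxide: (molar mass g/mol, cations per oxide, oxygens per oxide)
    "Nb2O5": (265.81, 2, 5),
    "Ta2O5": (441.89, 2, 5),
    "FeO":   (71.84, 1, 1),
    "MnO":   (70.94, 1, 1),
}

def apfu(wt_percent, oxygen_basis=24.0):
    """Cations per formula unit, normalised to a fixed oxygen basis."""
    moles_cat, moles_ox = {}, 0.0
    for ox, wt in wt_percent.items():
        mm, n_cat, n_ox = OXIDES[ox]
        mol = wt / mm                 # moles of oxide per 100 g
        moles_cat[ox] = mol * n_cat   # moles of cations
        moles_ox += mol * n_ox        # moles of oxygen
    scale = oxygen_basis / moles_ox   # rescale so oxygens sum to the basis
    return {ox: m * scale for ox, m in moles_cat.items()}

# Example: a simplified ferrocolumbite-like composition (wt.%)
comp = {"Nb2O5": 70.0, "Ta2O5": 8.0, "FeO": 14.0, "MnO": 6.0}
for ox, n in apfu(comp).items():
    print(f"{ox}: {n:.3f} cations per 24 O")
```

The same routine generalises to the per-group bases the abstract mentions (e.g. normalising to 8 oxygens for the wodginite group) by changing `oxygen_basis`.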
Medical devices increasingly include software components, which facilitate remote patient monitoring. The introduction of software into previously analog medical devices, as well as innovation in software-driven devices, may introduce new safety concerns – all the more so when such devices are used in patients’ homes, well outside of traditional health care delivery settings. We review four key mechanisms for the post-market surveillance of medical devices in the United States: (1) Post-market trials and registries; (2) manufacturing plant inspections; (3) adverse event reporting; and (4) recalls. We use comprehensive regulatory data documenting adverse events and recalls to describe trends in the post-market safety of medical devices, based on the presence or absence of software. Overall, devices with software are associated with more reported adverse events (i.e. individual injuries and deaths) and more high-severity recalls, compared to devices without software. However, in subgroup analyses of individual medical specialties, we consistently observe differences in recall probability but do not consistently detect differences in adverse events. These results suggest that adverse events are a noisy signal of post-market safety and not necessarily a reliable predictor of subsequent recalls. As patients and health care providers weigh the benefits of new remote monitoring technologies against potential safety issues, they should not assume that safety concerns will be readily identifiable through existing post-market surveillance mechanisms. Both health care providers and developers of remote patient monitoring technologies should therefore consider how they might proactively ensure that newly introduced remote patient monitoring technologies work safely and as intended.
This book takes as its starting point recent debates over the dematerialisation of subject matter which have arisen because of changes in information technology, molecular biology, and related fields that produced a subject matter with no obvious material form or trace. Arguing against the idea that dematerialisation is a uniquely twenty-first century problem, this book looks at three situations where US patent law has already dealt with a dematerialised subject matter: nineteenth century chemical inventions, computer-related inventions in the 1970s, and biological subject matter across the twentieth century. In looking at what we can learn from these historical accounts about how the law responded to a dematerialised subject matter and the role that science and technology played in that process, this book provides a history of patentable subject matter in the United States. This title is available as Open Access on Cambridge Core.
A modern automatic weather station (AWS) is a sophisticated collection of components, sensors and electronics modules tied together by software, together making up a data acquisition and processing system. Many of today's and tomorrow's products follow a broadly similar set of basic processes, and this chapter sets out to explain these basic processing steps, keeping technical terminology to a minimum and illustrating three different approaches to ‘system architecture’. The overview provided by this chapter offers familiarity with the key concepts, system approaches and application types; from there, users can review potential products and suppliers using Internet search facilities to gather up-to-date product information.
The differences between AI software and normal software are important, as they have implications for how a transaction in AI software will be treated under sales law. Next, what it means to own an AI system – whether it is a chattel, merely software, or something more than software – is explored. If AI is merely software, it will be protected by copyright, but there will be problems with licensing. If, however, AI is encapsulated in a physical medium, the transaction may be treated as a sale of goods, or a sui generis position may be taken. A detailed analysis of the Court of Justice of the European Union’s decision in Computer Associates v The Software Incubator is provided. An AI transaction can be regarded as a sale of goods. Because the sale-of-goods regime is insufficient, a transaction regime for AI systems has to be developed, which includes ownership and fair use (assuming AI is regarded as merely software) and the right to repair (whether AI is treated as goods or software).
Germany’s 2019 Digital Healthcare Act (Digitale-Versorgung-Gesetz, or DVG) created a number of opportunities for the digital transformation of the healthcare delivery system. Key among these was the creation of a reimbursement pathway for patient-centered digital health applications (digitale Gesundheitsanwendungen, or DiGA). Worldwide, this is the first structured pathway for “prescribable” health applications at scale. As of October 10, 2023, 49 DiGA were listed in the official directory maintained by Germany’s Federal Institute for Drugs and Medical Devices (BfArM); these are prescribable by physicians and psychotherapists and reimbursed by the German statutory health insurance system for all its 73 million beneficiaries. Looking ahead, a major challenge facing DiGA manufacturers will be the generation of the evidence required for ongoing price negotiations and reimbursement. Current health technology assessment (HTA) methods will need to be adapted for DiGA.
Methods
We describe the core issues that distinguish HTA in this setting: (i) explicit allowance for more flexible research designs; (ii) the nature of initial evidence generation, which can be delivered (in its final form) up to one year after a product becomes reimbursable; and (iii) the dynamic nature of both product development and product evaluation. We present the digital health applications in the German DiGA scheme as a case study and highlight the role of real-world evidence (RWE) in the successful evaluation of DiGA on an ongoing basis.
Results
When a DiGA is likely to be updated and assessed regularly, full-scale randomized controlled trials are infeasible; we therefore make the case for using real-world data (RWD) and real-world evidence (RWE) for dynamic HTAs.
Conclusions
Continuous evaluation using real-world data is a regulatory innovation that can help improve the quality of DiGA on the market.
REAP-2 is an interactive dose-response curve estimation tool for Robust and Efficient Assessment of drug Potency. It provides user-friendly dose-response curve estimation for in vitro studies and conducts statistical testing for model comparisons with a redesigned user interface. We also make a major update of the underlying estimation method with penalized beta regression, which demonstrates great reliability and accuracy in dose estimation and uncertainty quantification. In this note, we describe the method and implementation of REAP-2 with a highlight on potency estimation and drug comparison.
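REAP-2's own estimator is a penalized beta regression, which is not reproduced here. Purely as an illustration of the underlying task (estimating a potency value such as an IC50 from dose-response data), the sketch below interpolates the half-maximal response on the log-dose scale; the doses and responses are made-up numbers.

```python
import numpy as np

# Illustrative sketch only: REAP-2 itself uses penalized beta regression.
# Here we estimate a crude IC50 by linear interpolation of the response
# on the log10(dose) scale.

def ic50_interpolate(doses, responses):
    """Dose at 50% response, by interpolation on log10(dose).

    Assumes responses decrease monotonically from ~1 (no effect)
    to ~0 (full inhibition) as dose increases.
    """
    log_d = np.log10(doses)
    # np.interp needs ascending x, so flip to make responses ascending
    # and interpolate response -> log dose.
    log_ic50 = np.interp(0.5, responses[::-1], log_d[::-1])
    return 10 ** log_ic50

doses = np.array([0.01, 0.1, 1.0, 10.0, 100.0])      # e.g. µM
responses = np.array([0.95, 0.85, 0.55, 0.20, 0.05])  # fraction of control
print(f"IC50 ≈ {ic50_interpolate(doses, responses):.2f}")
```

A model-based fit (as in REAP-2) additionally yields uncertainty quantification and supports formal model comparisons between drugs, which simple interpolation cannot.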
In Chapter 1 the different medical study designs are discussed and the difference between age, period and cohort effects is explained. Furthermore, some general information (e.g. prior knowledge, software used for the examples) needed to work through the book is provided. Finally, there is a short section in which the differences between the second and third edition are outlined.
An emergent volume electron microscopy technique called cryogenic serial plasma focused ion beam milling scanning electron microscopy (pFIB/SEM) can decipher complex biological structures by building a three-dimensional picture of biological samples at mesoscale resolution. This is achieved by collecting consecutive SEM images after successive rounds of FIB milling that expose a new surface after each milling step. Due to instrumental limitations, some image processing is necessary before 3D visualization and analysis of the data is possible. SEM images are affected by noise, drift, and charging effects that can make precise 3D reconstruction of biological features difficult. This article presents Okapi-EM, an open-source napari plugin developed to process and analyze cryogenic serial pFIB/SEM images. Okapi-EM enables automated image registration of slices, evaluation of image quality metrics specific to pFIB/SEM imaging, and mitigation of charging artifacts. Implementation of Okapi-EM within the napari framework ensures that the tools are both user- and developer-friendly, through provision of a graphical user interface and access to Python programming.
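Slice-to-slice drift correction of the kind Okapi-EM automates is commonly built on FFT phase correlation. The sketch below is a generic, NumPy-only illustration of that standard technique for integer-pixel drift, not Okapi-EM's actual implementation.

```python
import numpy as np

# Generic sketch of FFT phase correlation, a standard way to estimate
# integer-pixel drift between consecutive slices. Not Okapi-EM's code.

def estimate_drift(ref, moving):
    """Return (dy, dx) such that rolling `moving` by (dy, dx) aligns it to `ref`."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(moving)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12        # keep phase only
    corr = np.fft.ifft2(cross).real       # sharp peak at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map peak indices to signed shifts
    h, w = ref.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

# Synthetic check: displace an image by (3, -5) and recover the drift
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, (3, -5), axis=(0, 1))
print(estimate_drift(img, shifted))
```

Real pFIB/SEM data also needs subpixel refinement and robustness to the noise and charging artifacts the abstract describes, which is where a dedicated tool earns its keep.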
Because of the coronavirus disease (COVID-19) pandemic, we have faced a number of challenges and scarcities in Iran. The lack of personal protective equipment (PPE) is one of the most pressing problems and can have damaging consequences for the health system. In this letter, we introduce software that can help hospitals manage their PPE in terms of purchasing, distribution, and predicting future needs over different time intervals. The software has several distinctive features, including superior speed, cost management, a managerial dashboard, a wide range of applicability, comprehensiveness, supply chain management, and quality appraisal. We hope that our findings can assist health authorities in planning and optimizing the use of PPE in the response to COVID-19, where shortages may occur due to supply chain issues.
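The letter does not detail the software's forecasting method. As a simple illustration of what "predicting future needs over a time interval" can mean in practice, here is a moving-average demand projection; the daily usage figures are hypothetical.

```python
# Illustrative only: a moving-average projection of PPE demand.
# The actual software's forecasting method is not described in the letter.

def forecast_demand(daily_usage, horizon_days, window=7):
    """Project total demand over `horizon_days` from the mean of the
    last `window` daily observations."""
    recent = daily_usage[-window:]
    daily_rate = sum(recent) / len(recent)
    return daily_rate * horizon_days

# Hypothetical daily N95 mask consumption over the past two weeks
usage = [120, 135, 128, 140, 150, 145, 160,
         158, 162, 170, 165, 172, 180, 175]
print(f"Projected 30-day need: {forecast_demand(usage, 30):.0f} masks")
```

A production system would add trend and seasonality terms and safety stock, but even this crude projection shows how usage logs translate into purchasing targets.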
This article introduces a training simulator for electron beam alignment using Ronchigrams. The interactive web application, www.ronchigram.com, is an advanced educational tool aimed at making scanning transmission electron microscopy (STEM) more accessible and open. For experienced microscopists, the tool offers on-hand quantification of simulated Ronchigrams and their resolution limits.
Triage is a tool used to determine the severity of patients' illness or injury within minutes of arrival. This study aims to prospectively assess the reliability and validity of a new computer-based triage decision support tool, ANKUTRIAGE.
Methods:
ANKUTRIAGE, a 5-level triage tool, was developed considering 2 major factors: the patient's vital signs and the characteristics of the admission complaint. Adult patients admitted to the ED between July and October 2019 were consecutively and independently double-triaged by 2 assessors using the ANKUTRIAGE system. To measure inter-rater reliability, quadratic-weighted kappa coefficients (Kw) were calculated. For validity, associations among urgency levels, resource use, and clinical outcomes were evaluated.
Results:
The inter-rater reliability between users of ANKUTRIAGE was excellent, with an agreement coefficient (Kw) greater than 0.8 in all compared groups. In the validity phase, hospitalization rate, intensive care unit admission, and mortality rate decreased from level 1 to level 5. Likewise, resource use decreased significantly as urgency decreased across the triage levels (P < 0.05).
Conclusions:
ANKUTRIAGE proved to be a valid and reliable tool in the emergency department. The results showed that displaying the key discriminator for each complaint to assist decision-making leads to high inter-rater agreement, with good correlation between urgency levels and clinical outcomes, as well as between urgency levels and resource consumption.
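The quadratic-weighted kappa (Kw) used in the methods above can be computed directly from the two assessors' ratings. The sketch below implements the standard definition; the example ratings are hypothetical, with levels running 1 (most urgent) to 5.

```python
import numpy as np

# Standard quadratic-weighted kappa (Kw) for two raters on an ordinal
# scale; the example ratings below are hypothetical.

def quadratic_weighted_kappa(r1, r2, n_levels=5):
    r1 = np.asarray(r1) - 1          # levels 1..n -> 0-based indices
    r2 = np.asarray(r2) - 1
    # observed agreement matrix (rows: rater 1, cols: rater 2)
    O = np.zeros((n_levels, n_levels))
    for a, b in zip(r1, r2):
        O[a, b] += 1
    O /= O.sum()
    # expected matrix if the two raters were independent
    E = np.outer(O.sum(axis=1), O.sum(axis=0))
    # quadratic disagreement weights: penalty grows with squared distance
    i, j = np.indices((n_levels, n_levels))
    W = (i - j) ** 2 / (n_levels - 1) ** 2
    return 1 - (W * O).sum() / (W * E).sum()

rater1 = [1, 2, 2, 3, 3, 3, 4, 4, 5, 5, 2, 3]
rater2 = [1, 2, 3, 3, 3, 2, 4, 4, 5, 4, 2, 3]
print(f"Kw = {quadratic_weighted_kappa(rater1, rater2):.3f}")
```

Unlike unweighted kappa, Kw penalizes a level-1 vs level-5 disagreement far more than a level-2 vs level-3 one, which is why it is the usual choice for ordinal triage scales.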
With the rise of digital technologies the number and diversity of related tools (such as phones, computers, 3-D printers, etc.) have markedly increased. This chapter examines how digital objects and other new technologies alter human experiences with the material world.
NeXL is a collection of Julia language packages (libraries) for X-ray microanalysis data processing. NeXLCore provides basic atomic and X-ray physics data and models, including support for microanalysis-related data types for materials and k-ratios. NeXLMatrixCorrection provides algorithms for matrix correction and iteration. NeXLSpectrum provides utilities and tools for energy-dispersive X-ray spectrum and hyperspectrum analysis, including display, manipulation, and fitting. NeXL is integrated with the Julia language infrastructure and builds on the Gadfly plotting library and the DataFrames tabular data library. When combined with the DrWatson package, NeXL can provide a highly reproducible environment in which to process microanalysis data. Data availability and reproducible data analysis are two keys to scientific reproducibility. Not only should readers of journal articles have access to the data, they should also be able to reproduce the analysis steps that take the data to final results. This paper will both discuss the NeXL framework and provide examples of how it can be used for reproducible data analysis.