Case study of the comparison of data from conference abstracts and full-text articles in health technology assessment of rapidly evolving technologies: Does it make a difference?
Published online by Cambridge University Press: 09 August 2006
Abstract
Objectives: The aim of this study was to examine (i) the consistency of research findings reported in conference abstracts and presentations and in subsequent full publications, (ii) the ability to judge the methodological quality of trials from conference abstracts and presentations, and (iii) the effect of including or excluding data from these sources on the pooled effect estimates in a meta-analysis.
Methods: This report is a case study of a selected health technology assessment review (TAR) of a rapidly evolving technology that had identified and included a meta-analysis of trial data from conference abstracts and presentations.
Results: The overall quality of reporting in abstracts and presentations was poor, especially in abstracts. Reporting of data in the abstracts/presentations was incomplete or inconsistent. Most often, inconsistencies arose between conference slide presentations and the data reported in published full-text articles. Sensitivity analyses indicated that using data only from published papers would not have altered the direction of any of the results when compared with those using both published and abstract data. However, the statistical significance of three of ten results would have changed. If conference abstracts and presentations had been excluded from the early analysis, the direction of effect and statistical significance would have changed in one result. The overall conclusions of the original analysis would not have been altered.
Conclusions: There are inconsistencies between data presented in conference abstracts/presentations and data reported in subsequent published reports. These inconsistencies could affect the final assessment results. Data discrepancies identified across sources included in TARs should be highlighted and their impact assessed and discussed. Sensitivity analyses should be carried out with and without abstract/presentation data included in the analysis. Incomplete reporting in conference abstracts and presentations limits the ability of reviewers to confidently assess the methodological quality of trials.
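The sensitivity analysis recommended above — pooling once with all trials and once with abstract/presentation-only trials excluded, then comparing direction and statistical significance — can be sketched with a fixed-effect inverse-variance meta-analysis. The trial data below are entirely hypothetical and serve only to illustrate the mechanics; the original review's data and model are not reproduced here.

```python
import math

# Hypothetical trial data: (log odds ratio, standard error, source).
# "abstract" marks trials available only as conference abstracts/presentations.
trials = [
    (-0.40, 0.20, "full-text"),
    (-0.25, 0.15, "full-text"),
    (-0.60, 0.30, "abstract"),
    (-0.10, 0.25, "abstract"),
]

def pooled_estimate(data):
    """Fixed-effect inverse-variance pooled log odds ratio with 95% CI."""
    weights = [1.0 / se ** 2 for _, se, _ in data]
    total = sum(weights)
    est = sum(w * lor for w, (lor, _, _) in zip(weights, data)) / total
    se_pooled = math.sqrt(1.0 / total)
    return est, (est - 1.96 * se_pooled, est + 1.96 * se_pooled)

# Primary analysis pools all sources; the sensitivity analysis
# repeats the pooling with full-text publications only.
all_est, all_ci = pooled_estimate(trials)
ft_est, ft_ci = pooled_estimate([t for t in trials if t[2] == "full-text"])

# Compare direction of effect and statistical significance
# (significant if the 95% CI excludes zero).
same_direction = (all_est < 0) == (ft_est < 0)
sig_all = all_ci[0] > 0 or all_ci[1] < 0
sig_ft = ft_ci[0] > 0 or ft_ci[1] < 0
```

Reporting both pooled estimates side by side, as the conclusions advise, makes it transparent whether abstract-derived data drive the direction or significance of any result.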
- Type
- GENERAL ESSAYS
- Information
- International Journal of Technology Assessment in Health Care, Volume 22, Issue 3, July 2006, pp. 288-294
- Copyright
- © 2006 Cambridge University Press