
QUALITY ASSURANCE OF REGISTRIES FOR HEALTH TECHNOLOGY ASSESSMENT

Published online by Cambridge University Press:  25 September 2018

Kate L. Mandeville
Affiliation:
London School of Hygiene & Tropical Medicine
Maja Valentic
Affiliation:
Croatian Institute of Public Health
Damir Ivankovic
Affiliation:
Croatian Institute of Public Health
Ivan Pristas
Affiliation:
Croatian Institute of Public Health
Jae Long
Affiliation:
National Institute for Health and Care Excellence
Hannah E. Patrick
Affiliation:
National Institute for Health and Care Excellence
hannahpatrick@nice.org.uk

Abstract

Objectives:

The aim of this study was to identify guidelines and assessment tools used by health technology assessment (HTA) agencies for quality assurance of registries, and to investigate the current use of registry data by HTA organizations worldwide.

Methods:

As part of a European Network for Health Technology Assessment Joint Action work package, we undertook a literature search and sent a questionnaire to all partner organizations on the work package and all organizations listed in the International Society for Pharmacoeconomics and Outcomes Research directory.

Results:

We identified thirteen relevant documents relating to quality assurance of registries. We received fifty-five responses from organizations representing twenty-one different countries, a response rate of 40.5 percent (43/110). Many agencies, particularly in Europe, already draw on a range of registries to provide data for their HTAs. Fewer than half, however, use criteria or standards to assess the quality of registry data. Nearly all criteria or standards in use were defined internally by the organizations themselves rather than drawn from those produced by an external body. A comparison of internal and external standards identified consistency across several quality dimensions, which can serve as a starting point for the development of a standardized tool.

Conclusion:

The use of registry data is more prevalent than expected, reinforcing the need for a standardized registry quality assessment tool. A user-friendly tool developed in conjunction with stakeholders will support the consistent application of approved quality standards and reassure critics who have traditionally considered registry data to be unreliable.

Type: Method
Copyright © Cambridge University Press 2018


Footnotes

This work was funded by the European Network for Health Technology Assessment, Joint Action 3, Work Package 5. The authors are most grateful to all survey contributors.

Supplementary material

Mandeville et al. supplementary material

Tables S1-S13 and Figure S1
