
Cross-design Synthesis: A New Form of Meta-analysis for Combining Results from Randomized Clinical Trials and Medical-practice Databases

Published online by Cambridge University Press:  10 March 2009

Judith Droitcour
Affiliation:
U.S. General Accounting Office
George Silberman
Affiliation:
U.S. General Accounting Office
Eleanor Chelimsky
Affiliation:
U.S. General Accounting Office

Abstract

Cross-design synthesis is a new (and still evolving) strategy for providing quantitative results that capture the strengths and minimize the weaknesses of different kinds of research. The strategy, which is being developed to answer questions about the effects of treatment in medical practice, includes (a) identifying complementary research designs and studies conducted according to those designs; (b) completing an in-depth assessment of each study with respect to the chief potential bias(es) that are associated with its design; (c) making “secondary adjustments” of study results to correct known biases; and (d) developing synthesis frameworks and models that will minimize the impact of hidden biases.
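Steps (c) and (d) can be illustrated with a minimal sketch: adjust the database estimate for a known bias, then pool the adjusted estimates by inverse-variance weighting. This is only one of many possible synthesis frameworks, and every number below (effect sizes, variances, the bias term) is an illustrative assumption, not a figure from the article.

```python
def pooled_estimate(studies):
    """Fixed-effect (inverse-variance) pooling of a list of
    {"est": effect, "var": variance} dicts."""
    weights = [1.0 / s["var"] for s in studies]
    est = sum(w * s["est"] for w, s in zip(weights, studies)) / sum(weights)
    var = 1.0 / sum(weights)
    return est, var

# Hypothetical inputs: a randomized-trial estimate and a medical-practice
# database estimate. Per step (c), the database result is first adjusted
# for an assumed, known selection bias before being pooled in step (d).
rct = {"est": -0.30, "var": 0.04}
db_raw = {"est": -0.50, "var": 0.01}
assumed_bias = -0.20   # illustrative bias component of the raw database estimate
db_adjusted = {"est": db_raw["est"] - assumed_bias, "var": db_raw["var"]}

est, var = pooled_estimate([rct, db_adjusted])
```

Note that the precise database (larger sample, smaller variance) dominates the pooled result, which is why the bias adjustment in step (c) matters before pooling.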

Type
Statistics
Copyright
Copyright © Cambridge University Press 1993


References

1. Barnett, H. J. M., Sackett, D., Taylor, D. W., et al. Are the results of the Extracranial-Intracranial Bypass Trial generalizable? New England Journal of Medicine, 1987, 316, 820–24.
2. Byar, D. P. Why data bases should not replace randomized clinical trials. Biometrics, 1980, 36, 337–42.
3. Byar, D. P., Schoenfeld, D. A., Green, S. B., et al. Design considerations for AIDS trials. New England Journal of Medicine, 1990, 323, 1343–48.
4. Califf, R. M., Pryor, D. B., & Greenfield, J. C. Beyond randomized clinical trials: Applying clinical experience in the treatment of patients with coronary artery disease. Circulation, 1986, 74, 1191–94.
5. Chalmers, T. C., Smith, H. Jr., Blackburn, B., et al. A method for assessing the quality of a randomized control trial. Controlled Clinical Trials, 1981, 2, 31–49.
6. Colditz, G., Miller, J., & Mosteller, F. The effect of study design on gain in evaluation of new treatments in medicine and surgery. Drug Information Journal, 1988, 22, 343–52.
7. Cordray, D. S. An assessment from the policy perspective. In Wachter, K. W. & Straf, M. L. (eds.), The future of meta-analysis. New York: Russell Sage Foundation, 1990.
8. Cordray, D. S. Strengthening causal interpretations of nonexperimental data: The role of meta-analysis. In Sechrest, L., et al. (eds.), Research methodology: Strengthening causal interpretations of nonexperimental data. Rockville, MD: Agency for Health Care Policy and Research, 1990.
9. Davis, K. The comprehensive cohort study: The use of registry data to confirm and extend a randomized trial. Recent Results in Cancer Research, vol. 111. Berlin-Heidelberg: Springer-Verlag, 1988.
10. Eddy, D. M., Hasselblad, V., & Shachter, R. Meta-analysis by the confidence profile method: The statistical synthesis of evidence. Boston, MA: Academic Press (Harcourt Brace Jovanovich), 1992.
11. Eddy, D. M., Hasselblad, V., & Shachter, R. The statistical synthesis of evidence: Meta-analysis by the confidence profile method. Report issued by the Center for Health Policy Research and Education, Duke University, and by the Department of Engineering-Economic Systems, Stanford University, 1989.
12. Edlund, M. J., Craig, T. J., & Richardson, M. A. Informed consent as a form of volunteer bias. American Journal of Psychiatry, 1985, 142, 624–27.
13. Ellenberg, S. Meta-analysis: The quantitative approach to research review. Seminars in Oncology, 1988, 15, 472–81.
14. Ellwood, P. M. A technology of patient experience. New England Journal of Medicine, 1988, 318, 1549–56.
15. Fischerman, K., & Mouridsen, H. T. Danish Breast Cancer Cooperative Group (DBCG): Structure and results of the organization. Acta Oncologica, 1988, 27, 593–96.
16. Fisher, R. A. Statistical methods for research workers, 1st ed. Edinburgh: Oliver and Boyd, 1925.
17. Fisher, R. A. The design of experiments. Edinburgh: Oliver and Boyd, 1935.
18. Glass, G. V. Primary, secondary, and meta-analysis of research. Educational Researcher, 1976, 6, 3–8.
19. Glass, G. V., McGaw, B., & Smith, M. L. Meta-analysis in social research. Beverly Hills, CA: Sage, 1981.
20. Hedges, L., & Olkin, I. Statistical methods for meta-analysis. New York: Academic Press, 1985.
21. Himel, H. N., Liberati, A., Gelber, R., & Chalmers, T. C. Adjuvant chemotherapy for breast cancer: A pooled estimate based on published randomized control trials. Journal of the American Medical Association, 1986, 256, 1148–59.
22. Hlatky, M. A. Using databases to evaluate therapy. Statistics in Medicine, 1991, 10, 647–52.
23. Hlatky, M. A., Califf, R. M., Harrell, F. E. Jr., et al. Comparison of predictions based on observational data with the results of controlled clinical trials of coronary artery bypass surgery. Journal of the American College of Cardiology, 1988, 11, 237–45.
24. Jackson, G. B. Methods for integrative reviews. Review of Educational Research, 1980, 50, 438–60. (Reprinted in Light, R. J. (ed.), Evaluation studies review annual, vol. 8. Beverly Hills, CA: Sage, 1983.)
25. Krakauer, H. Assessment of alternative technologies for the treatment of end-stage renal disease. Israel Journal of Medical Sciences, 1986, 22, 245–59.
26. Krakauer, H., & Bailey, R. C. Epidemiological oversight of the medical care provided to Medicare beneficiaries. Statistics in Medicine, 1991, 10, 521–40.
27. Lichtman, S. M., & Budman, D. R. Letter to the editor. New England Journal of Medicine, 1989, 321, 470.
28. Light, R. J., & Pillemer, D. B. Summing up: The science of reviewing research. Cambridge, MA: Harvard University Press, 1984.
29. Lipsey, M. Juvenile delinquency treatment: A meta-analytic inquiry into the variability of effects. In Cook, T. D., Cooper, H., Cordray, D. S., et al. (eds.), Meta-analysis for explanation: A casebook. New York: Russell Sage Foundation, 1992.
30. Louis, T. A., Fineberg, H. V., & Mosteller, F. Findings for public health from meta-analyses. Annual Review of Public Health, 1985, 6, 1–20.
31. McDonald, C., & Hui, S. The analysis of humongous databases: Problems and promises. Statistics in Medicine, 1991, 10, 511–18.
32. Moffitt, R. Program evaluation with nonexperimental data. Evaluation Review, 1991, 15(3), 291–314.
33. Mosteller, F. Improving research methodology: An overview. In Sechrest, L., et al. (eds.), Research methodology: Strengthening causal interpretations of nonexperimental data. Rockville, MD: Agency for Health Care Policy and Research, 1990.
34. Peto, R. Why do we need systematic overviews of randomized trials? Statistics in Medicine, 1987, 6, 233–40.
35. Pocock, S. Clinical trials: A practical approach. New York: Wiley, 1983.
36. Pryor, D. B., Califf, R. M., Harrell, F. E. Jr., et al. Clinical data bases: Accomplishments and unrealized potential. Medical Care, 1985, 23(5), 623–47.
37. Roos, N., Wennberg, J., Malenka, D., et al. Mortality and reoperation after open and transurethral resection of the prostate for benign prostatic hyperplasia. New England Journal of Medicine, 1989, 320, 1120–24.
38. Roper, W. L., Winkenwerder, W., Hackbarth, G. M., & Krakauer, H. Effectiveness in health care: An initiative to evaluate and improve medical practice. New England Journal of Medicine, 1988, 319, 1197–1202.
39. Rosenthal, R. Meta-analytic procedures for social research. Beverly Hills, CA: Sage, 1984.
40. Rubin, D. B. Practical implications of modes of statistical inference for causal effects and the critical role of the assignment mechanism. Biometrics, 1991, 47(4), 1213–34.
41. Rubin, D. B. A new perspective. In Wachter, K. W. & Straf, M. L. (eds.), The future of meta-analysis. New York: Russell Sage Foundation, 1990.
42. Rubin, D. B. Estimating causal effects of treatments in randomized and nonrandomized studies. Journal of Educational Psychology, 1974, 66, 688–701.
43. Schooler, N. R. How generalizable are the results of clinical trials? Psychopharmacology Bulletin, 1980, 16, 29–31.
44. Taylor, K. M., Margolese, R. G., & Soskolne, C. L. Physicians' reasons for not entering eligible patients in a randomized clinical trial of surgery for breast cancer. New England Journal of Medicine, 1984, 310, 1363–67.
45. U.S. General Accounting Office. Cross design synthesis: A new strategy for medical effectiveness research (GAO/PEMD-92-18). Washington, DC: U.S. General Accounting Office, 1992.
46. U.S. General Accounting Office. Breast cancer: Patients' survival (GAO/PEMD-89-9). Washington, DC: U.S. General Accounting Office, 1989.
47. U.S. General Accounting Office, forthcoming.
48. Wennberg, J. E., Freeman, J. L., Shelton, R. M., & Bubolz, T. A. Hospital use and mortality among Medicare beneficiaries in Boston and New Haven. New England Journal of Medicine, 1989, 321, 1168–73.
49. Wortman, P. M., & Yeaton, W. H. Synthesis of results in controlled trials of coronary artery bypass graft surgery. In Light, R. J. (ed.), Evaluation studies review annual, 1983, 8, 536–51. Beverly Hills, CA: Sage.