Survey of the methods and reporting practices in published meta-analyses of test performance: 1987 to 2009

Issa J. Dahabreh, Mei Chung, Georgios D. Kitsios, Teruhiko Terasawa, Gowri Raman, Athina Tatsioni, Annette Tobar, Joseph Lau, Thomas A. Trikalinos, Christopher H. Schmid

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)

Abstract

We performed a survey of meta-analyses of test performance to describe the evolution of their methods and reporting. Studies were identified through MEDLINE (1966-2009), reference lists, and relevant reviews. We extracted information on clinical topics, literature review methods, quality assessment, and statistical analyses. We reviewed 760 publications reporting meta-analyses of test performance, published between 1987 and 2009. Eligible reviews included a median of 18 primary studies that were used in quantitative analyses. The most common clinical areas were oncology (25%) and cardiovascular disease (21%); the most common test categories were imaging (44%) and biomarker tests (28%). Assessment of verification and spectrum bias, blinding, prospective study design, and consecutive patient recruitment became more common over time (p < 0.001 comparing reviews published through 2004 vs those published from 2005 onwards). These changes coincided with the increasing use of checklists to guide assessment of methodological quality. Heterogeneity tests were used in 58% of meta-analyses; subgroup or regression analyses were used in 57%. Random effects models were employed in 57% of meta-analyses (38% through 2004 vs 72% from 2005 onwards; p < 0.001). Use of bivariate models of sensitivity and specificity increased in recent years (21% in 2008-2009 vs 7% in earlier years; p < 0.001). Methods employed in meta-analyses of test performance have improved with the introduction of quality assessment checklists and the development of more sophisticated statistical methods.
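
Note on the bivariate approach (added for context; a minimal sketch under assumptions, not a parameterization taken from this article): bivariate models of sensitivity and specificity, such as the one referred to above, typically treat each study's true logit sensitivity and logit specificity as a pair drawn from a bivariate normal distribution across studies, for example

\[
\begin{pmatrix} \operatorname{logit}(\mathit{Se}_i) \\ \operatorname{logit}(\mathit{Sp}_i) \end{pmatrix}
\sim
\mathcal{N}\!\left(
\begin{pmatrix} \mu_{Se} \\ \mu_{Sp} \end{pmatrix},
\begin{pmatrix} \sigma_{Se}^2 & \sigma_{SeSp} \\ \sigma_{SeSp} & \sigma_{Sp}^2 \end{pmatrix}
\right),
\]

where \(\mu_{Se}\) and \(\mu_{Sp}\) are the mean logit sensitivity and specificity, the variances capture between-study heterogeneity, and \(\sigma_{SeSp}\) allows for the (often negative) correlation between sensitivity and specificity; within-study variability is modelled separately (e.g., binomially) at the individual study level.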

Original language: English
Pages (from-to): 242-255
Number of pages: 14
Journal: Research Synthesis Methods
Volume: 4
Issue number: 3
DOIs
Publication status: Published - 01-09-2013
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Education

