Context The relative merits of various study designs and their placement in
hierarchies of evidence are often discussed. However, there is limited knowledge
about the relative citation impact of articles using various study designs.
Objective To determine whether the type of study design affects the rate of citation
in subsequent articles.
Design and Setting We measured the citation impact of a sample of 2646 articles published in 1991 and in 2001 that used various study designs—including meta-analyses, randomized controlled trials, cohort studies, case-control studies, case reports, nonsystematic reviews, and decision or cost-effectiveness analyses.
Main Outcome Measure Citation count through the end of the second year after the year of publication and total citations received.
Results Meta-analyses received more citations than any other study design both
in 1991 (P<.05 for all comparisons) and in 2001
(P<.001 for all comparisons) and both in the first
2 years and in the longer term. In the first 2 years, 32.4% of meta-analyses published in 1991 and 43.6% of those published in 2001 received more than 10 citations. Randomized controlled trials did not differ significantly from epidemiological studies and nonsystematic review articles in 1991 but were clearly the second most-cited study design in 2001. Epidemiological studies,
nonsystematic review articles, and decision and cost-effectiveness analyses
had relatively similar impact; case reports received negligible citations.
Meta-analyses were cited significantly more often than all other designs after
adjusting for year of publication, high journal impact factor, and country
of origin. When the analysis was limited to studies addressing treatment effects, meta-analyses received more citations than randomized controlled trials.
Conclusion Overall, the citation impact of various study designs is commensurate
with most proposed hierarchies of evidence.