This issue of JAMA includes 3 reports1-3 first presented at the Seventh International Congress on Peer Review and Biomedical Publication in September 2013.4 At the first congress, held in 1989, the most common topic of the presented abstracts was editorial peer review.5 Since then, the research presented and discussed has substantially broadened to include all aspects of biomedical publication, from research proposals to sharing data after publication.4
In this issue of JAMA, Malički and colleagues1 report their findings from an analysis of the 614 abstracts presented at the 7 congresses held from 1989 through 2013. Overall, 76% of these abstracts were observational studies, 16% were studies of interventions aimed at improving peer review and scientific reporting, and 8% were opinion papers. At the most recent congress, 27% of the 110 presented studies were interventional, including 5 randomized trials. The authors also found that 305 (61%) of the presentations from the first 6 congresses were eventually published and that 265 of those articles had received at least 1 citation (median, 20 citations per article), with the most-cited articles focusing on reporting guidelines, synthesis of evidence, and publication bias. Of the articles published after presentation at the first 6 congresses (N = 294), 36% reported being funded, whereas of the 110 abstracts presented at the 2013 congress, 41% reported being funded. The authors point out that interventional studies aimed at improving peer review and scientific reporting remain underrepresented, and, echoing previous calls for research from these congresses,6 they suggest that systematic approaches and funding schemes are still needed to further improve research into peer review and biomedical publication.
Two other reports from the 2013 congress provide important information about the quality of reporting results from clinical trials. Becker and colleagues2 conducted a cross-sectional analysis of clinical trials with primary results published in high-impact journals between July 2010 and June 2011 and compared trial information and results reported in ClinicalTrials.gov with those reported in the corresponding peer-reviewed publications. The authors found that 93 of 96 trials had at least 1 discrepancy, with the highest rates of discordance involving completion rates (22%) and trial interventions (16%). In addition, among 91 trials that described 156 primary efficacy end points, including 132 end points described in both sources, 21 trials (16%) had discordant end points and 30 end points (23%) could not be compared. These investigators suggest that further efforts are needed to ensure the accuracy of reporting results of clinical trials.
In another report, Kasenda and colleagues3 describe a multinational study of 894 clinical trials involving patients whose protocols were approved by 6 research ethics committees between 2000 and 2003, with follow-up through April 2013, to determine the prevalence of and reasons for trial discontinuation and the publication status of discontinued trials. The investigators found that 25% of all trials approved by the 6 research ethics committees were discontinued, with poor recruitment being the most common cause. Moreover, these discontinued trials were often not reported, and the ethics committees often were not informed.
The results of these 2 studies2,3 highlight the need for greater transparency and efficiency in the reporting of research. Moreover, these studies are reminiscent of demonstrations at the early congresses of major defects in reporting of research, which were soon followed by the development of influential guidelines to help correct these defects, led primarily by David Moher of Ottawa and Douglas G. Altman of Oxford. The first of these reporting guidelines to be successful was the Consolidated Standards of Reporting Trials (CONSORT),7 which has standardized the reporting of clinical trials and also has proved to be an excellent guide to establishing recommendations on the reporting of numerous other kinds of studies. Such guidelines have been adopted simply because the community of scientists was convinced by the weight of the evidence supporting their need.
In addition, the reports presented at the congresses and elsewhere8-10 confirmed the high prevalence of publication bias involving studies—most importantly, clinical trials—and, over the course of 2 decades, accumulated the evidentiary basis for an effort to persuade editors to support the establishment of registration of trials at inception.11 The success of ClinicalTrials.gov and other publicly available clinical trial registries has followed.
Ioannidis, another congress contributor, and his colleagues are continuing to produce evidence concerning the fragility of findings in clinical trials and the frequent failure of trial findings to replicate.12 The National Institutes of Health is responding to the evidence that replication is poor,13 and editors who publish preclinical and basic science are beginning to express the same concerns about problems with reproducibility expressed by editors of biomedical journals.14 There is renewed interest in the crucial but emotional, perhaps threatening, and technically challenging issue of data sharing. The Institute of Medicine is seeking public comment on its Discussion Framework for Clinical Trial Data Sharing.15 Although these developments are encouraging, journals have not yet determined if and how they can assist in data sharing, let alone conduct review of the massive databases needed to assess such studies.
In addition to such promising initiatives, articles on how to improve research, of which publication is an integral part,16 are important reminders that no matter how much research on peer review and publication has been presented at the Peer Review Congresses and elsewhere, these studies are but part of a widespread movement to improve the scientific literature. As the reports in this issue of JAMA indicate, discovering the extent of the problems and testing methods to correct them will require a massive and prolonged effort on the part of researchers, funders, institutions, and journal editors. The International Congresses on Peer Review and Biomedical Publication will continue to promote and support these important efforts, and plans will soon be announced for the eighth congress to be held in 2017.4
Corresponding Author: Annette Flanagin, RN, MA, JAMA, 330 N Wabash Ave, Chicago, IL 60611 (email@example.com).
Conflict of Interest Disclosures: The authors have completed ICMJE Form for Disclosure of Potential Conflicts of Interest and reported funding provided to support the Seventh International Congress on Peer Review and Biomedical Publication from the following: Elsevier, Wolters Kluwer Health/Lippincott Williams & Wilkins, eJournal Press, Highwire/Stanford University, McGraw-Hill Professional, New England Journal of Medicine, Silverchair Information Systems, Wiley, and Global Advances in Health and Medicine.
Additional Contributions: We acknowledge and thank all of the contributors to the 7 Peer Review Congresses and the support of our colleagues: Howard Bauchner and the editorial staff of JAMA, Fiona Godlee and Trish Groves of the BMJ, and Constance Murphy for assistance with the manuscript.