Editorial

Research on Peer Review and Biomedical Publication: Furthering the Quest to Improve the Quality of Reporting

Drummond Rennie, MD1; Annette Flanagin, RN, MA2
Author Affiliations:
1University of California, San Francisco
2JAMA, Chicago, Illinois
JAMA. 2014;311(10):1019-1020. doi:10.1001/jama.2014.1362.

This issue of JAMA includes 3 reports1-3 first presented at the Seventh International Congress on Peer Review and Biomedical Publication in September 2013.4 At the first congress, held in 1989, the most common topic of the presented abstracts was editorial peer review.5 Since then, the research presented and discussed has substantially broadened to include all aspects of biomedical publication—from research proposals to sharing data after publication.4

In this issue of JAMA, Malički and colleagues1 report their findings from an analysis of the 614 abstracts presented at the 7 congresses held from 1989 through 2013. Overall, 76% of these abstracts were observational studies, 16% were studies of interventions aimed at improving peer review and scientific reporting, and 8% were opinion papers. At the most recent congress, 27% of the 110 presented studies were interventional, including 5 randomized trials. The authors also found that 305 (61%) of the presentations from the first 6 congresses were eventually published and that 265 articles had received at least 1 citation (with a median of 20 citations per article), with the most-cited articles focusing on reporting guidelines, synthesis of evidence, and publication bias. Of the articles published after presentation at the first 6 congresses (N = 294), 36% reported being funded, whereas of the 110 abstracts presented at the 2013 congress, 41% reported being funded. The authors point out that interventional studies aimed at improving peer review and scientific reporting are still underrepresented, and, echoing previous calls for research for these congresses,6 they suggest that systematic approaches and funding schemes are still needed to further improve research into peer review and biomedical publication.

Two other reports from the 2013 congress provide important information about the quality of reporting results from clinical trials. Becker and colleagues2 conducted a cross-sectional analysis of clinical trials with primary results published in high-impact journals between July 2010 and June 2011 and compared trial information and results reported in ClinicalTrials.gov with those reported in the corresponding peer-reviewed publications. The authors found that 93 of 96 trials had at least 1 discrepancy, with the highest rates of discordance involving completion rates (22%) and trial interventions (16%). In addition, among 91 trials that described 156 primary efficacy end points, including 132 end points described in both sources, 21 trials (16%) had discordant end points and 30 end points (23%) could not be compared. These investigators suggest that further efforts are needed to ensure the accuracy of reporting results of clinical trials.

In another report, Kasenda and colleagues3 describe a multinational study of 894 clinical trials involving patients whose protocols were approved by 6 research ethics committees between 2000 and 2003, with follow-up through April 2013 to determine the prevalence of and reasons for trial discontinuation as well as publication status. The investigators found that 25% of all trials approved by the 6 research ethics committees were discontinued, with poor recruitment being the most common cause. Moreover, these discontinued trials often were not reported, and the ethics committees often were not informed.

The results of these 2 studies2,3 highlight the need for greater transparency and efficiency in the reporting of research. Moreover, these studies are reminiscent of demonstrations at the early congresses of major defects in reporting of research, which were soon followed by the development of influential guidelines to help correct these defects, led primarily by David Moher of Ottawa and Douglas G. Altman of Oxford. The first of these reporting guidelines to be successful was the Consolidated Standards of Reporting Trials (CONSORT),7 which has standardized the reporting of clinical trials and also has proved to be an excellent guide to establishing recommendations on the reporting of numerous other kinds of studies. Such guidelines have been adopted simply because the community of scientists was convinced by the weight of the evidence supporting their need.

In addition, the reports presented at the congresses and elsewhere8-10 confirmed the high prevalence of publication bias involving studies—most importantly, clinical trials—and, over the course of 2 decades, accumulated the evidentiary basis for an effort to persuade editors to support the establishment of registration of trials at inception.11 The success of ClinicalTrials.gov and other publicly available clinical trial registries has followed.

Ioannidis, another congress contributor, and his colleagues are continuing to produce evidence concerning the fragility of findings in clinical trials and the failure to replicate those trials.12 The National Institutes of Health is responding to the evidence that replication is poor,13 and editors who publish preclinical and basic science are beginning to voice the same concerns about problems with reproducibility expressed by editors of biomedical journals.14 There is renewed interest in the crucial but emotional, perhaps threatening, and technically challenging issue of sharing of data. The Institute of Medicine is seeking public comment on its Discussion Framework for Clinical Trial Data Sharing.15 Although these developments are encouraging, journals have not yet determined if and how they can assist in data sharing, let alone conduct review of the massive databases needed to assess such studies.

In addition to such promising initiatives, articles on how to improve research, of which publication is an integral part,16 are important reminders that no matter how much research on peer review and publication has been presented at the Peer Review Congresses and elsewhere, these studies are but part of a widespread movement to improve the scientific literature. As the reports in this issue of JAMA indicate, discovering the extent of the problems and testing methods to correct them will require a massive and prolonged effort on the part of researchers, funders, institutions, and journal editors. The International Congresses on Peer Review and Biomedical Publication will continue to promote and support these important efforts, and plans will soon be announced for the eighth congress to be held in 2017.4

ARTICLE INFORMATION

Editorials represent the opinions of the authors and JAMA and not those of the American Medical Association.

Corresponding Author: Annette Flanagin, RN, MA, JAMA, 330 N Wabash Ave, Chicago, IL 60611 (annette.flanagin@jamanetwork.org).

Conflict of Interest Disclosures: The authors have completed the ICMJE Form for Disclosure of Potential Conflicts of Interest and reported funding provided to support the Seventh International Congress on Peer Review and Biomedical Publication from the following: Elsevier, Wolters Kluwer Health/Lippincott Williams & Wilkins, eJournal Press, Highwire/Stanford University, McGraw-Hill Professional, New England Journal of Medicine, Silverchair Information Systems, Wiley, and Global Advances in Health and Medicine.

Additional Contributions: We acknowledge and thank all of the contributors to the 7 Peer Review Congresses and the support of our colleagues: Howard Bauchner and the editorial staff of JAMA, Fiona Godlee and Trish Groves of the BMJ, and Constance Murphy for assistance with the manuscript.

REFERENCES

1. Malički M, von Elm E, Marušić A. Study design, publication outcome, and funding of research presented at the International Congresses on Peer Review and Biomedical Publication. JAMA. doi:10.1001/jama.2014.143.
2. Becker JE, Krumholz HM, Ben-Josef G, Ross JS. Reporting of results in ClinicalTrials.gov and high-impact journals. JAMA. doi:10.1001/jama.2013.285634.
3. Kasenda B, von Elm E, You J, et al. Prevalence, characteristics, and publication of discontinued randomized trials. JAMA. doi:10.1001/jama.2014.1361.
4. International Congress on Peer Review and Biomedical Publication website. http://www.peerreviewcongress.org/index.html. Accessed February 8, 2014.
5. Rennie D. Guarding the guardians: research on editorial peer review: selected proceedings from the First International Congress on Peer Review in Biomedical Publication, May 10-12, 1989, Chicago, Ill. JAMA. 1990;263(10):1317-1441.
6. Rennie D. Guarding the guardians: a conference on editorial peer review. JAMA. 1986;256(17):2391-2392.
7. Begg C, Cho M, Eastwood S, et al. Improving the quality of reporting of randomized controlled trials: the CONSORT statement. JAMA. 1996;276(8):637-639.
8. Chalmers I. Underreporting research is scientific misconduct. JAMA. 1990;263(10):1405-1408.
9. Simes RJ. Publication bias: the case for an international registry of clinical trials. J Clin Oncol. 1986;4(10):1529-1541.
10. Dickersin K, Rennie D. Registering clinical trials. JAMA. 2003;290(4):516-523.
11. DeAngelis CD, Drazen JM, Frizelle FA, et al; International Committee of Medical Journal Editors. Clinical trial registration: a statement from the International Committee of Medical Journal Editors. JAMA. 2004;292(11):1363-1364.
12. Ioannidis JPA. Replication and reproducible research: utopia or reality? Lecture presented at: Seventh International Congress on Peer Review and Biomedical Publication; September 8, 2013; Chicago, IL.
13. Collins FS, Tabak LA. Policy: NIH plans to enhance reproducibility. Nature. 2014;505(7485):612-613.
14. Announcement: reducing our irreproducibility [published online April 25, 2013]. Nature. doi:10.1038/496398a.
15. Institute of Medicine. Discussion Framework for Clinical Trial Data Sharing: Guiding Principles, Elements, and Activities. Washington, DC: National Academies Press; 2014.
16. Macleod MR, Michie S, Roberts I, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383(9912):101-104.
