ARTICLE

Do Readers and Peer Reviewers Agree on Manuscript Quality?

Amy C. Justice, MD; Jesse A. Berlin, ScD; Suzanne W. Fletcher, MD; Robert H. Fletcher, MD; Steven N. Goodman, MD, PhD
JAMA. 1994;272(2):117-119. doi:10.1001/jama.1994.03520020043011.

Objective.  —To study readers' judgments of manuscript quality and the degree to which readers agreed with peer reviewers.

Design.  —Cross-sectional study.

Setting.  —Annals of Internal Medicine.

Subjects.  —One hundred thirteen consecutive manuscripts reporting original research and selected for publication. Each of two manuscript versions (one before and one after revision) was judged by two readers, randomly sampled from those who said (based on the title) that they would read the article; one peer reviewer (peer), chosen in the usual way for Annals; and one expert in clinical research methods (expert). Each judge completed an instrument that included a 10-point subjective summary grade of manuscript quality.

Main Outcome Measures.  —Agreement on the 10-point summary grade of manuscript quality between reader-expert, reader-peer, and reader-reader.

Results.  —Readers and peers gave high grades (77% and 73% gave a grade of 5 or better, respectively), while experts were more critical (52% gave a grade of 5 or better; P<.0001). Agreement was relatively high among judge groups (in all cases, >69%) but agreement beyond chance was poor (κ<0.04). One third of readers (33%) thought that the manuscript had little relevance to their work.

Conclusion.  —Readers, like most peer reviewers, are generally satisfied with the quality of manuscripts but would like research articles to be more relevant to their clinical practice. (JAMA. 1994;272:117-119)
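
The Results illustrate a common statistical pattern: raw agreement can look high even though chance-corrected agreement (Cohen's κ) is near zero, because most judges cluster on the same favorable grade. The sketch below is a minimal illustration with made-up counts (not the study's data), assuming grades are dichotomized at 5 or better; the abstract does not specify how agreement was computed, so this is only meant to show how observed agreement above 80% is compatible with κ ≈ 0.

```python
# Illustrative only: hypothetical paired ratings, dichotomized as "good" (grade >= 5)
# versus "poor", with counts chosen so that most judges give favorable grades.
from collections import Counter

reader = ["good"] * 81 + ["good"] * 9 + ["poor"] * 9 + ["poor"] * 1   # 90 good, 10 poor
peer   = ["good"] * 81 + ["poor"] * 9 + ["good"] * 9 + ["poor"] * 1   # 90 good, 10 poor

n = len(reader)
observed = sum(r == p for r, p in zip(reader, peer)) / n  # raw (percent) agreement

# Agreement expected by chance alone, from each judge's marginal frequencies.
reader_freq = Counter(reader)
peer_freq = Counter(peer)
expected = sum((reader_freq[c] / n) * (peer_freq[c] / n)
               for c in set(reader) | set(peer))

kappa = (observed - expected) / (1 - expected)  # Cohen's kappa
print(f"observed agreement = {observed:.2f}")   # 0.82
print(f"chance-expected    = {expected:.2f}")   # 0.82
print(f"kappa              = {kappa:.2f}")      # 0.00
```

With judgments this skewed toward favorable grades, nearly all of the observed agreement is attributable to chance alone, which is how the study can report both agreement above 69% and κ below 0.04.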
