ARTICLE

Watching the Doctor-Watchers: How Well Do Peer Review Organization Methods Detect Hospital Care Quality Problems?

Haya R. Rubin, MD, PhD; William H. Rogers, PhD; Katherine L. Kahn, MD; Lisa V. Rubenstein, MD; Robert H. Brook, MD, ScD
JAMA. 1992;267(17):2349-2354. doi:10.1001/jama.1992.03480170075032.

Objective.  —To determine how well one state's peer review organization (PRO) judged the quality of hospital care compared with an independent, credible judgment of quality of care.

Design.  —Retrospective study comparing a PRO's review, including initial screening, physician review, and final judgments, with an independent "study judgment" based on blinded, structured, implicit reviews of hospital records.

Setting.  —One state's medical and surgical Medicare hospitalizations during 1985 through 1987 audited randomly by the state's PRO.

Sample.  —Stratified random sampling of records: 62 records that passed the PRO initial screening process and were not referred for PRO physician review; 50 records that failed the PRO screen and were then confirmed by PRO physicians to be "quality problems."

Intervention.  —None.

Main Outcome Measure.  —A study judgment of "below standard" or "standard or above," based on the mean of overall ratings by five internists for records in medical diagnosis related groups (DRGs) and by five internists and five surgeons for surgical DRGs. Each step in the PRO review was evaluated for how many records passing or failing that step were judged "standard or above" or "below standard" in the study (positive and negative predictive value) and how well that step classified records that the study judged "below standard" or "standard or above" (sensitivity and specificity).
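
The four summary statistics named above all follow from cross-classifying each PRO review step against the study judgment in a 2x2 table. The Python sketch below shows one way such metrics would be computed; the function name and the counts in the usage line are hypothetical illustrations, not data from the article.

# Illustrative sketch (not from the article): evaluation metrics for one PRO
# review step, computed from a 2x2 cross-classification of the step's decision
# (fail vs. pass) against the study judgment (below standard vs. standard or above).
def review_step_metrics(fail_below, fail_standard, pass_below, pass_standard):
    """fail_below:    failed by the step AND judged below standard by the study
    fail_standard: failed by the step but judged standard or above
    pass_below:    passed by the step but judged below standard
    pass_standard: passed by the step AND judged standard or above
    """
    sensitivity = fail_below / (fail_below + pass_below)            # share of below-standard records the step catches
    specificity = pass_standard / (pass_standard + fail_standard)   # share of acceptable records the step passes
    ppv = fail_below / (fail_below + fail_standard)                 # failed records that are truly below standard
    npv = pass_standard / (pass_standard + pass_below)              # passed records that are truly standard or above
    return sensitivity, specificity, ppv, npv

# Hypothetical counts, purely for illustration:
print(review_step_metrics(fail_below=10, fail_standard=40, pass_below=8, pass_standard=54))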

Results.  —An estimated 18% of records reviewed by the PRO were below standard according to the study judgment, compared with 6.3% identified as quality problems by the PRO's final judgment (difference, 12%; 95% confidence interval, 1 to 23). The PRO's initial screening process failed to detect and refer for PRO physician review two of three records that the study judged below standard. In addition, only one in three of the records that PRO physicians judged to be quality problems was judged below standard by the study. Therefore, the PRO's final quality of care judgment and the study judgment agreed little more than expected by chance, especially about poor quality of care. Although the PRO correctly classified 95% of the records that the study judged standard or above, it detected only 11% of records judged below standard by the study.
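
The statement that the two judgments "agreed little more than expected by chance" refers to chance-corrected agreement. The abstract does not name the statistic used; Cohen's kappa is the standard choice for two binary judgments, and the sketch below shows how it would be computed from a 2x2 agreement table. The counts in the usage line are hypothetical and chosen only to illustrate near-chance agreement.

# Illustrative sketch: Cohen's kappa for agreement between the PRO's final
# judgment and the study judgment (the article's exact statistic is not
# specified in the abstract).
def cohens_kappa(a, b, c, d):
    """a: both judge below standard          b: PRO only judges below standard
    c: study only judges below standard   d: both judge standard or above
    """
    n = a + b + c + d
    observed = (a + d) / n
    # Agreement expected if the two judgments were statistically independent
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical counts, purely for illustration:
print(round(cohens_kappa(a=4, b=8, c=20, d=80), 2))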

Conclusions.  —Most of all, this PRO review process would be improved by additional preliminary screens to identify the 67% of records that the study judged below standard but that passed its initial screening. The screening process also must be more accurate in order to be cost-effective, as it was only slightly better than random sampling at correctly identifying below standard care. More reproducible physician review is also needed and might be accomplished through improved reviewer selection and training, a structured review method, and more physician reviewers per record.
