Review

Tools for Direct Observation and Assessment of Clinical Skills of Medical Trainees A Systematic Review

Jennifer R. Kogan, MD; Eric S. Holmboe, MD; Karen E. Hauer, MD
JAMA. 2009;302(12):1316-1326. doi:10.1001/jama.2009.1365.
Context Direct observation of medical trainees with actual patients is important for performance-based clinical skills assessment. Multiple tools for direct observation are available, but their characteristics and outcomes have not been compared systematically.

Objectives To identify observation tools used to assess medical trainees' clinical skills with actual patients and to summarize the evidence of their validity and outcomes.

Data Sources Electronic literature search of PubMed, ERIC, CINAHL, and Web of Science for English-language articles published between 1965 and March 2009 and review of references from article bibliographies.

Study Selection Included studies described a tool designed for direct observation of medical trainees' clinical skills with actual patients by educational supervisors. Tools used only in simulated settings or assessing surgical/procedural skills were excluded. Of 10 672 citations, 199 articles were reviewed and 85 met inclusion criteria.

Data Extraction Two authors independently abstracted studies using a modified Best Evidence Medical Education coding form to inform judgment of key psychometric characteristics. Differences were reconciled by consensus.

Results A total of 55 tools were identified. Twenty-one tools were studied with students and 32 with residents or fellows. Two were used across the educational continuum. Most (n = 32) were developed for formative assessment. Rater training was described for 26 tools. Only 11 tools had validity evidence based on internal structure and relationship to other variables. Trainee or observer attitudes about the tool were the most commonly measured outcomes. Self-assessed changes in trainee knowledge, skills, or attitudes (n = 9) or objectively measured change in knowledge or skills (n = 5) were infrequently reported. The strongest validity evidence has been established for the Mini Clinical Evaluation Exercise (Mini-CEX).

Conclusion Although many tools are available for the direct observation of clinical skills, validity evidence and description of educational outcomes are scarce.

Figure. Literature Search and Article Selection Process


