msJAMA

High-Stakes Medical Performance Testing: The Clinical Skills Assessment Program

Gerald Whelan, MD
JAMA. 2000;283(13):1748. doi:10.1001/jama.283.13.1748-JMS0405-4-1.

On July 1, 1998, the Educational Commission for Foreign Medical Graduates (ECFMG) implemented its Clinical Skills Assessment (CSA) as a new requirement for graduates of foreign medical schools seeking certification for entry into an accredited residency training program in the United States. The CSA is a day-long practical examination designed to assess graduates' ability to gather and interpret clinical data and to communicate effectively with patients and health professionals in English.

From the start, the CSA was conceived as only 1 of several assessment elements leading to certification by the ECFMG. Others include passing scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2, completion of an English comprehension test, and graduation from a medical school listed in the World Health Organization directory.

The CSA developed as a result of widespread concern among medical educators that basic clinical skills, including medical history-taking, physical examination, and doctor-patient communication, were not being adequately addressed in undergraduate medical education.1 Within the United States, the Liaison Committee on Medical Education (LCME) responded to this concern by incorporating requirements on the teaching and assessment of basic clinical skills into its criteria for accreditation of US schools.

However, no international equivalent of the LCME was available to develop similar curriculum requirements for medical schools outside of the United States. Educators were concerned that medical schools around the world were variable in quality and not always subject to outside review.2 Since neither the ECFMG nor any other agency could directly impact schools' curricula, the CSA was developed as a tool for evaluating educational outcomes.

To develop a process that yielded a reliable and fair assessment of clinical skills, a number of preliminary studies were undertaken under the auspices of the ECFMG. Two of the larger-scale studies involved both US medical students in their third and fourth years and graduates of foreign medical schools at varying levels of certification. The first, conducted in Baltimore, Md, involved a consortium of 6 schools; the second, conducted in Philadelphia, Pa, involved 2 local schools.

The introduction of this assessment is a milestone in US medical evaluation in that it is the first time basic clinical skills, including interpersonal skills and spoken English proficiency, are being assessed in a high-stakes environment. To ensure that the test is administered in a standardized manner, the CSA is offered only at the ECFMG headquarters in Philadelphia. The CSA uses standardized patients (SPs), many of whom are actors, to portray cases to the candidates and to score the encounters. SPs are recruited not for the presence of physical findings but for their ability to portray cases and score accurately; each receives 15 to 20 hours of training on assessing interpersonal skills and English proficiency, plus 6 hours of training for each case he or she portrays.

The ECFMG has now had experience with 8383 candidates who presented for assessment at its center from July 1, 1998, through January 31, 2000. While the test is administered throughout the year, results are standardized through analysis and equating of the mean scores received on each case-SP combination. Analysis shows that 96.9% of these candidates received a passing decision. (The CSA reports only pass/fail decisions, not numerical scores.)

Evidence suggests that this high pass rate reflects considerable self-selection on the part of the certification candidates. In questionnaires distributed after the examination, 80% of candidates reported making special preparations, including clinical observerships in the United States. Candidates also reported prescreening their spoken English proficiency by taking the Test of Spoken English, and only 3 reported having scored below the level suggested in the CSA orientation manual as a prudent cutoff. By contrast, the trial runs used to set CSA examination standards involved test takers who had no investment in the results.

Successful candidates must meet or exceed standards for 2 separate components of the CSA: an Integrated Clinical Encounter (ICE) component assessing the ability to gather data and compose a clinical note, and a Doctor-Patient Communications (COM) component based on spoken English proficiency and interpersonal skills. To date, 80% of those failing the CSA have done so on the COM component. Of the cohort described, 92% reported that the SP portrayals in the test were realistic, and 90% felt that the SP format was appropriate.

The complaint most widely voiced by graduates of foreign medical schools regarding the CSA is that it is unfair for US medical graduates to be exempt from taking a similar national assessment examination. The National Board of Medical Examiners is currently piloting a Standardized Patient Examination for possible incorporation into the USMLE series; if adopted, it would be open to graduates of foreign schools in lieu of a separate CSA.

REFERENCES

1. Sutnick AI, Stillman PL, Norcini JJ, et al. ECFMG assessment of clinical competence of graduates of foreign medical schools. JAMA. 1993;270:1041-1045.
2. Institute of Medicine. The Nation's Physician Workforce: Options for Balancing Supply and Requirements. Washington, DC: National Academy Press; 1996.

