Review

Impact of Formal Continuing Medical Education: Do Conferences, Workshops, Rounds, and Other Traditional Continuing Education Activities Change Physician Behavior or Health Care Outcomes?

Dave Davis, MD; Mary Ann Thomson O'Brien, MSc; Nick Freemantle, PhD; Fredric M. Wolf, PhD; Paul Mazmanian, PhD; Anne Taylor-Vaisey, MLS

Author Affiliations: Continuing Education and the Centre for Research in Education, University of Toronto, Faculty of Medicine, Toronto, Ontario (Dr Davis and Ms Taylor-Vaisey); School of Rehabilitation Science, McMaster University, and Social and Public Health Services, Region of Hamilton-Wentworth, Hamilton, Ontario (Ms O'Brien); Medicines Evaluation Group, Centre for Health Economics, University of York, York, England (Dr Freemantle); Department of Medical Education, University of Washington, Seattle (Dr Wolf); and Medical College of Virginia Campus, Virginia Commonwealth University, Richmond (Dr Mazmanian).


JAMA. 1999;282(9):867-874. doi:10.1001/jama.282.9.867.

Context Although physicians report spending a considerable amount of time in continuing medical education (CME) activities, studies have shown a sizable difference between real and ideal performance, suggesting a lack of effect of formal CME.

Objective To review, collate, and interpret the effect of formal CME interventions on physician performance and health care outcomes.

Data Sources Sources included searches of the complete Research and Development Resource Base in Continuing Medical Education and the Specialised Register of the Cochrane Effective Practice and Organisation of Care Group, supplemented by searches of MEDLINE from 1993 to January 1999.

Study Selection Studies were included in the analyses if they were randomized controlled trials of formal didactic and/or interactive CME interventions (conferences, courses, rounds, meetings, symposia, lectures, and other formats) in which at least 50% of the participants were practicing physicians. Fourteen of 64 studies identified met these criteria and were included in the analyses. Articles were reviewed independently by 3 of the authors.

Data Extraction Determinations were made about the nature of the CME intervention (didactic, interactive, or mixed), its occurrence as a 1-time or sequenced event, and other information about its educational content and format. Two of 3 reviewers independently applied all inclusion/exclusion criteria. Data were then subjected to meta-analytic techniques.

Data Synthesis The 14 studies generated 17 interventions fitting our criteria. Nine generated positive changes in professional practice, and 3 of 4 interventions altered health care outcomes in 1 or more measures. In 7 studies, sufficient data were available for effect sizes to be calculated; overall, no significant effect of these educational methods was detected (standardized effect size, 0.34; 95% confidence interval [CI], −0.22 to 0.97). However, interactive and mixed educational sessions were associated with a significant effect on practice (standardized effect size, 0.67; 95% CI, 0.01-1.45).

Conclusions Our data show some evidence that interactive CME sessions that enhance participant activity and provide the opportunity to practice skills can effect change in professional practice and, on occasion, health care outcomes. Based on a small number of well-conducted trials, didactic sessions do not appear to be effective in changing physician performance.

Two apparently conflicting pieces of evidence exist about physicians' continuing medical education (CME). Physicians report spending, on average (and among other activities), 50 hours per year in CME activities,1-3 ostensibly geared toward improving their performance and/or optimizing the outcomes of their patients. In addition, producing and accrediting formal, planned CME events and activities constitute a large enterprise, especially in the United States,4,5 intended to bring physicians up-to-date with rapidly expanding medical information. Patterned after undergraduate medical education consisting of lectures, audiovisual presentations, and printed materials, CME activities appear underpinned by a belief that gains in knowledge lead physicians to improve how they practice and thus improve patient outcomes. Despite this belief and the level of participation in and resources for CME, many studies have demonstrated a lack of physician adherence to current practice guidelines6,7 or sizable gaps between real and ideal performance.8,9 In addition, a relatively weak effect of formal, planned CME on physician performance has been demonstrated in 2 of our previous reviews.10,11

Prompted by such evidence, this study (one of a series of reviews of the educational literature and its effect on practicing health professionals10,12-14) focuses on the effect of formal CME. This form of continuing education is highly variable, ranging from passive, didactic, large-group presentations to highly interactive learning methods, such as workshops, small groups, and individualized training sessions. Examples of such educational activities include rounds, educational meetings, conferences, refresher courses, programs, seminars, lectures, workshops, and symposia. Given the wide variability of formal CME formats, and the weight given by adult education theorists to interactive learning,15-18 our objective in this review was to answer these questions: Overall, is formal CME effective? Under what conditions is formal CME effective? What is particularly effective within formal CME in changing physician performance or health care outcomes?

Search Strategy

Building on previous work and methods, we searched the Research and Development Resource Base in CME,19 a bibliographic database of continuing health professional education literature (Continuing Education, Faculty of Medicine, University of Toronto, Toronto, Ontario). Regular searches of MEDLINE, CINAHL, ERIC, EMBASE, and PsycInfo form the repository of all references cited in this and previous reviews. To update our previous MEDLINE search, 2 authors separately searched from January 1993 (the concluding date of our last major review-focused search10) to January 31, 1999. We combined the medical subject terms randomized controlled trials and random allocation with randomized controlled trial as both a publication type and text words. We then added to the strategy medical subject terms such as education, medical, continuing; education, continuing; and education, as well as variations of the following text words: lectures, rounds, seminars, meetings, symposia, conferences, courses, workshops, and small groups. We also searched the Specialised Register of the Cochrane Effective Practice and Organisation of Care Group.20 The reference lists and abstracts of all articles that met initial criteria were reviewed independently by 3 authors (D.D., M.A.T.O., A.T.-V.).

Selection Criteria

We included only those studies that met the following criteria: randomized controlled trials of formal educational interventions (such as conferences, rounds, meetings, symposia, and individualized training sessions) using only educational activities meant to be persuasive rather than those that coerced or provided incentives to the learner-participant; objective determination of health professional performance in the workplace and/or determinations of health care outcomes; and more than 50% of the participants were practicing physicians. Health care outcomes included patient behavior outcomes, such as adherence to medication or smoking cessation rates. Although we recognize that a myriad of complex factors influence patient behavior, we included these outcomes if reported in the primary studies. For purposes of comparison, we included only those formal CME activities that were didactic and/or may have used interactive educational techniques. We excluded interventions that deployed postcourse reminders, audit, and feedback or other measures to change physician performance and studies that included more than 50% nonphysician health professionals or residents, given the marked difference in these populations' regulations, work environments, and expectations.

We developed and applied the following categorization of educational interventions. Didactic sessions were defined as predominantly lectures or presentations with minimal audience interaction or discussion. Interactive sessions included those using techniques to enhance physician participation, such as role-play, discussion groups, hands-on training, problem solving, or case solving. These interventions were termed mixed if they specified the use of both interactive and didactic methods. We classified the interventions as single or in a series if content was delivered on more than 1 occasion.

We also noted, where possible, the size of the groups educated, the presence of needs assessment techniques, and the inclusion of enabling elements within the session, such as those that would assist the physician-learner to make changes in his/her practice environment (eg, patient education materials, flow charts, reminders, and protocols). Further, we classified the outcomes as positive if 1 or more primary outcome measures related to physician performance or patient health care demonstrated a statistically significant change and negative if no such change occurred. Finally, 2 of 3 reviewers (M.A.T.O., N.F., D.D.) independently applied all inclusion and exclusion criteria using specific standards for each study. Once extracted, data were subjected to meta-analytic techniques.

Statistical Analysis

Where available, data were reported and a standardized effect size was estimated for each study group. The standardized effect size is the difference in means divided by the square root of the pooled-group variances. Transforming differences in means to standardized scores enables the comparison of effects in different outcomes on a common dimensionless scale. We used the full random effects approach described by Smith et al,21 which takes into account uncertainty in the distribution of observed effects and works well with sparse data. This approach also enabled us to estimate the extent to which the presence of some interactive element in a teaching intervention predicted an improved outcome.
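The per-study calculation described above can be sketched in a few lines. This is an illustrative reconstruction with invented numbers, not the authors' code or data; the full Bayesian random-effects model of Smith et al is considerably more involved than this single-study step.

```python
from math import sqrt

def standardized_effect_size(mean_tx, mean_ctl, sd_tx, sd_ctl, n_tx, n_ctl):
    # Difference in means divided by the square root of the pooled-group
    # variance, weighting each group's variance by its degrees of freedom.
    pooled_var = ((n_tx - 1) * sd_tx ** 2 + (n_ctl - 1) * sd_ctl ** 2) / (n_tx + n_ctl - 2)
    return (mean_tx - mean_ctl) / sqrt(pooled_var)

# Hypothetical trial: CME group vs control on some performance score
d = standardized_effect_size(mean_tx=72.0, mean_ctl=65.0,
                             sd_tx=10.0, sd_ctl=11.0,
                             n_tx=40, n_ctl=40)
print(round(d, 2))  # ~0.67 for these invented numbers
```

Because the result is dimensionless, effect sizes computed this way from trials with different outcome measures can be placed on a common scale before pooling.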

Literature Search

Fourteen studies met our inclusion criteria22-35 (Table 1). These were derived from a larger pool of studies retrieved from the Cochrane Effective Practice and Organisation of Care36 module (which yielded 39 studies, of which 13 met our inclusion criteria) and MEDLINE (which yielded 25 potentially relevant studies, of which 1 met our criteria).

Table 1. Description of Studies Included
Description of Studies

All trials studied the effect of formal, planned CME on American,23-25,28-34 Canadian,26,27,35 or French22 physicians. Of these, 1 study reported the effect of formal CME interventions on internists,31 5 on family physicians/general practitioners,22,26-28,35 and 2 on pediatricians.24,30 Six studies targeted outcomes from a mixture of these physicians.23,25,29,32,33,35 Clinical dimensions of care addressed by the studies included prevention and screening,22,23,25,27 disease management,24,26,34 counseling or communication skills,29-31,33,35 smoking cessation,28 and, in one instance, manual skills.32

Quality of Studies

Using previously published criteria for assessing the quality of randomized controlled trials,36 we determined that, of the 14 trials reviewed, only one23 described adequate concealment of the randomization allocation in the published report, and 10 trials documented adequate follow-up. Outcomes were assessed blindly in only 7 studies.

Overall Effect of Interventions

Within the 14 studies, 24 separate interventions were tested; several studies used more than 1 intervention (Table 2). Of these, we report the outcomes of only 17 interventions; 5 did not conform to our definition of formal CME, 1 intervention itself was didactic and was used as a usual care control,26 and, in a component of 1 trial, physicians were not randomized to a longitudinal formal CME intervention.29 Of these 17 comparisons, the effect on physician performance was measured in all instances and on both performance and health care outcomes in 4 studies. Many of these interventions were positive: 9 of 17 had an impact on 1 or more measures related to physician performance, and 3 of the 4 that measured health care outcomes altered them.

Table 2. Studies of the Impact of CME vs a Control Group by Intervention Type and Intensity
Effect of Didactic CME

Three interventions were found that used didactic measures to carry their educational message. One study focused on screening for cholesterol,23 1 on family practice topics,26 and a third study22 attempted to improve screening techniques for breast and cervical cancer. Heale et al26 provided family practice topics by mostly large-group, didactic, case-based presentations, comparing this method with a didactic, traditional, topic-oriented control. Overall, none of these interventions altered physician performance.

Effect of Interactive and Mixed CME

We found 6 interventions that described only interactive techniques.24-27,32 Of these, 2 were single interventions25,26 and 4 used sequenced sessions.24,27,32 All 6 interventions measured the impact on physician performance, and all but 2 demonstrated a significant impact. These 2 were Heale et al,26 using small group discussions of family practice topics, and Dietrich et al,25 who demonstrated a change in only 1 of 10 outcome measures following an interactive CME session. Clark et al,24 for example, studied the effect of two 2½-hour sessions in pediatric care that used interactive techniques, a video, and case studies over a 2- to 3-week period. Clark et al conducted parent interviews that demonstrated a positive effect on both physician performance and child health.

Seven interventions were described as being mixed (ie, including both didactic and interactive elements).28-31,33-35 All 7 measured their impact on physician performance; of these, 5 were positive.25,28,30,33,35 Three measured the impact on health care outcomes,28,30,33 of which 2 demonstrated positive changes.30,33 Roter et al,33 for example, used two 4-hour sessions that included didactic elements, roundtable discussions, and interactive presentations to improve physician communication skills. Positive changes were demonstrated by audiotaped analysis of patient interviews.

Effect of Single vs Multiple or Longitudinal Interventions

Seven interventions were single in nature, held in 1 period, and ranged from 2 hours34 to 6 hours.26 All 7 measured their impact on physician performance, and 2 were deemed positive.25,34 In contrast, there were 8 interventions that used 2 sessions in a series24,28,30-33,35 and 1 that offered a series of formal CME interventions, including small groups and teleconferences,27 in 2 separate topic-based interventions. These ranged from 2 hours35 to a total of 48 hours.27 Of these 10 interventions, seven24,27,28,30,32,33,35 demonstrated positive outcomes on physician performance; 3 of 4 demonstrated a similar impact on health care outcomes.24,30,33

Effect of Other Variables

Group size varied in these studies. One trial studied the effect of individualized training sessions,32 while 3 used small groups of fewer than 10 individuals.26,29,31 Moderate-sized groups, from 10 to 19 participants, were reported by 6 authors23-25,27,30,35 and 3 interventions22,26,34 used groups of 20 or more. No relationship between group size and positive outcomes was noted. Four studies22,24,32,34 reported the development of the intervention following a needs assessment survey. One study27 used a 6-step process to identify and evaluate the learning needs of participating physicians. Of these 5 studies, only one22 generated no change in physician performance or health care outcomes. Several studies used enabling methods, such as patient education materials23,28,31,35 delivered in the context of the formal CME intervention; of these, three28,31,35 demonstrated change in physician performance.

Finally, length of time to outcome assessment was reviewed in each of the studies (Table 1), ranging from 1 month29 to 24 months31 after the intervention. Those studies that effected a positive change in physician performance varied in the length of time to outcome from 1.5 months35 to 22 months,24 while those that effected no such change ranged from 1 month29 to 24 months.31 Of the 5 studies that analyzed patient outcomes, those that generated negative results assessed outcomes at 12 months28 and 18 months.23 In contrast, the 3 studies that effected positive changes included those by Maiman et al30 and Roter et al,33 each at 6 months, and Clark et al24 at 22 months after intervention.

Quantitative Analysis

Only 7 trials provided data for quantitative pooling.22,23,25,28,29,34,35 Overall, the pooled, standardized weighted mean difference from these studies was 0.34 (95% confidence interval [CI], −0.22 to 0.97), indicating a nonsignificant overall benefit from formal CME. However, when a random effects factor was added to the model, describing the effects of sessions that included an interactive element, this was associated with a significant positive effect (standardized weighted mean difference, 0.67 [95% CI, 0.01-1.45]).
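A positive pooled estimate can still be nonsignificant once between-study heterogeneity is taken into account. The sketch below illustrates this with the simpler frequentist DerSimonian-Laird random-effects model; the review itself used the fully Bayesian approach of Smith et al, and the per-study effects and variances here are invented, not the article's data.

```python
from math import sqrt

def dersimonian_laird(effects, variances):
    # Pool per-study standardized effect sizes with a DerSimonian-Laird
    # random-effects model: inverse-variance weights, a method-of-moments
    # estimate of between-study variance (tau^2), then reweighting.
    w = [1.0 / v for v in variances]
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    se = sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study effects and variances (not the review's data)
pooled, ci = dersimonian_laird([0.9, -0.1, 0.6, 0.1], [0.04, 0.05, 0.06, 0.05])
# pooled is positive, but the 95% CI spans zero
```

With these made-up inputs the heterogeneous study results inflate the between-study variance, widening the confidence interval until it crosses zero even though the point estimate is positive, much as the review's overall estimate of 0.34 was nonsignificant.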

Designing Effective Formal CME: Major Variables

The use of traditional CME activities such as lectures has been widely criticized.37 This criticism appears justified: the didactic interventions analyzed in this review failed to change performance or health care outcomes. While such interventions may change other elements of competence, such as knowledge, skills, or attitudes, or may act as predisposing elements to change, didactic lectures by themselves do not play a significant role in immediately changing physician performance or improving patient care.

In contrast, studies that used interactive techniques such as case discussion, role-play, or hands-on practice sessions were generally more effective in changing those outcomes documented in this review. Sessions that were sequenced also appeared to have more impact. Both of these findings match closely those principles promoted by adult educators,17,38 who describe successful adult education as learner-centered, active rather than passive, relevant to the learner's needs, engaging, and reinforcing—characteristics of CME interventions more frequently found in the interactive rather than the passive educational setting. Further, the learn-work-learn opportunities afforded by sequenced sessions, in which education may be translated into practice and reinforced (or discussed) at a further session, may explain the success of sequenced interventions. Communication theory39 suggests that communication sustained over time may enable ideas to converge across gaps such as those that exist between CME teachers and learner-participants.

Other factors within the purview of the CME provider may also affect the impact of the formal CME intervention. The addition of methods we have termed enabling (eg, patient education materials), which may facilitate adapting to changes in the practice site, appear to be effective in the few trials in this review. These enabling agents conform in general terms to our previous conclusions14 about the success of multiple interventions, to the broader understanding of the complexity of the environment in which the change is to occur,40 and to the promotion of health literature.41

Other Variables Affecting the Impact of Formal CME

Other variables appear effective but fall outside the traditional domain of CME providers. The first cluster of these may be termed external, relating to the practice environment, such as the use of after-course office management systems in which nurse-facilitators provide onsite, practice-based suggestions. Such variables also include the use of objective needs assessment methods, which appear to be precursors of effective CME interventions.10 Both these variables speak to the need for collaboration between CME providers and the practice sector and their data sources, exemplifying, from an educational perspective, the new paradigm for CME,42 or from a health service perspective, Berwick's model of quality improvement.43 Such collaboration would appear to provide an environment in which clinical changes could be sustained over time.

The second cluster relates to the internal or intrapersonal aspects of the physician-participant. In this context, it is important to note that what was measured in this review was performance change, not learning. An evolving body of knowledge suggests that physicians attend formal CME events with varying levels of motivation to change, and that the level of their commitment to change may supersede both the immediate clinical value of the information and the method by which it was delivered, as predictors of change in performance.44,45 Additionally, the interaction between members of groups in some of these studies may influence individuals' learning and change,46 perhaps by producing a level of cognitive dissonance between what peers know and do compared with the learner.47 Finally, physicians appear to develop their own learning priorities based on external and internal forces: the CME course or conference may be just one of many such forces.48

Limitations

We offer several cautions about the interpretation of these results. First, as in all such reviews, publication bias may generate a biased sample of studies, increasing the likelihood of finding a false-positive result. This bias may also include the selective publication of data within trials, with nonsignificant results reported only as P>.05.49 Second, the process of study selection, data extraction, and estimation of trial quality, although performed by 3 independent reviewers, may be flawed. For example, our descriptions of interactive or didactic elements reflect the reporting authors' description of the degree of interaction, not necessarily the learning process. Third, the limited number of randomized controlled trials and settings may undermine the generalizability of these findings. However, the fact that this small number of studies identified a significant result may imply a large and fairly consistent effect from interactive educational methods. This latter point is especially important, given that we used a conservative method for random effects meta-analysis, which takes into account uncertainty in the distribution of the effects of studies.21 Fourth, these trials were, for the most part, all performed with primary care physicians, who permitted an examination of their performance data or the health outcomes of their patients, who were volunteers, and who (in some trials) engaged in sequenced CME activities; as such, they may not represent the target population of most CME activities. That is, these studies may be viewed more as efficacy studies conducted under better than normal conditions than as effectiveness studies of everyday CME. Finally, the degree to which this review may be termed a meta-analysis and the extent to which pooled effect sizes may be reported here depends to a large extent on the comparability and likeness of these CME interventions, an issue that continues to be debated.

Implications for Research

As in previous reviews, we noted the relatively narrow range of clinical areas studied by these trials. While screening, smoking cessation, and counseling or communication skills are important clinical topic areas and key ingredients of primary care, many issues that involve the complex management of patients in surgery, internal medicine, psychiatry, and other areas of specialty or subspecialty interest were not covered in the studies located. It appears to us that the setting of both the learning experience and the practicing physician, particularly in an era of increasing managed care, warrant careful study. An area ripe for investigation appears to be the changing demographics of the physician population and gender mix, the increasing numbers of graduates from problem-based schools and primary care training, and the inclination toward and skills required for self-directed lifelong learning.38

We provided several comments about the methods and quality of trials of CME interventions. When investigators pursue controlled trial methods, we recommend that they, along with journal editors and referees, adhere more closely to the CONSORT50 recommendations for reporting randomized controlled trials. In particular, it is important that investigators choose a suitable design51 and an appropriate unit of analysis.52,53 Although these trials produce quantifiable end points, they cannot fully explain why change does or does not occur as a result of CME participation.54 Qualitative methods such as those used by Fox et al48 need to be applied, and tools such as those evolving from the commitment to change model44,55 used.

While these studies evaluated the impact of the CME intervention on physician performance or health care outcomes, we recognized that these measures are more distal to the intervention compared with the more proximal and easier-to-measure components of competence: knowledge, skills, and attitudes. Acknowledging the diminished impact of CME along this continuum (competence to performance to health care outcomes), we urge further research into those factors that accelerate or impede translation from one domain to another.

Conclusion

The ultimate effect of formal CME interventions on the practice of physicians and the health of their patients (as in the case of any intervention) must be understood in the context of the methods by which the CME is delivered, including but not limited to the nature of the interaction and the quality of the enabling resources where available, the environment in which the translated competence is played out, and the complex intrapersonal, interpersonal, and professional educational variables that affect the physician-learner. Despite this complexity and the cautions one should bear in mind when interpreting these trials, we conclude that where performance change is the immediate goal of a CME activity, the exclusively didactic CME modality has little or no role to play. Knowledge is clearly necessary but not in and of itself sufficient to bring about change in physician behavior and patient outcomes. Such didactic interventions should—as they do in the Canadian Maintenance of Competence Program system of the Royal College of Physicians and Surgeons of Canada56—receive less credit than do more effective methods and perhaps no credit. In contrast, variables over which the CME provider has control and that appear to have a positive effect are the degree of active learning opportunities, learning delivered in a longitudinal or sequenced manner, and the provision of enabling methods to facilitate implementation in the practice setting.

While numerous questions remain regarding formal CME, including group size, the role of the learning and practice environment, the clinical dimensions of care, the assessment of learner needs, and barriers to change, 1 large question remains. In the face of longstanding knowledge about adult, self-directed learning and the general disinclination to believe that didactic CME works, now coupled with the findings of this review, why would the medical profession persist in delivering such a product and accrediting its consumption? The reasons for the persistence of didactic CME include (but are definitely not limited to) the ease of designing and providing such activities, the substantial pharmaceutical sponsorship that promotes the transfer of information about new medications, and the dependence on traditional undergraduate models of education that are easy to mount and revenue generating.

Changing this delivery system carries serious implications for several groups of stakeholders that want to design and deliver effective CME. First, medical licensing boards and others with a genuine interest in assuring the public of physician competence must rethink the value of the CME credit system, including the American Medical Association's Physician Recognition Award57 category I credit, the most visible US CME currency exchanged for the privilege of practicing medicine. Second, medical schools, specialty societies, and other providers of CME must reconsider the value of the credit they provide, as well as the type and duration of learning activities they produce. Third, the Accreditation Council for CME in the United States and other organizations intending to ensure the quality of CME must evaluate the services they provide to a large, complex, and expensive CME enterprise that values the production of single-session, teacher-centered activities over learner achievement. Finally, physicians must reflect on what they perceive as the CME experience itself and weigh the costs and lost learning opportunities of attendance at ineffective didactic sessions against participating in interactive, challenging, and sequenced activities that have increased potential for positively affecting their performance and the health of the patients they serve—the most important outcome of all.

Difford F. General practitioners' attendance at courses accredited for the postgraduate education allowance.  Br J Gen Pract.1992;42:290-293.
Goulet F, Gagnon RJ, Desrosiers G, Jacques A, Sindon A. Participation in CME activities.  Can Fam Physician.1998;44:541-548.
Curry L, Putnam W. Continuing medical education in Maritime Canada: the methods physicians use, would prefer and find most effective.  CMAJ.1981;124:563-566.
Accreditation Council for Continuing Medical Education.  The ACCME Report. Chicago, Ill: Accreditation Council for Continuing Medical Education; 1997.
Mazmanian PE, Harrison RV, Osborne CE. Diversity across medical schools: programs, enrollment, and fees for continuing medical education.  J Continuing Educ Health Professions.1990;10:23-33.
Grimshaw JM, Russell IT. Effect of clinical guidelines on medical practice: a systematic review of rigorous evaluations.  Lancet.1993;342:1317-1322.
Davis DA, Taylor-Vaisey AL. Translating guidelines into practice: a systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines.  CMAJ.1997;157:408-416.
Lau J, Antman EM, Jimenez-Silva J, Kupelnick B, Mosteller F, Chalmers TC. Cumulative meta-analysis of therapeutic trials for myocardial infarction.  N Engl J Med.1992;327:248-254.
Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA.and the Cochrane Effective Practice and Organisation of Care Review Group.  Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings.  BMJ.1998;317:465-468.
Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies.  JAMA.1995;274:700-705.
Thomson MA, Freemantle N, Wolf F, Davis DA, Oxman AD. Educational meetings, workshops and preceptorships to improve the practice of health professionals and health care outcomes (Cochrane Protocol on CD-ROM). Oxford, England: Cochrane Library, Update Software; 1999; issue 1.
Haynes RB, Davis DA, McKibbon A, Tugwell P. A critical appraisal of the efficacy of continuing medical education.  JAMA.1984;251:61-64.
Davis DA, Thomson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials.  JAMA.1992;268:1111-1117.
Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice.  CMAJ.1995;153:1423-1431.
Brookfield SD. Understanding and Facilitating Adult Learning: A Comprehensive Analysis of Principles and Effective Practices. San Francisco, Calif: Jossey-Bass Publishers; 1986.
Cross KP. Adults as Learners: Increasing Participation and Facilitating Learning. San Francisco, Calif: Jossey-Bass Publishers; 1981.
Knowles MS. The Modern Practice of Adult Education: Andragogy Versus Pedagogy. New York, NY: New York Association Press; 1970.
Schon DA. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco, Calif: Jossey-Bass Publishers; 1990.
Taylor-Vaisey AL. Information needs of CME providers: research and development resource base in continuing medical education.  J Contin Educ Health Prof.1995;15:117-121.
Grimshaw JM, Thomson MA; for the Cochrane Effective Practice and Organisation of Care Group. What have new efforts to change professional practice achieved?  J R Soc Med.1998;91(suppl 35):20-25.
Smith TC, Spiegelhalter DJ, Thomas A. Bayesian approaches to random-effects meta-analysis: a comparative study.  Stat Med.1995;14:2685-2699.
Boissel JP, Collet JP, Alborini A, et al; for the PRESAGF Collaborative Group. Education program for general practitioners on breast and cervical cancer screening: a randomized trial.  Rev Epidemiol Sante Publique.1995;43:541-547.
Browner WS, Baron RB, Solkowitz S, Adler LJ, Gullion DS. Physician management of hypercholesterolemia: a randomized trial of continuing medical education.  West J Med.1994;161:572-578.
Clark NM, Gong M, Schork MA, et al. Impact of education for physicians on patient outcomes.  Pediatrics.1998;101:831-836.
Dietrich AJ, O'Connor GT, Keller A, Carney PA, Levy D, Whaley FS. Cancer: improving early detection and prevention; a community practice randomised trial.  BMJ.1992;304:687-691.
Heale J, Davis DA, Norman GR, Woodward C, Neufeld VR, Dodd P. A randomized controlled trial assessing the impact of problem based versus didactic teaching methods in CME.  Proc Conference Res Med Educ Assoc Am Med Coll.1988;27:72-77.
Jennett PA, Laxdal OE, Hayton RC, et al. The effects of continuing medical education on family doctor performance in office practice: a randomized control study.  Med Educ.1988;22:139-145.
Kottke TE, Brekke ML, Solberg LI, Hughes JR. A randomized trial to increase smoking intervention by physicians: Doctors Helping Smokers, Round I.  JAMA.1989;261:2101-2106.
Levinson W. The effects of two continuing medical education programs on communication skills of practicing primary care physicians.  J Gen Intern Med.1993;8:318-324.
Maiman LA, Becker MH, Liptak GS, Nazarian LF, Rounds KA. Improving pediatricians' compliance-enhancing practices.  AJDC.1988;142:773-779.
Ockene IS, Hebert JR, Ockene JK, Merriam PA, Hurley TG, Saperia GM. Effect of training and a structured office practice on physician-delivered nutrition counseling: the Worcester-Area Trial for Counseling in Hyperlipidemia (WATCH).  Am J Prev Med.1996;12:252-258.
Perera DR, LoGerfo JP, Shulenberger E, Ylvisaker JT, Kirz HL. Teaching sigmoidoscopy to primary care physicians: a controlled study of continuing medical education.  J Fam Pract.1983;16:785-788.
Roter DL, Hall JA, Kern DE, Barker LR, Cole KA, Roca RP. Improving physicians' interviewing skills and reducing patients' emotional distress: a randomized clinical trial.  Arch Intern Med.1995;155:1877-1884.
White CW, Albanese MA, Brown DD, Caplan RM. The effectiveness of continuing medical education in changing the behavior of physicians caring for patients with acute myocardial infarction.  Ann Intern Med.1985;102:686-692.
Wilson DM, Ciliska D, Singer J, Williams K, Alleyne J, Lindsay E. Family physicians and exercise counseling: can they be influenced to provide more?  Can Fam Physician.1992;38:2003-2010.
Bero LA, Grilli R, Grimshaw J, Oxman AD, Zwarenstein M. The Cochrane Effective Practice and Organisation of Care Group (EPOC) module (Cochrane Review on CD-ROM). Oxford, England: Cochrane Library, Update Software; 1999; issue 2.
Kanouse DE, Jacoby I. When does information change practitioners' behavior?  Int J Technol Assess Health Care.1988;4:27-33.
Candy PC. Self-Direction for Lifelong Learning: A Comprehensive Guide to Theory and Practice. San Francisco, Calif: Jossey-Bass Publishers; 1991.
Berlo DK. The Process of Communication: An Introduction to Theory and Practice. New York, NY: Holt, Rinehart, & Winston; 1960.
Lomas J, Haynes RB. A taxonomy and critical review of tested strategies for the application of clinical practice recommendations: from "official" to "individual" clinical policy.  Am J Prev Med.1988;4(suppl 4):77-94.
Green LW, Eriksen MP, Schor EL. Preventive practices by physicians: behavioral determinants and potential interventions.  Am J Prev Med.1988;4:101-107.
Moore DEJ, Green JS, Jay SJ, Leist JC, Maitland FM. Creating a new paradigm for CME: seizing opportunities within the health care revolution.  J Contin Educ Health Prof.1994;14:4-31.
Berwick DM. Continuous improvement as an ideal in health care.  N Engl J Med.1989;320:53-56.
Mazmanian PE, Daffron SR, Johnson RE, Davis DA, Kantrowitz MP. Information about barriers to planned change: a randomized controlled trial involving continuing medical education lectures and commitment to change.  Acad Med.1998;73:882-886.
Campbell CM, Parboosingh J, Gondocz T, Babitskaya G, Pham B. Study of the factors influencing the stimulus to learning recorded by physicians keeping a learning portfolio.  J Contin Educ Health Prof.1999;19:16-24.
Lewin K. Field Theory in Social Science: Selected Theoretical Papers. Westport, Conn: Greenwood Press; 1975.
Festinger L. A Theory of Cognitive Dissonance. Stanford, Calif: Stanford University Press; 1957.
Fox RD, Mazmanian PE, Putnam RW. Changing and Learning in the Lives of Physicians. New York, NY: Praeger Publications; 1989.
Begg CB, Berlin JA. Publication bias and dissemination of clinical research.  J Natl Cancer Inst.1989;81:107-115.
Moher D. CONSORT: an evolving tool to help improve the quality of reports of randomized controlled trials.  JAMA.1998;279:1489-1491.
Mason JM, Wood J, Freemantle N. Designing evaluations of interventions to change professional practice.  J Health Serv Res Policy.1999;4:106-111.
Wood J, Freemantle N. Choosing an appropriate unit of analysis in trials of interventions that attempt to influence practice.  J Health Serv Res Policy.1999;4:44-88.
Freemantle N, Wood J. Cluster randomised trials: standardised approach to analysing and reporting these trials is misguided.  BMJ.1999;318:1286.
Freemantle N, Wood J, Crawford F. Evidence into practice, experimentation and quasi experimentation: are the methods up to the task?  J Epidemiol Community Health.1998;52:75-81.
Curry L, Purkis IE. Validity of self-reports of behavior changes by participants after a CME course.  J Med Educ.1986;61:579-584.
Parboosingh J, Gondocz ST. The Maintenance of Competence Program (MOCOMP): motivating specialists to appraise the quality of their continuing medical education activities.  Can J Surg.1993;36:29-32.
American Medical Association.  The Physician's Recognition Award Information Booklet. Chicago, Ill: American Medical Association; 1998.

Tables

Table 1. Description of Studies Included*
Table 2. Studies of the Impact of CME vs a Control Group by Intervention Type and Intensity*