Original Contribution

Improving Residents' Compliance With Standards of Ambulatory Care: Results From the VA Cooperative Study on Computerized Reminders

John G. Demakis, MD; Charles Beauchamp, MD, PhD; William L. Cull, PhD; Robbin Denwood, RN, MSN, MBA; Seth A. Eisen, MD, MSc; Richard Lofgren, MD, MPH; Kristen Nichol, MD, MPH; James Woolliscroft, MD; William G. Henderson, PhD; for the Department of Veterans Affairs Cooperative Study Group on Computer Reminders in Ambulatory Care

Author Affiliations: VA Health Services Research and Development Service, Washington, DC (Dr Demakis); Department of Ambulatory Care, Durham VAMC, Durham, NC (Dr Beauchamp); Hines VA Cooperative Studies Program Coordinating Center, Hines, Ill (Drs Cull and Henderson and Ms Denwood); Department of Rheumatology, St Louis VAMC, St Louis, Mo (Dr Eisen); Department of General Internal Medicine, Medical College of Wisconsin, Milwaukee (Dr Lofgren); Department of Internal Medicine, Minneapolis VAMC, Minneapolis, Minn (Dr Nichol); and Department of General Medicine, University of Michigan, Ann Arbor (Dr Woolliscroft).


JAMA. 2000;284(11):1411-1416. doi:10.1001/jama.284.11.1411.

Context Computerized systems to remind physicians to provide appropriate care have not been widely evaluated in large numbers of patients in multiple clinical settings.

Objective To examine whether a computerized reminder system operating in multiple Veterans Affairs (VA) ambulatory care clinics improves resident physician compliance with standards of ambulatory care.

Design, Setting, and Participants A total of 275 resident physicians at 12 VA medical centers were randomly assigned in firms or half-day clinic blocks to either a reminder group (n = 132) or a control group (n = 143). During a 17-month study period (January 31, 1995–June 30, 1996), the residents cared for 12,989 unique patients for whom at least 1 of the studied standards of care (SOC) was applicable.

Main Outcome Measures Compliance with 13 SOC, tracked using hospital databases and encounter forms completed by residents, compared between residents in the reminder group vs those in the control group.

Results Measuring compliance as the proportion of patients in compliance with all applicable SOC by their last visit during the study period, the reminder group had statistically significantly higher rates of compliance than the control group for all standards combined (58.8% vs 53.5%; odds ratio [OR], 1.24; 95% confidence interval [CI], 1.08-1.42; P = .002) and for 5 of the 13 standards examined individually. Measuring compliance as the proportion of all visits for which care was indicated in which residents provided proper care, the reminder group also had statistically significantly higher rates of compliance than the control group for all standards combined (17.9% vs 12.2%; OR, 1.57; 95% CI, 1.45-1.71; P<.001) and for 9 of the 13 standards examined individually. The benefit of reminders, however, declined throughout the course of the study, even though the reminders remained active.

Conclusions Our data indicate that reminder systems installed at multiple sites can improve residents' compliance with multiple SOC. The benefits of such systems, however, appear to deteriorate over time. Future research needs to explore methods to better sustain the benefits of reminders.

Randomized controlled trials of computerized reminders (CRs) to physicians to improve their compliance with specific standards of care (SOC) were first reported in 1976.1 During the next 17 years, other randomized controlled trials2-21 were reported that compared the use of CRs with manual reminders or no reminders. In some of the trials,11,16,17 CRs were given to patients as well as physicians, but generally, CRs were given only to resident physicians.

Although the overall effects of the CRs were positive, there were serious limitations to the published studies. All studies were single-site trials, and nearly all involved relatively few patients (eg, between 32 and 1460 patients, with the exception of 2 studies of 7000 patients7 and 12,467 patients17). Since most of the computer systems were developed to meet the needs of a particular institution, there was little opportunity to extend the studies beyond the single sites. The number of conditions for which CRs were generated varied from 1 to 11, with the exception of 1 study of more than 100 conditions. Five studies involved 5 or fewer reminder conditions. All CRs were for preventive care such as influenza and pneumococcal vaccines or mammography. Two sites accounted for 11 of the previously published trials.1-3,7-9,11-15

The Department of Veterans Affairs centralized database and Cooperative Studies Program provided a unique opportunity to assess the generalizability and overall utility of using CRs to improve the quality of care involving multiple sites and using a host of different SOC. Specifically, we examined whether providing CRs of well-accepted SOC to resident physicians in ambulatory care clinics can increase compliance with those standards. This study also addressed (1) whether CRs can work in multiple sites around the country with the same database and (2) whether CRs can work with multiple reminders that include treatment as well as prevention interventions.

Participants

A total of 275 resident physicians from 12 Veterans Affairs (VA) medical centers participated in the study. Resident physicians were chosen for participation because they were the VA physicians who were most involved in patient primary care at the time of the study. During the course of the study, the residents cared for 18,700 unique patients, and 12,989 of these patients were eligible for at least 1 of the investigated SOC.

SOC Development

The SOC were chosen on the basis of (1) their importance to the Department of Veterans Affairs ambulatory care patient population, which is composed mainly of middle-aged and elderly males with chronic diseases, and (2) the feasibility of identifying diagnoses, exclusionary factors, treatments, procedures, and instructions via the hospital computer system. In the planning stages of the study, a committee of Chiefs of Ambulatory Care Services in the VA generated a preliminary list of SOC using existing published medical guidelines and literature search strategies. More than 30 potential SOC were identified. The capability of the computer system to monitor the standards was then assessed to identify the 13 standards used in the trial (Table 1).

Table 1. The 13 Standards of Care Used in the Trial
Study Design and Procedures

The study was a clinical trial comparing the performance of residents receiving CRs with the performance of residents not receiving CRs. Data collection for the study began January 31, 1995, and ended June 30, 1996. At each site, resident physicians were assigned to either the reminder group or the control group. For sites using a firm or team system, each firm was randomly assigned to 1 of the 2 groups, and all residents in the firm were assigned to that group. For sites not using a firm system, half-day blocks of residents were randomly assigned to the reminder or control groups. Firms or half-day blocks of residents, rather than individual residents, were randomized to reduce communication between members of the intervention and control arms. Also, a concerted effort was made at each site to have residents from the same firm or block substitute for one another when necessary to prevent contamination. A total of 153 residents were assigned to the control group and 146 to the reminder group; 143 residents in the control group and 132 in the reminder group completed the study. Of the 24 residents who did not complete the study, 12 had residencies that concluded before the study ended, 4 left their residencies prematurely, and 8 remained in their residencies but no longer wished to participate. There was no significant difference in dropout rates between the treatment groups (P = .33). The 275 residents who completed the study exceeded the target sample size of 260 residents.

At the beginning of the study, residents in both the reminder and the control groups were asked to complete a questionnaire exploring their knowledge of and attitudes toward the SOC being studied. Later, they attended a 1-hour instruction session in which the principal investigator at each site discussed the rationale for and benefits of the SOC being studied. During the instruction session, all residents received a booklet that listed the SOC and provided the rationale and several references supporting each standard. Residents in the reminder group were also given an introduction to the reminder system, consisting of a 1- to 2-hour education session that explained the general value of reminder systems and a videotape, designed specifically for this study, that demonstrated in detail how the reminder system worked.

Data were collected for at least 2 weeks before reminders were activated to provide a period for comparing baseline adherence rates between the reminder and control groups for a sample of residents' patients. Once the intervention period began, CRs were presented to residents in 2 ways to ensure that they would see each reminder prior to evaluating each patient. First, each examination room had a computer terminal connected to the hospital computer server. When a resident in the intervention group entered a patient name into the computer, all reminders pertaining to that patient were automatically presented in bold letters. Each reminder consisted of a notification that the SOC applied to the patient, accompanied by a brief rationale for the standard. Second, a computer-generated summary (typically 6-8 pages) of a patient's health, including a list of his or her medical conditions and most recent clinic visits, was routinely placed at the beginning of the medical chart on the day of the clinic visit. For the intervention group, this health summary was modified to include all reminders that pertained to the patient; this information appeared on page 2 of the health summary, after patient-identifying information. Control group residents continued to receive standard health summaries without the reminders.
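
As an illustration of the first delivery path, the sketch below shows how a terminal lookup might assemble the bolded reminder text from a list of standards. This is a minimal sketch, not the study's actual hospital software; the Patient and Standard structures and the applies/satisfied checks are hypothetical.

```python
# Minimal sketch (not the study's hospital software) of assembling reminder
# text when a patient is looked up. All data structures here are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, List, Set

@dataclass
class Standard:
    name: str                                 # eg, "Pneumococcal vaccination"
    rationale: str                            # brief rationale shown with the reminder
    applies: Callable[["Patient"], bool]      # is the SOC applicable to this patient?
    satisfied: Callable[["Patient"], bool]    # is proper care already documented?

@dataclass
class Patient:
    patient_id: str
    diagnoses: Set[str] = field(default_factory=set)
    procedures: Set[str] = field(default_factory=set)

def reminders_for(patient: Patient, standards: List[Standard]) -> List[str]:
    """Return one reminder line per applicable, unmet standard."""
    return [
        f"REMINDER: {soc.name} is indicated. {soc.rationale}"
        for soc in standards
        if soc.applies(patient) and not soc.satisfied(patient)
    ]
```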

At no point in the study was any performance feedback given to the hospitals or the residents concerning individual resident adherence levels or overall hospital adherence levels. Accordingly, residents were never evaluated based on their compliance with the SOC.

Adherence Definitions

Adherence to the SOC was the primary dependent variable and was measured for all patients visiting a study physician who were eligible for 1 or more of the SOC. Adherence was measured in 2 ways: general adherence and visit-specific adherence. General adherence used a patient's last visit as a reference point and measured the proportion of patients eligible for an SOC who were in compliance with that SOC at a time point following their last visit. This measure did not determine the specific point in time during the study when the care was provided, only whether care had been provided by the last visit.

Visit-specific adherence, on the other hand, measured whether residents provided care according to the SOC at the time of a specific patient visit. This was accomplished by tracking adherence both before and after each visit. Before each visit, it was determined, based on review of prior care, whether a patient needed to receive specific services. The adherence measurement after the visit then identified whether residents provided care according to the SOC. This measure allowed multiple observations in the database for the same patient and SOC combination if a patient visited the clinic repeatedly and the suggested care had not been provided at the time of each visit.

For both measures, adherence was treated dichotomously, with adherence equal to 1 if proper care had been given to the patient and to 0 if proper care had not been given. An adherence measurement was included in the database for all SOC for which the patient was eligible.
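
To make the two definitions concrete, here is a minimal sketch of how the 0/1 observations might be constructed from visit-level records. The record layout (patient_id, soc, visit_date, care_given) is an assumption for illustration, not the study's actual database schema.

```python
# Illustrative construction of the two adherence measures; record fields
# (patient_id, soc, visit_date, care_given) are hypothetical.
from collections import defaultdict

def general_adherence(records):
    """One 0/1 observation per (patient, SOC) pair: 1 if proper care was
    documented at any point by the patient's last visit, else 0."""
    by_pair = defaultdict(int)
    for r in records:
        key = (r["patient_id"], r["soc"])
        by_pair[key] = max(by_pair[key], int(r["care_given"]))
    return list(by_pair.values())

def visit_specific_adherence(records):
    """One 0/1 observation per visit at which the SOC was still indicated:
    1 if the resident provided the care at that visit, else 0."""
    observations, already_met = [], set()
    for r in sorted(records, key=lambda r: r["visit_date"]):
        key = (r["patient_id"], r["soc"])
        if key in already_met:
            continue                      # care given earlier; no longer indicated
        observations.append(int(r["care_given"]))
        if r["care_given"]:
            already_met.add(key)
    return observations
```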

SOC Tracking Computer Program

Table 1 provides a list of the 13 SOC that were studied. Eligibility and compliance with the various SOC were determined using information from 2 sources: (1) an encounter form completed at the time of each visit and (2) each hospital's computer system. The encounter forms were optical scan forms that contained a list of 152 diagnoses, 20 procedures, and 17 patient instructions, such as a discussion of the benefits of exercise or proper nutrition. The diagnoses, procedures, and instructions listed on the encounter form were those that were most likely to be reported in the VA ambulatory clinics. Residents were instructed to mark each diagnosis that was treated, each procedure that was administered, and each instruction that was given. There were also spaces on the encounter form to enter "other" diagnoses, instructions, or procedures.

For this study, computer software was developed that downloaded the information from the encounter forms and integrated this information with hospital information about patients' demographics, prescriptions, treatments, and laboratory test results. The program then identified all eligible patients and determined whether proper care had been given to those patients. This program was run weekly at each site.
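
A rough sketch of what one weekly integration pass might look like is shown below, using the atrial fibrillation/antithrombotic standard as the example. The file names, column names, and the simplified eligibility and compliance rule are assumptions, not the study's actual program.

```python
# Hypothetical sketch of one weekly integration pass: combine scanned
# encounter-form rows with a hospital pharmacy extract, then flag eligibility
# and compliance for one example standard.
import pandas as pd

encounters = pd.read_csv("encounter_forms.csv")   # patient_id, visit_date, diagnoses_marked
rx = pd.read_csv("hospital_rx.csv")               # patient_id, drug (hospital pharmacy data)

# Patients with any antithrombotic prescription on file
on_antithrombotic = rx.groupby("patient_id")["drug"].apply(
    lambda drugs: drugs.isin(["warfarin", "aspirin", "ticlopidine"]).any()
)

# Eligible: atrial fibrillation marked on an encounter form;
# compliant: an antithrombotic is documented in the pharmacy extract.
afib = encounters[
    encounters["diagnoses_marked"].str.contains("atrial fibrillation", case=False, na=False)
]
report = afib[["patient_id"]].drop_duplicates()
report["eligible"] = True
report["compliant"] = report["patient_id"].map(on_antithrombotic).fillna(False)
print(report)
```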

Several actions were taken to ensure that adherence was being tracked accurately. First, prior to data collection, each of the SOC was individually checked at the 12 sites by a data evaluation committee that sampled cases and checked the accuracy of the integration program against patient charts and hospital files. Because of the considerable time it took the data evaluation committee to check the computer program for each SOC at each site, the exact timing of data collection varied for the different SOC within and among sites.

Second, at each site, monthly completion rates were determined to guarantee that residents were using the encounter forms. The average completion rate across all sites was 96.9% (range, 92.2%-100%).

Finally, for roughly 5% of the encounter forms that were completed each month, audits were conducted to check the accuracy of the marked information. Specifically, these audits assessed the concordance between information marked on the encounter form and information written in the medical chart. Most of the diagnoses (76.9%) were present both in the medical chart and on the encounter form. It was expected that some diagnoses would not be found on the encounter forms because the residents were instructed to only mark diagnoses that they treated at that visit.

Statistical Analyses

Statistical analyses were based on the sample of 12,989 unique patients who were eligible for 1 or more of the SOC studied. An α level of .05 or 95% confidence interval (CI) was used for all statistical analyses (exact P values are also reported).

The dependent measure of interest in the study was adherence to the SOC. Because adherence is potentially influenced by both patient and physician variables, multilevel logistic regression was used to analyze adherence as a dichotomous variable at the patient level while accounting for the clustering of patients within resident physicians. This was accomplished using generalized estimating equations.22 For visit-specific adherence, multilevel logistic regression analyses that accounted for the clustering of visits within patients were used to test the statistical significance of the differences between the groups.
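
The following is a small, self-contained sketch of the kind of clustered logistic analysis described above, fit with generalized estimating equations on synthetic stand-in data. The statsmodels call stands in for whatever software the authors actually used, and the variable names and simulated data are assumptions.

```python
# Sketch of a GEE logistic model for adherence clustered within residents.
# Synthetic stand-in data; not the authors' analysis code.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({"resident_id": rng.integers(0, 275, n)})        # clustering unit
df["group"] = np.where(df["resident_id"] % 2 == 0, "reminder", "control")
# Simulate adherence near the reported overall rates (58.8% vs 53.5%)
p = np.where(df["group"] == "reminder", 0.588, 0.535)
df["adherent"] = rng.binomial(1, p)

model = smf.gee(
    "adherent ~ C(group, Treatment(reference='control'))",
    groups="resident_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(np.exp(result.params))   # exponentiated coefficient approximates the odds ratio
```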

Descriptive Results

The mean age of the resident physicians at the start of the study was 28.4 years (SD = 3.12 years). There were more male residents (70.9%) than female residents (29.1%), and 17.5% of the residents graduated from foreign medical schools. As part of the knowledge and attitudes survey that was given prior to the study, residents used a 5-point scale (5 = highest) to rate for each SOC their level of agreement with the standard, the level of research evidence supporting the standard, and the feasibility of implementing the standard at their hospital. The overall mean ratings for the standards were 4.3, 4.0, and 3.6 for agreement, research support, and feasibility, respectively.

The mean age of the patients was 65.9 years (SD = 10.9 years), and 98.4% were male. There were no statistically significant differences in physician or patient characteristics at baseline between the reminder and control groups. There were also no significant differences in baseline general adherence between the control group and the reminder group for all SOC combined (55.3% vs 53.0%) and for 11 of the 13 SOC examined individually (Table 2). There was wide variation in the baseline adherence rates for the different standards, ranging from 4.4% for pneumococcal vaccination to 79.1% for atrial fibrillation-warfarin, aspirin, or ticlopidine.

Table 2. Comparisons of Baseline General Adherence for Reminder and Control Groups*
Intervention Adherence

Table 3 presents adherence rates for the reminder and control groups during the intervention period. The general adherence method was used. These rates represent the proportion of patients seen by a study physician during the intervention period who were in compliance with the SOC by their last visit. The results show that the reminder group had a statistically significantly higher adherence rate (58.8%) compared with the control group (53.5%) for all SOC combined (odds ratio [OR], 1.24; 95% CI, 1.08-1.42). Significantly higher adherence rates were also found for 5 of the 13 SOC analyzed individually, and another 6 SOC showed nonsignificant differences favoring the reminder group. The largest effect was for pneumococcal vaccination (12.7% vs 4.3%; OR, 3.26; 95% CI, 2.09-5.09).
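
As a quick arithmetic check, the unadjusted odds ratio implied by the two pooled proportions can be recovered directly; it happens to match the model-based OR reported above, which additionally accounts for clustering.

```python
# Crude odds ratio from the pooled adherence proportions quoted above
p_reminder, p_control = 0.588, 0.535
odds_reminder = p_reminder / (1 - p_reminder)   # ≈ 1.43
odds_control = p_control / (1 - p_control)      # ≈ 1.15
print(round(odds_reminder / odds_control, 2))   # 1.24, matching the reported OR
```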

Table 3. The Effect of Reminders on General Adherence

The change in adherence rates for all SOC combined from the baseline to the intervention period favored the reminder group at 11 of the 12 participating sites. The reminder group showed statistically significantly greater improvement in adherence compared with controls (P≤.05) at 4 of the 12 sites.

Visit-Specific Adherence

Table 4 shows a comparison of residents' visit-specific adherence in the control and reminder groups for all SOC combined and for each SOC individually. As was previously shown for general adherence, the reminder group had a significantly higher rate of visit-specific adherence (17.9%) than the control group (12.2%) for all SOC combined (OR, 1.57; 95% CI, 1.45-1.71) and for 9 of the SOC examined individually. Large effects were found for pneumococcal vaccine (OR, 7.85; 95% CI, 3.83-16.08), diabetes/peripheral vascular disease-foot examination (OR, 2.57; 95% CI, 2.02-3.26), and for diabetes-eye examination (OR, 2.19; 95% CI, 1.63-2.94). The 9 SOC that showed significant benefits of reminders were all activated in the early phase of the study.

Table 4. The Effect of Reminders on Visit-Specific Adherence to Standards of Care*

Figure 1 plots the rates of visit-specific adherence across the course of the study for the 9 SOC that were initiated early in the study. The results show that residents' responsiveness to the reminders decreased throughout the study. This treatment by time interaction was statistically significant (P < .001). Simple effect analyses showed that adherence rates significantly declined across time (P<.001) for the reminder group, but the adherence rates for the control group were unaffected by time (P = .16).

Figure. Adherence by Time and Treatment

The purpose of this VA cooperative study was to test the hypothesis that CRs of well-accepted SOC can effect an increase in residents' compliance with multiple standards of ambulatory care across multiple medical centers. In addition to involving multiple centers and concurrently using 13 SOC, this clinical trial was unique in involving both preventive measures and treatment for specific diagnoses, in having a large sample size, in measuring adherence with both general and visit-specific analyses, and in using a lengthy intervention period that allowed the tracking of adherence rates over time.

The trial furthered previous research by demonstrating higher adherence with the reminder system across 12 medical centers for the combined 13 SOC. The general adherence rates increased from an already fairly high 53.0% at baseline to 58.8% during the intervention period for the reminder group (a 10.9% relative increase or a 5.8 percentage point absolute increase), while the control group showed a slight decrease from 55.3% adherence at baseline to 53.5% adherence in follow-up (a 3.3% relative decrease or a 1.8 percentage point absolute decrease). However, the increase in the general adherence rate in the CR group was not as large as that reported by some other studies.23
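
The relative and absolute changes quoted above follow directly from the adherence percentages; a short check for the reminder group:

```python
# Reminder group: baseline 53.0% -> intervention 58.8%
baseline, follow_up = 53.0, 58.8
absolute = follow_up - baseline                  # in percentage points
relative = 100 * absolute / baseline             # percent change from baseline
print(round(absolute, 1), round(relative, 1))    # 5.8 and 10.9
```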

Visit-specific adherence rates in both the reminder and control groups were low. Residents' failure to deliver the suggested care more regularly may be related to busy clinic schedules combined with a belief that the standards may not have an immediate impact on patients' health. The standards with the lowest adherence rates tended to be prevention rather than treatment oriented, such as smoking cessation counseling and pneumococcal vaccination. Still, the fairly high number of patients in compliance with the standards by the time of their last visit, as indicated by the general adherence measure, shows that patients' compliance with the standards is determined by more than residents' actions at any single ambulatory care visit. Patients often attend the ambulatory clinics multiple times in a year, giving physicians several opportunities to provide the recommended care, and in many instances, patients may receive the suggested care as an inpatient or as part of larger hospitalwide patient education efforts.

This study demonstrated that improvements in adherence could be realized for multiple standards concurrently across multiple sites. However, enthusiasm for the reminder system was tempered by the finding that physicians became less likely to respond to reminders across the time course of the study. Previous studies have shown sizable decrements in SOC compliance during wash-out periods that followed reminder deactivation. We showed a similar decrement, in spite of maintaining the reminders.

We observed that the initial positive response to CRs systematically declined across the course of the study even though the reminders remained active. There are several potential reasons for this finding. One possibility is that external changes in the practice of medicine led residents to discount the reminders. However, no similar decrement in performance was observed for the control group, which would be expected if residents had begun questioning the standards. Another, more likely, explanation is that in a busy clinic the competing demands on residents' time led to inattention to the reminders.

A lack of feedback concerning residents' performance may have contributed to the observed decline. Perhaps providing performance feedback24-26 and/or educational reinforcement to residents would help sustain the positive effects of reminders. Nonetheless, our data indicated that computer reminders installed at multiple sites can improve compliance with multiple SOC.

References

McDonald C. Use of a computer to detect and respond to clinical events: its effect on clinician behavior.  Ann Intern Med.1976;84:162-167.
McDonald CJ. Protocol-based computer reminders, the quality of care and the non-perfectability of man.  N Engl J Med.1976;295:1351-1355.
McDonald CJ, Wilson GA, McCabe GP. Physician response to computer reminders.  JAMA.1980;244:1579-1581.
Barnett GO, Winickoff RN, Dorsey JL.  et al.  Quality assurance through automated monitoring and concurrent feedback using a computer-based medical information system.  Med Care.1978;16:962-970.
Barnett GO, Winickoff RN, Morgan MM, Zielstorff RD. A computer-based system for follow-up of elevated blood pressure.  Med Care.1983;21:400-409.
Goldman L, Weinberg M, Weisberg M.  et al.  A computer-derived protocol to aid in the diagnosis of emergency room patients with acute chest pain.  N Engl J Med.1982;307:588-596.
McDonald C, Hui S, Smith D.  et al.  Reminders to physicians from an introspective computer medical record.  Ann Intern Med.1984;100:130-138.
McDonald C, Hui S, Tierney W. Effects of computer reminders for influenza vaccination on morbidity during influenza epidemics.  MD Comput.1992;9:304-312.
Tierney W, Hui S, McDonald C. Delayed feedback of physician performance versus immediate reminders to perform prevention care.  Med Care.1986;24:659-666.
Turner B, Day S, Borenstein B. A controlled trial to improve delivery of preventive care: physician or patient reminders.  J Gen Intern Med.1989;4:403-409.
McDowell I, Newell C, Rosser W. A randomized trial of computerized reminders for blood pressure screening in primary care.  Med Care.1989;27:297-305.
McDowell I, Newell C, Rosser W. Computerized reminders to encourage cervical screening in family practice.  J Fam Pract.1989;28:420-424.
Rosser W, Hutchison B, McDowell I, Newell C. Use of reminders to increase compliance with tetanus booster vaccination.  CMAJ.1992;146:911-917.
McDowell I, Newell C, Rosser W. Comparison of three methods of recalling patients for influenza vaccination.  CMAJ.1986;135:991-997.
Rosser W, McDowell I, Newell C. Use of reminders for prevention procedures in family medicine.  CMAJ.1991;145:807-813.
Becker D, Gomez E, Kaiser D.  et al.  Improving prevention care at a medical clinic: how can the patient help?  Am J Prev Med.1989;5:353-359.
Ornstein S, Garr D, Jenkins R.  et al.  Computer-generated physician and patient reminders: tools to improve population adherence to selected prevention services.  J Fam Pract.1991;32:82-90.
Litzelman DK, Dittus R, Miller M, Tierney W. Requiring physicians to respond to computerized reminders improves their compliance with prevention care protocols.  J Gen Intern Med.1993;8:311-317.
McPhee S, Bird JA, Jenkins C, Fordham D. Promoting cancer screening.  Arch Intern Med.1989;149:1866-1872.
Landis S, Hulkower S, Peirson S. Enhancing adherence with mammography through patient letters and physician prompts.  N C Med J.1992;53:575-578.
McPhee S, Bird JA, Fordham D.  et al.  Promoting cancer prevention activities by primary care physicians.  JAMA.1991;266:538-544.
Lipsitz SR, Fitzmaurice GM, Orav EJ, Laird NM. Performance of generalized estimating equations in practical situations.  Biometrics.1994;50:270-278.
Shea S, DuMouchel W, Bahamonde L. A meta-analysis of 16 randomized controlled trials to evaluate computer-based clinical reminder systems for preventive care in the ambulatory setting.  J Am Med Inform Assoc.1996;3:399-409.
Wensing M, Grol R. Single and combined strategies for implementing changes in primary care.  Int J Qual Health Care.1994;6:115-132.
Davis D. Does CME work? an analysis of the effect of educational activities on physician performance or health care outcomes.  Int J Psychiatry Med.1998;28:21-39.
Sicotte C, Pineault R, Tilquin C, Contandriopoulos A. The diluting effect of medical work groups on feedback efficacy in changing physician's practice.  J Behav Med.1996;19:367-383.