Original Contribution

Relationship Between Low Quality-of-Care Scores and HMOs' Subsequent Public Disclosure of Quality-of-Care Scores

Danny McCormick, MD, MPH; David U. Himmelstein, MD; Steffie Woolhandler, MD, MPH; Sidney M. Wolfe, MD; David H. Bor, MD

Author Affiliations: Department of Medicine, Cambridge Hospital and Harvard Medical School, Cambridge, Mass (Drs McCormick, Himmelstein, Woolhandler, and Bor) and Public Citizen Health Research Group, Washington, DC (Dr Wolfe).


JAMA. 2002;288(12):1484-1490. doi:10.1001/jama.288.12.1484.

Context Public disclosure of quality data on health maintenance organizations (HMOs) might improve public accountability, inform consumer decision making, and promote quality improvement. However, because disclosure is voluntary, some HMOs could subvert these objectives by refusing to release unfavorable data.

Objective To determine the association between HMO quality of care and withdrawal from public disclosure of quality-of-care data the subsequent year.

Design and Setting Retrospective cohort study of administrative and quality-of-care data on HMOs from the National Committee for Quality Assurance (NCQA) annual Quality Compass databases for 1997, 1998, and 1999, including Health Plan Employer Data and Information Set (HEDIS) quality scores.

Main Outcome Measure One-year rates of HMO withdrawal from public disclosure of HEDIS scores for plans in the highest and lowest tertiles of HEDIS scores, adjusted for method of data collection and plan model type.

Results Of the 329 HMOs that publicly disclosed HEDIS scores in 1997, 161 plans (49%) withdrew from public disclosure in 1998. Of the 292 HMOs that disclosed their scores in 1998 (including 130 newly participating plans), 67 plans (23%) withdrew from public disclosure in 1999. Plans whose scores ranked in the lowest-quality tertile were much more likely than plans ranking in the highest-quality tertile to withdraw from public disclosure in 1998 (odds ratio [OR], 3.6; 95% confidence interval [CI], 2.1-7.0) and 1999 (OR, 5.7; 95% CI, 2.7-17.7).

Conclusion Compared with HMOs receiving higher quality-of-care scores, lower-scoring plans are more likely to stop disclosing their quality data. Voluntary reporting of quality data by HMOs is ineffective; selective nondisclosure undermines both informed consumer decision making and public accountability.


Employers,1 government purchasers of health insurance,2 individual consumers,3 and lawmakers4,5 are seeking more information on the quality of health care. Recently, the President's Commission on Consumer Protection and Quality in the Health Care Industry called for widespread public disclosure of quality data by all health care provider organizations including health plans.6 Public disclosure is seen as a way to enhance informed consumer decision making,7 promote quality improvement,8-10 and increase health plans' accountability for health care delivery.6,11-14

Public disclosure of data on quality by health maintenance organizations (HMOs), except those enrolling Medicare patients, is voluntary. In 1998, only 32.5% of all HMOs disclosed their scores on the National Committee for Quality Assurance (NCQA) Health Plan Employer Data and Information Set (HEDIS) measures,15 the most widely used set of quality indicators. If health plans that refuse to disclose quality data provide inferior care, publicly available data would overstate the average quality of HMO care nationally and result in a distorted picture of how a given plan that discloses quality data compares with that average. Selective nondisclosure could also undermine public accountability and quality improvement efforts by weakening the impetus to improve quality.

Despite the importance of this issue, no peer-reviewed studies have examined the relationship of HMO quality to willingness to disclose quality scores. We linked data for multiple years from the NCQA's annual Quality Compass databases to determine if withdrawal from public disclosure of HEDIS scores was related to an HMO's HEDIS performance 1 year earlier.

Methods

Study Sample

The NCQA currently uses HEDIS measures, a standardized set of clinical quality indicators, as the principal clinical criteria for its HMO accreditation program. Health plans voluntarily submit these data to the NCQA. The NCQA lists HEDIS scores of individual HMOs in its annual Quality Compass database, designed for use by health insurance purchasers and consumers. Until recently, the NCQA allowed plans to decline public disclosure of their HEDIS scores, yet remain fully eligible for NCQA accreditation. Plans may also disclose data privately (eg, to large purchasers) but refuse public disclosure.

To determine which HMOs that disclosed HEDIS scores in 1997 (the "1997 cohort," n = 329) or 1998 (the "1998 cohort," n = 292) withdrew from public disclosure in the subsequent year, we linked the 199716 (the first year of use of HEDIS version 3.0), 1998,17 and 199918 Quality Compass databases (which reflect plan characteristics and performance in 1996, 1997, and 1998, respectively). We used a "link-file" database provided by the NCQA to assist in tracking plans in the Quality Compass databases from year to year, since name changes were common. In addition to identifying plans that withdrew from public disclosure, we identified plans that merged or closed from one year to the next. We also identified HMOs that newly began public disclosure in 1998 or 1999. Last, we telephoned each HMO that we identified as having withdrawn from public disclosure to confirm the plan's identity and whether it had changed its name, merged with another plan, or closed. For the single plan that had closed, we obtained the date of closure from its parent company.

Data Collection

The NCQA requires HMOs to follow a detailed guide that defines each HEDIS measure and specifies standards for data collection. Plans may garner data from administrative records (administrative method) or supplement the administrative method with chart reviews (hybrid method).

For each quality indicator, the plan first draws a sample from the target population (eg, for mammography, women aged 52-69 years continuously enrolled in the HMO for at least 1 year). The HMO then searches administrative records (eg, payment or radiology files) to determine if the intervention occurred within a set time frame (eg, 2 years for a mammogram). If no evidence of the intervention is found, the HMO may choose to search for exclusions (eg, a history of bilateral mastectomy). For the hybrid method, when administrative records fail to give evidence either of the intervention or an exclusion, the plan reviews sampled patients' charts for such evidence. The HEDIS score is calculated as the number of patients who received the intervention divided by the number of eligible patients. The hybrid method, used by most plans for most measures, usually results in higher quality scores.
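The rate calculation described above can be sketched as follows. This is a minimal illustration only: the function and the record structures are hypothetical, and the actual HEDIS specifications impose far more detailed sampling and eligibility rules.

```python
# Illustrative sketch of the HEDIS rate calculation (hypothetical data model).

def hedis_score(sample, found_in_admin, excluded, found_in_chart=None):
    """Rate = patients with the intervention / eligible patients.

    sample: patient IDs drawn from the target population.
    found_in_admin: IDs with evidence in administrative records (eg, claims).
    excluded: IDs with a valid exclusion (eg, bilateral mastectomy).
    found_in_chart: IDs with chart-review evidence (hybrid method only).
    """
    chart = found_in_chart or set()
    eligible = [p for p in sample if p not in excluded]
    hits = [p for p in eligible if p in found_in_admin or p in chart]
    return len(hits) / len(eligible)

# Administrative method: charts are never reviewed.
admin_rate = hedis_score(list(range(10)), {0, 1, 2, 3}, {9})

# Hybrid method: chart review recovers interventions the administrative
# records missed, which is why it usually yields higher scores.
hybrid_rate = hedis_score(list(range(10)), {0, 1, 2, 3}, {9}, {4, 5})
```

The hybrid rate can only equal or exceed the administrative rate for the same sample, consistent with the observation that the hybrid method usually results in higher quality scores.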

The NCQA's Quality Compass databases contain information on other health plan characteristics, including ownership status (ie, investor-owned vs not-for-profit). When data on ownership status were missing, we consulted InterStudy's HMO Directory19,20 and/or telephoned the plan.

Designation of HMO Quality

We assessed HMO quality using all HEDIS measures listed under the NCQA's rubric, "effectiveness of care." This rubric encompasses 13 distinct measures for the 1997 cohort, 4 of which are rates for individual childhood immunizations (measles-mumps-rubella, hepatitis B, diphtheria-pertussis-tetanus, and oral polio virus), and 1 of which is a rate for completion of all 4 of these childhood immunizations. For the 1998 cohort, vaccinations for varicella and Haemophilus influenzae type b were added as measures and are included in the rate for completion of all recommended childhood immunizations for that year. To avoid giving undue weight to childhood immunizations, we analyzed only the combined immunization rate, yielding 9 HEDIS scores for each plan. We ranked HMOs by quality in 2 ways. First, we ranked HMOs according to their score on each of the 9 HEDIS measures separately. Second, we ranked HMOs based on the average of ranks for all individual HEDIS measures for which the plan submitted data. For this latter analysis, we included only plans reporting scores on at least 5 of the 9 HEDIS measures. When more than 1 plan reported the same score, we assigned these plans the same rank. We then divided the plans into tertiles on the basis of their quality ranks. All analyses were performed using SAS.21
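The ranking scheme just described (ties share a rank; plans are then cut into tertiles of their ranks) can be sketched as below. The data and cut rules are illustrative assumptions, not the authors' SAS code; in particular, how the paper broke tertile boundaries when ranks tie at a cutoff is not specified.

```python
# Illustrative sketch of rank-based tertile assignment (hypothetical data).

def rank_scores(scores):
    """Rank plans by score, best (highest score) = 1; tied scores share a rank."""
    order = sorted(set(scores), reverse=True)
    rank_of = {s: i + 1 for i, s in enumerate(order)}
    return [rank_of[s] for s in scores]

def tertile(avg_ranks):
    """Assign 0 (highest quality), 1, or 2 (lowest quality) by rank cutoffs."""
    cutoffs = sorted(avg_ranks)
    n = len(cutoffs)
    lo, hi = cutoffs[n // 3], cutoffs[2 * n // 3]
    return [0 if r <= lo else (2 if r > hi else 1) for r in avg_ranks]

# Six hypothetical plans; the two plans scoring 85 share rank 2.
ranks = rank_scores([90, 85, 85, 70, 60, 50])
groups = tertile(ranks)
```

In the full analysis a plan's average rank across its reported measures (at least 5 of 9) would feed into `tertile` in place of the single-measure ranks shown here.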

Outcomes

Our primary outcome was withdrawal from public disclosure of HEDIS scores 1 year after a previous public disclosure. We defined withdrawal as either (1) a failure to submit any HEDIS scores to NCQA or (2) submission of HEDIS scores but refusal to allow public disclosure. Plans that disclosed even a single HEDIS score or that merged and disclosed pooled HEDIS scores were not considered to have withdrawn. We excluded from our analysis the single plan that closed.

Statistical Analysis

For each of the 9 separate HEDIS measures we classified plans by whether their scores fell in the highest, middle, or lowest tertile of the 329 plans publicly disclosing data in 1997. We then calculated for each quality tertile the proportion of plans that withdrew from disclosure 1 year later and used the χ2 test to compare the proportions withdrawing in the highest and lowest tertiles. We report 2-tailed P values for all tests.
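The tertile comparison reduces to a Pearson χ2 test on a 2 × 2 table of quality tertile vs withdrawal. A hand-rolled sketch with hypothetical counts (the real analysis used SAS, and these numbers are not the study's):

```python
# Pearson chi-square for a 2x2 table, computed from expected cell counts.

def chi_square_2x2(table):
    """table = [[a, b], [c, d]]; returns the chi-square statistic
    (no continuity correction)."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row[i] * col[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical example: 60 of 110 lowest-tertile plans withdrew vs
# 30 of 110 highest-tertile plans.
stat = chi_square_2x2([[60, 50], [30, 80]])
```

A statistic above 3.84 (the χ2 critical value for 1 df at α = .05, 2-tailed) would indicate a significant difference in withdrawal proportions between the two tertiles.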

We repeated this analysis using the 1998 cohort (the 292 plans disclosing data in 1998), classifying plans according to their quality ranks in 1998 and comparing withdrawal rates 1 year later among plans in the highest- and lowest-quality tertiles.

Thus, each analysis examined whether the quality rank in a given year predicted the likelihood of publicly disclosing quality scores in the subsequent year.

We also used multiple logistic regression to estimate the adjusted odds ratio (OR) for withdrawal from public disclosure for HMOs in the lowest vs highest tertile of average plan rank for all 9 measures combined. We considered plan characteristics (model type, geographic location, and method of data collection) as potential covariates. The final multivariate models included only those variables that showed a significant univariate association (P<.05) with the outcome in both cohort years. In addition, because collecting data by the hybrid method produces higher HEDIS scores than the administrative method,22 we controlled for the method of data collection in all multivariate models.
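For intuition, the crude (unadjusted) odds ratio and its 95% CI from a 2 × 2 table can be sketched as follows. This is only the univariate building block with hypothetical counts; the paper's reported ORs came from multiple logistic regression additionally adjusting for data-collection method and model type.

```python
import math

# Crude odds ratio with a 95% Woolf (log-scale) confidence interval.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """2x2 table: exposed group (a withdrew, b did not),
    unexposed group (c withdrew, d did not)."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: lowest tertile 60 withdrew / 50 did not;
# highest tertile 30 withdrew / 80 did not.
or_, lo, hi = odds_ratio_ci(60, 50, 30, 80)
```

Because the CI is built on the log-OR scale, it is asymmetric around the point estimate, as in the intervals reported in the Results (eg, OR 5.7; 95% CI, 2.7-17.7).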

Because our previous research had shown that investor-owned plans achieve lower HEDIS scores than not-for-profit plans, we also explored the interrelationships among ownership status, tertile of average rank on HEDIS scores, and the likelihood of withdrawal from public disclosure with 2 × 2 contingency tables and χ2 tests of significance for both the 1997 and 1998 cohorts. Specifically, we compared investor-owned and not-for-profit plans with regard to the percentage of plans in the lowest tertile of average HEDIS rank and the percentage of plans that withdrew from public disclosure. Last, we compared plans in the upper and lower tertiles of average HEDIS rank with regard to the percentage of plans that withdrew from public disclosure among investor-owned and not-for-profit plans separately.

Finally, to quantify the clinical significance of differences in quality between the highest- and lowest-quality plans, we calculated the mean (SD) rates for each indicator for the highest- and lowest-quality tertile.

Results

Characteristics of the Health Plans

The majority of HMOs in both the 1997 and 1998 cohorts were investor-owned and were independent practice associations or mixed model type plans (Table 1). Plans in both cohorts were most commonly located in the South Atlantic, Mid Atlantic, and East North Central regions.

Table 1. Characteristics of HMOs Allowing Public Disclosure of HEDIS Scores in 1997 and 1998*
HMO Withdrawal From Public Disclosure of HEDIS Scores

A total of 329 HMOs allowed public disclosure of their HEDIS scores in 1997 (Figure 1). Of these plans, 161 (49%) withdrew from public disclosure the following year. In 1998, 292 plans allowed public disclosure of their HEDIS scores. This cohort consisted of 162 plans (after mergers) that allowed public disclosure in 1997, plus 130 newly participating plans. Of these 292 plans, 67 (23%) withdrew from public disclosure in 1999. For both the 1997 and 1998 cohorts, just over half of all plans that withdrew from public disclosure continued to submit HEDIS scores to NCQA (Figure 1).

Figure. Withdrawal From Public Disclosure of HEDIS Scores by HMOs From 1997 to 1999
The number of health maintenance organization (HMO) plans submitting Health Plan Employer Data and Information Set (HEDIS) data to the National Committee for Quality Assurance and allowing public disclosure for 1997, 1998, and 1999 is shown in boxes. Arrows depict changes in disclosure status of plans from 1997-1998 and from 1998-1999. Numbers indicate HMOs that withdrew from public disclosure of HEDIS scores either by submitting data to the National Committee for Quality Assurance but declining public disclosure or by failing to submit data. Also depicted are changes in the number of plans publicly reporting data due to mergers of existing plans or the addition of newly participating plans.
HMO Quality Rank and Withdrawal From Public Disclosure of HEDIS Scores

HEDIS scores among HMOs that allowed public release of their quality data varied widely in both 1997 and 1998 (Table 2). Absolute differences in mean HEDIS scores of plans in the lowest and highest tertiles ranged from 15.6 to 42.3 percentage points in the 1997 cohort and from 14.6 to 37.5 percentage points in the 1998 cohort. For example, the mean immunization completion rate for 13-year-olds in the 1997 cohort was 74.7% for plans in the highest-quality tertile, but only 32.4% for plans in the lowest-quality tertile.

Table 2. Mean HEDIS Scores Among Lowest- and Highest-Ranked HMOs Publicly Disclosing Scores in 1997 and 1998*

Health maintenance organizations in the lowest tertile were significantly more likely to withdraw from public disclosure than plans in the highest tertile for 7 of the 9 measures in the 1997 cohort and for 6 of 9 measures in the 1998 cohort (Table 3). Plans in the lowest tertile were 1.6 to 2.7 times more likely to withdraw from public disclosure than plans in the highest tertile in the 1997 cohort and 2.2 to 7.0 times more likely to withdraw in the 1998 cohort. For 7 of the 9 indicators in the 1997 cohort, more than half of plans in the lowest-quality tertile withdrew from public disclosure of HEDIS scores the subsequent year. Withdrawal rates for the 1998 cohort were somewhat lower; nonetheless, for 8 of 9 indicators, at least 25% of plans in the lowest-quality tertile withdrew the subsequent year.

Table 3. Withdrawal From Public Disclosure of HEDIS Scores According to HMO Quality Rank Tertile 1 Year Earlier*

Health maintenance organizations in the lowest tertile of overall quality (average rank for all 9 HEDIS measures) were more likely to withdraw from public disclosure than plans in the highest tertile in both the 1997 (OR, 3.6; 95% confidence interval [CI], 2.1-7.0) and 1998 (OR, 5.7; 95% CI, 2.7-17.7) cohorts after adjustment for the method of data collection and plan model type (the only plan characteristic consistently correlated with plan withdrawal in univariate analyses).

In our analyses according to plan ownership status, investor-owned plans were more likely than not-for-profit plans to be in the lowest-quality tertile in both the 1997 (RR = 3.4; 95% CI, 1.9-5.8) and 1998 (RR = 1.9; 95% CI, 1.3-2.7) cohorts and to withdraw from public disclosure in both the 1997 (RR = 5.7; 95% CI, 2.6-12.2) and 1998 (RR = 1.3; 95% CI, 0.7-2.5) cohorts, although this difference was not statistically significant for the latter cohort. It appears, however, that poor quality rather than profit status per se was the primary determinant of withdrawal from public disclosure. The poorest-quality plans were more likely to withdraw from disclosure than the best-quality plans among both investor-owned plans (RR = 1.5; 95% CI, 1.1-2.1), and not-for-profit plans (RR = 2.2; 95% CI, 0.5-10.4) in the 1997 cohort. Similar results were obtained in the 1998 cohort for both investor-owned (RR = 2.7; 95% CI, 1.1-6.6) and not-for-profit plans (RR = 20.0; 95% CI, 2.8-149.8).

Comment

While the total number of HMOs that publicly disclosed HEDIS quality scores changed little each year from 1997 to 1999, the composition of this group changed substantially. Forty-eight percent of plans in the 1997 cohort and 23% of plans in the 1998 cohort withdrew from public disclosure 1 year later. Quality scores varied substantially among HMOs, and lower-scoring plans were much more likely to withdraw.

No previous peer-reviewed studies have examined the relationship between HMO quality scores and withdrawal from participation in public disclosure of scores in the HEDIS program. Previous yearly NCQA reports have documented that nondisclosing plans score poorly. However, our longitudinal analyses provide a quite different view than these reports based on cross-sectional data. Our approach encompasses plans that drop out of the HEDIS program entirely, in addition to those that refuse disclosure. For example, a 1998 NCQA report that provided data on nondisclosing plans included only 88 of the 161 (nondisclosing and drop-out) plans we analyzed.

We also delineate, for the first time, the shifting cohort of HEDIS participants and disclosers. Many plans are disclosers one year and nondisclosers the next (or vice versa). Hence the manipulation of the HEDIS monitoring system is more pervasive than is apparent from the NCQA's cross-sectional comparisons. The differences in analytic approaches also give rise to quite different interpretations. The NCQA suggests that the better cross-sectional performance of disclosing plans is evidence that their quality monitoring system is working. In contrast, our longitudinal data imply that gaming of the system is so extensive as to potentially undermine the quality monitoring process.

Why do HMO executives at lower-scoring plans choose to withdraw from reporting of HEDIS scores? They might believe that their plan's low HEDIS scores result from inadequate data collection methods that could understate true quality. Perhaps some suspect that their plan will suffer from biased comparisons since not all plans' data were audited, especially in the earlier years of the HEDIS program. They may become aware of such issues only after disclosing HEDIS scores at least once. Some executives may regard the costs of data collection as too high,23-25 which could explain why some higher-scoring plans withdrew from HEDIS participation. But costs cannot explain most withdrawals; about half of the plans withdrawing from public disclosure still collected and submitted HEDIS scores (Figure 1).

The most likely explanation for our findings is that many plans withdraw because they fear (or know) that they will score low again. Low scores might place such plans at a marketing disadvantage, especially if nondisclosure carries little stigma. Regardless of the explanation, however, our results imply that voluntary disclosure of quality data, the primary national mechanism for HMO quality oversight, is failing to meet its stated goals of informing consumer decision making, providing incentives to improve quality and increasing public accountability.

The NCQA's HEDIS program represents the most comprehensive and influential quality assessment tool26 for HMOs (indeed, for any health care sector). HEDIS measures are standardized and subject to external audit to verify the data collection and calculation process.22 Yet, the selective withdrawal by lower-scoring plans means that enrollees, purchasers, and the public often cannot monitor a plan's quality over time. Furthermore, it implies that the average quality of HMO care in the United States is unknowable. Average published HEDIS scores could improve even if the actual average quality were stable or even deteriorating. Hence, HEDIS scores cannot be used as an accurate barometer of HMOs' attainment of specific health goals for the nation.11,27,28

The variation in HEDIS scores that we observed between the highest- and lowest-scoring plans has substantial clinical relevance. For example, receiving a β-blocker after a myocardial infarction reduces the risk of cardiovascular death and nonfatal reinfarction by 22% and 27%, respectively.29,30 Yet in 1998, a patient surviving a myocardial infarction was only half as likely to receive this medication if enrolled in a health plan in the lowest compared with the highest tertile of quality.

Voluntary disclosure allows HMOs to use the HEDIS program as a marketing tool, sacrificing its value as a quality assessment and improvement tool. When scores are high, plans can disclose them and take advantage of consequent marketing benefits. When scores are low, plans can withdraw from public disclosure. Indeed, until recently, HMOs that refused to publicly disclose their quality scores were fully eligible for NCQA accreditation; only 4% of HMO applications for accreditation were rejected in 1998.31

Investor-owned plans were somewhat more likely to withdraw from public reporting. However, poor quality was associated with withdrawal from public disclosure among both investor-owned and not-for-profit plans. Apparently, the increasingly competitive health care marketplace drives health plans (irrespective of ownership status) to control data release to maximize competitive advantage.

Lack of disclosure is not the only challenge to the HMO quality oversight process. Patients appear to have difficulty understanding quality data32,33 and use them infrequently when selecting a health plan.34,35 Many employers offer only 1 health insurance option,36 foreclosing patient choice. Even large employers make only limited use of quality data,1,8 instead selecting health plans primarily on the basis of cost.1,37 Indeed, in the current health care market, evidence that public disclosure of quality data improves quality is equivocal.9,10,38,39 Therefore, improving accountability and encouraging quality improvement would require, at the very least, that quality data be presented in a patient-friendly format, that patients be offered a choice of health plans, and that both patients and large purchasers make purchasing decisions based on quality rather than price. Without publicly available data on quality, however, achieving these goals would accomplish little.

Our findings should also be viewed in the context of the broader debate on public disclosure of quality data by all types of health care provider organizations. Like HMOs, physicians and hospitals have opposed mandatory disclosure of performance data. Improvement of quality and accountability may ultimately depend on forthright disclosure of quality data at all levels of the health care system.

Our study has several limitations. First, since data on nondisclosing plans were, by definition, unavailable to us, we used a plan's performance 1 year earlier as a proxy for current performance. Plans know their current HEDIS scores before having to decide whether or not to disclose them. Thus, it seems likely that low-scoring plans that improved would choose to continue disclosing, while those that did not would be more likely to withdraw from disclosure. Hence, our study may underestimate the relationship between low scores and withdrawal from disclosure. Second, we cannot exclude the possibility that unmeasured plan characteristics such as geographic dispersal of medical provider sites or differences in data systems could systematically influence our results. Third, although our data show that lower-scoring plans are more likely to withdraw from disclosure, we have no direct data on HMO executives' reasoning regarding this decision. Last, only HMOs are currently eligible to submit HEDIS scores and receive NCQA accreditation. Whether the selective reporting of quality data we observed would apply to fee-for-service insurance or other facets of the health care system is unknown.

Few industries whose impact on health rivals HMOs' are as free of public oversight. Airlines and car manufacturers are required to disclose standardized data on the safety of their products. Our findings suggest that voluntary quality reporting by HMOs will not create the preconditions for effective quality oversight. Reporting and public disclosure of HEDIS and other meaningful quality data by HMOs should be mandatory.

References

Gabel JR, Hunt KA, Hurst K. When Employers Choose Health Plans: Do NCQA Accreditation and HEDIS Data Count? New York, NY: Commonwealth Fund; 1998.
California Office of Statewide Health Planning and Development.  Annual Report of the California Hospital Outcomes Project. Sacramento, Calif: Office of Statewide Health Planning and Development; 1993.
Princeton Survey Research Associates.  Americans as Health Care Consumers: The Role of Quality Information. Menlo Park, Calif: The Kaiser Family Foundation; 1996.
Miller T. Managed care regulation: in the laboratory of the states.  JAMA.1997;278:1102-1109.
Noble AA, Brennan TA. Stages of managed care regulation: developing better rules.  J Health Polit Policy Law.1999;24:1275-1305.
Final Report. Washington, DC: The President's Advisory Commission on Consumer Protection and Quality in the Health Care Industry; 1998.
Epstein AM. Public release of performance data: a progress report from the front.  JAMA.2000;283:1884-1886.
Hibbard JH, Jewett JJ, Legnini MW, Tusler M. Choosing a health plan: do large employers use the data?  Health Aff (Millwood).1997;16:172-180.
Mukamel DB, Mushlin AI. Quality of care information makes a difference: an analysis of market share and price changes after publication of the New York State Cardiac Surgery Mortality Reports.  Med Care.1998;36:945-954.
Hannan EL, Kilburn H, Racz M, Shields E, Chassin MR. Improving the outcomes of coronary artery bypass surgery in New York State.  JAMA.1994;271:761-766.
Harris JR, Caldwell B, Cahill K. Measuring the public's health in an era of accountability: lessons from HEDIS.  Am J Prev Med.1998;14:9-13.
Rosenbaum S. Negotiating the new health system: purchasing publicly accountable managed care.  Am J Prev Med.1998;14:67-71.
Longo DR, Land G, Schramm W, Fraas J, Hoskins B, Howel V. Consumer reports in health care: do they make a difference in patient care?  JAMA.1997;278:1579-1584.
Leatherman S, McCarthy D. Public disclosure of health care performance reports: experience, evidence and issues for policy.  Int J Qual Health Care.1999;11:93-105.
Farley DO, McGlynn EA, Klein D. Assessing Quality in Managed Care: Health Plans Reporting of HEDIS Performance Measures. New York, NY: The Commonwealth Fund; 1998.
NCQA's Quality Compass Data Base 1997. Washington, DC: National Committee for Quality Assurance; 1997.
NCQA's Quality Compass Data Base 1998. Washington, DC: National Committee for Quality Assurance; 1998.
NCQA's Quality Compass Data Base 1999. Washington, DC: National Committee for Quality Assurance; 1999.
The InterStudy Competitive Edge: HMO Directory 9.1. St Paul, Minn: InterStudy Publications; 1999.
The InterStudy Competitive Edge: HMO Directory 10.1. St Paul, Minn: InterStudy Publications; 2000.
SAS Software, Version 8.1. Cary, NC: SAS Institute; 2000.
Spoeri RK, Ullman R. Measuring and reporting managed care performance: lessons learned and new initiatives.  Ann Intern Med.1997;127(8, pt 2):726-732.
Roper WL, Cutler CM. Health plan accountability and reporting: issues and challenges.  Health Aff (Millwood).1998;17:152-155.
Eddy DM. Performance measurement: problems and solutions.  Health Aff (Millwood).1998;17:7-25.
Bodenheimer T, Casalino L. Executives with white coats—the work and world view of managed-care directors: the second of two parts.  N Engl J Med.1999;341:2029-2032.
Epstein AM. Performance reports on quality—prototypes, problems and prospects.  N Engl J Med.1995;333:57-61.
US Public Health Service.  Healthy People 2000: National Health Promotion and Disease Prevention Objectives. Washington, DC: US Dept of Health and Human Services; 1990.
Schneider EC, Riehl V, Courte-Wienecke S, Eddy DM, Sennett C. Enhancing performance measurement: NCQA's road map for a health information framework.  JAMA.1999;282:1184-1190.
Yusuf S, Wittes J, Friedman L. Overview of results of randomized clinical trials in heart disease, I: treatments following myocardial infarction.  JAMA.1988;260:2088-2093.
Yusuf S, Peto R, Lewis J, Collins R, Sleight P. Beta blockade during and after myocardial infarction: an overview of the randomized trials.  Prog Cardiovasc Dis.1985;27:335-371.
Bodenheimer T. The American health care system—the movement for improved quality in health care.  N Engl J Med.1999;340:488-492.
Hibbard JH, Jewett JJ. Will quality report cards help consumers?  Health Aff (Millwood).1997;16:218-228.
Hibbard JH, Sofaer S, Jewett JJ. Condition-specific performance information: assessing salience, comprehension, and approaches for communicating quality.  Health Care Financ Rev.1996;18:95-109.
Chernew M, Scanlon DP. Health plan report cards and insurance choice.  Inquiry.1998;35:9-22.
Tumlinson A, Bottigheimer H, Mahoney P, Stone EM, Hendricks A. Choosing a health plan: what information will consumers use?  Health Aff (Millwood).1997;16:229-238.
Gabel JR, Ginsburg PB, Hunt KA. Small employers and their health benefits, 1988-1996: an awkward adolescence.  Health Aff (Millwood).1997;16:103-110.
Sisk JE. Increasing competition and the quality of health care.  Milbank Q.1998;76:687-707.
Marshall MN, Shekelle PG, Leatherman S, Brook RH. Public release of performance data: what do we expect to gain? a review of the evidence.  JAMA.2000;283:1866-1874.
Schneider EC, Epstein AM. Use of public performance reports: a survey of patients undergoing cardiac surgery.  JAMA.1998;279:1638-1642.

Figures

Figure. Withdrawal From Public Disclosure of HEDIS Scores by HMOs From 1997 to 1999
Graphic Jump Location
The number of health maintenance organization (HMO) plans submitting Health Plan Employer Data and Information Set (HEDIS) data to the National Committee for Quality Assurance and allowing public disclosure for 1997, 1998, and 1999 are shown in boxes. Arrows depict changes in disclosure status of plans from 1997-1998 and from 1998-1999. Numbers indicate HMOs that withdrew from public disclosure of HEDIS scores either through submitting data to the National Committee for Quality Assurance, but declining public disclosure, or through a failure to submit data. Also depicted are changes in the number of plans publicly reporting data due to mergers of existing plans or the addition of newly participating plans.

Tables

Table 1. Characteristics of HMOs Allowing Public Disclosure of HEDIS Scores in 1997 and 1998*
Table 2. Mean HEDIS Scores Among Lowest- and Highest-Ranked HMOs Publicly Disclosing Scores in 1997 and 1998*
Table 3. Withdrawal From Public Disclosure of HEDIS Scores According to HMO Quality Rank Tertile 1 Year Earlier*
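To make the outcome measure concrete, the following is a minimal sketch of how a 1-year withdrawal rate by quality tertile (the quantity tabulated in Table 3) could be computed from plan-level data. The plan names, ranks, and withdrawal flags below are invented for illustration only; the actual analysis used NCQA Quality Compass data and adjusted for data-collection method and plan model type, which this sketch does not attempt.

```python
# Hypothetical illustration: 1-year withdrawal rates by HEDIS quality tertile.
# All plan data here are invented; real analyses used NCQA Quality Compass data.

def tertile(rank, n):
    """Map a plan's quality rank (0 = highest-ranked) among n plans to tertile 1-3."""
    return min(3, rank * 3 // n + 1)

# (plan label, quality rank in baseline year, withdrew from disclosure next year?)
plans = [
    ("A", 0, False), ("B", 1, False), ("C", 2, True),
    ("D", 3, False), ("E", 4, True), ("F", 5, True),
]

n = len(plans)
rates = {}
for t in (1, 2, 3):
    withdrew = [w for (_, rank, w) in plans if tertile(rank, n) == t]
    rates[t] = sum(withdrew) / len(withdrew)  # fraction of tertile that withdrew

for t, rate in sorted(rates.items()):
    print(f"Tertile {t}: withdrawal rate {rate:.0%}")
```

In this toy example the lowest tertile (tertile 3) has the highest withdrawal rate, which is the pattern the study tests for; the real comparison of highest versus lowest tertiles additionally required statistical adjustment.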

References

Gabel JR, Hunt KA, Hurst K. When Employers Choose Health Plans: Do NCQA Accreditation and HEDIS Data Count? New York, NY: Commonwealth Fund; 1998.
California Office of Statewide Health Planning and Development. Annual Report of the California Hospital Outcomes Project. Sacramento, Calif: Office of Statewide Health Planning and Development; 1993.
Princeton Survey Research Associates. Americans as Health Care Consumers: The Role of Quality Information. Menlo Park, Calif: The Kaiser Family Foundation; 1996.
Miller T. Managed care regulation: in the laboratory of the states. JAMA. 1997;278:1102-1109.
Noble AA, Brennan TA. Stages of managed care regulation: developing better rules. J Health Polit Policy Law. 1999;24:1275-1305.
Final Report. Washington, DC: The President's Advisory Commission on Consumer Protection and Quality in the Health Care Industry; 1998.
Epstein AM. Public release of performance data: a progress report from the front. JAMA. 2000;283:1884-1886.
Hibbard JH, Jewett JJ, Legnini MW, Tusler M. Choosing a health plan: do large employers use the data? Health Aff (Millwood). 1997;16:172-180.
Mukamel DB, Mushlin AI. Quality of care information makes a difference: an analysis of market share and price changes after publication of the New York State Cardiac Surgery Mortality Reports. Med Care. 1998;36:945-954.
Hannan EL, Kilburn H, Racz M, Shields E, Chassin MR. Improving the outcomes of coronary artery bypass surgery in New York State. JAMA. 1994;271:761-766.
Harris JR, Caldwell B, Cahill K. Measuring the public's health in an era of accountability: lessons from HEDIS. Am J Prev Med. 1998;14:9-13.
Rosenbaum S. Negotiating the new health system: purchasing publicly accountable managed care. Am J Prev Med. 1998;14:67-71.
Longo DR, Land G, Schramm W, Fraas J, Hoskins B, Howell V. Consumer reports in health care: do they make a difference in patient care? JAMA. 1997;278:1579-1584.
Leatherman S, McCarthy D. Public disclosure of health care performance reports: experience, evidence and issues for policy. Int J Qual Health Care. 1999;11:93-105.
Farley DO, McGlynn EA, Klein D. Assessing Quality in Managed Care: Health Plans' Reporting of HEDIS Performance Measures. New York, NY: The Commonwealth Fund; 1998.
NCQA's Quality Compass Data Base 1997. Washington, DC: The National Committee for Quality Assurance; 1997.
NCQA's Quality Compass Data Base 1998. Washington, DC: The National Committee for Quality Assurance; 1998.
NCQA's Quality Compass Data Base 1999. Washington, DC: The National Committee for Quality Assurance; 1999.
The InterStudy Competitive Edge: HMO Directory 9.1. St Paul, Minn: InterStudy Publications; 1999.
The InterStudy Competitive Edge: HMO Directory 10.1. St Paul, Minn: InterStudy Publications; 2000.
SAS Software Version 8.1. Cary, NC: SAS Institute; 2000.
Spoeri RK, Ullman R. Measuring and reporting managed care performance: lessons learned and new initiatives. Ann Intern Med. 1997;127(8, pt 2):726-732.
Roper WL, Cutler CM. Health plan accountability and reporting: issues and challenges. Health Aff (Millwood). 1998;17:152-155.
Eddy DM. Performance measurement: problems and solutions. Health Aff (Millwood). 1998;17:7-25.
Bodenheimer T, Casalino L. Executives with white coats—the work and world view of managed-care directors: the second of two parts. N Engl J Med. 1999;341:2029-2032.
Epstein AM. Performance reports on quality—prototypes, problems and prospects. N Engl J Med. 1995;333:57-61.
US Public Health Service. Healthy People 2000: National Health Promotion and Disease Prevention Objectives. Washington, DC: US Dept of Health and Human Services; 1990.
Schneider EC, Riehl V, Courte-Wienecke S, Eddy DM, Sennett C. Enhancing performance measurement: NCQA's road map for a health information framework. JAMA. 1999;282:1184-1190.
Yusuf S, Wittes J, Friedman L. Overview of results of randomized clinical trials in heart disease, I: treatments following myocardial infarction. JAMA. 1988;260:2088-2093.
Yusuf S, Peto R, Lewis J, Collins R, Sleight P. Beta blockade during and after myocardial infarction: an overview of the randomized trials. Prog Cardiovasc Dis. 1985;27:335-371.
Bodenheimer T. The American health care system—the movement for improved quality in health care. N Engl J Med. 1999;340:488-492.
Hibbard JH, Jewett JJ. Will quality report cards help consumers? Health Aff (Millwood). 1997;16:218-228.
Hibbard JH, Sofaer S, Jewett JJ. Condition-specific performance information: assessing salience, comprehension, and approaches for communication quality. Health Care Financ Rev. 1996;18:95-109.
Chernew M, Scanlon DP. Health plan report cards and insurance choice. Inquiry. 1998;35:9-22.
Tumlinson A, Bottigheimer H, Mahoney P, Stone EM, Hendricks A. Choosing a health plan: what information will consumers use? Health Aff (Millwood). 1997;16:229-238.
Gabel JR, Ginsburg PB, Hunt KA. Small employers and their health benefits, 1988-1996: an awkward adolescence. Health Aff (Millwood). 1997;16:103-110.
Sisk JE. Increasing competition and the quality of health care. Milbank Q. 1998;76:687-707.
Marshall MN, Shekelle PG, Leatherman S, Brook RH. Public release of performance data: what do we expect to gain? A review of the evidence. JAMA. 2000;283:1866-1874.
Schneider EC, Epstein AM. Use of public performance reports: a survey of patients undergoing cardiac surgery. JAMA. 1998;279:1638-1642.