Original Investigation

“America's Best Hospitals” in the Treatment of Acute Myocardial Infarction

Oliver J. Wang, MD; Yun Wang, PhD; Judith H. Lichtman, PhD, MPH; Elizabeth H. Bradley, PhD; Sharon-Lise T. Normand, PhD; Harlan M. Krumholz, MD, SM

Author Affiliations: Departments of Medicine (Drs O. J. Wang, Bradley, and Krumholz) and Epidemiology and Public Health (Drs Lichtman, Bradley, and Krumholz) and the Robert Wood Johnson Clinical Scholars Program (Drs Bradley and Krumholz), Yale University School of Medicine, New Haven, Connecticut; Center for Outcomes Research and Evaluation, Yale–New Haven Hospital, New Haven (Drs Y. Wang and Krumholz); and Department of Health Care Policy, Harvard Medical School, and Department of Biostatistics, Harvard School of Public Health, Boston, Massachusetts (Dr Normand).


Arch Intern Med. 2007;167(13):1345-1351. doi:10.1001/archinte.167.13.1345.

Background  The ranking of “America's Best Hospitals” by U.S. News & World Report for “Heart and Heart Surgery” is a popular hospital profiling system, but it is not known whether ranked hospitals have lower risk-standardized 30-day mortality rates (RSMRs) for patients with acute myocardial infarction (AMI) than nonranked hospitals.

Methods  Using a hierarchical regression model based on 2003 Medicare administrative data, we calculated RSMRs for ranked and nonranked hospitals in the treatment of AMI. We identified ranked and nonranked hospitals with standardized mortality ratios (SMRs) significantly less than the mean expected for all hospitals in the study.

Results  We compared 13 662 patients in 50 ranked hospitals with 254 907 patients in 3813 nonranked hospitals. The RSMRs were lower in ranked vs nonranked hospitals (16.0% vs 17.9%, P<.001). The RSMR range for ranked vs nonranked hospitals overlapped (11.4%-20.0% vs 13.1%-23.3%, respectively). In an RSMR quartile distribution of all hospitals, 35 ranked hospitals (70%) were in the lowest RSMR or best performing quartile, 11 (22%) were in the middle 2 quartiles, and 4 (8%) were in the highest RSMR or worst performing quartile. There were 11 ranked hospitals (22%) and 28 nonranked hospitals (0.73%) that each had an SMR significantly less than 1 (defined by a 95% confidence interval with an upper limit of <1.0).

Conclusions  On average, admission to a ranked hospital for AMI was associated with a lower risk of 30-day mortality, although about one-third of the ranked hospitals fell outside the best performing quartile based on RSMR. Although ranked hospitals were much more likely to have an SMR significantly less than 1, many more nonranked hospitals had this distinction.


Among the increasing number of academic, industry, and governmental profiling systems that evaluate and compare hospitals, U.S. News & World Report’s annual issue of “America's Best Hospitals” for specialty and overall care is one of the best known. The 50 ranked institutions for “Heart and Heart Surgery” are identified based on 3 equally weighted measures: in-hospital mortality rates for a wide range of cardiovascular conditions, reputation among surveyed cardiologists, and hospital infrastructure. The rankings' widespread influence is evidenced by a circulation that exceeds 2 million copies1 and more than 200 000 Google search results for Web sites containing information on or references to designated hospitals. General awareness is also aided in part by hospitals that frequently highlight ranking information in consumer advertising.2

Despite their prominent role in the public arena, the ability of the U.S. News & World Report rankings to identify hospitals with excellent survival rates for common cardiovascular conditions is not known. The ultimate results of a hospitalization are what matter most to patients. As such, mortality rates for routine, high-risk conditions represent a meaningful measure of hospital performance that complements information obtained from current process measures.3,4 In fact, the assessment of the process of care and infrastructure is relevant only to the extent that it relates to outcomes.5 Thus, a comprehensive hospital profiling system should classify each institution by the results achieved. Although the U.S. News & World Report rankings are partly based on mortality measures, to our knowledge, validation of the magazine's approach has not been published in the peer-reviewed literature. In addition, that approach does not comply with standards set by the American Heart Association.6 A newly validated mortality risk model, endorsed by the National Quality Forum,3 provides the opportunity to profile hospitals on risk-standardized 30-day mortality rates (RSMRs) after an acute myocardial infarction (AMI).7 An AMI is an ideal condition with which to evaluate hospital performance for any institution that provides cardiovascular care because it is a common and life-threatening condition with guideline-based interventions that improve survival rates. Therefore, performance in the care of patients with AMI should be excellent in any hospital that is rated as a top cardiac center.

For this study, we employed this validated model to determine whether hospitals ranked by U.S. News & World Report have a lower mortality rate in the treatment of AMI than nonranked hospitals (hospitals not included in the U.S. News & World Report list for 20038). That is, does each U.S. News & World Report–ranked hospital perform better than nonranked hospitals on a validated mortality measure? Similarly, are there nonranked hospitals that perform well compared with ranked hospitals?

HOSPITAL SAMPLE AND CHARACTERISTICS

U.S. News & World Report’s 2003 “America's Best Hospitals” list8 for “Heart and Heart Surgery” was published in the magazine's 2003 theme issue. As detailed in the description of its methods,9 hospitals were identified from a nationwide sample meeting at least 1 of the following criteria: (1) Council of Teaching Hospitals membership, (2) medical school affiliation, or (3) availability of at least 9 of 19 possible hospital-wide technologies or services. In addition, hospitals were required to have a minimum of 500 surgical and 785 medical cardiovascular discharges during the period. Each of the 770 candidate hospitals was assigned a composite score based on 3 equally weighted measures: compiled surveys of a nationwide sample of cardiologists who cited the top 5 institutions for care, inpatient mortality data for cardiovascular diagnoses from Medicare Provider Analysis and Review files,10 and an infrastructure score. The infrastructure score combined 5 measures: nurse-bed ratios, patient discharges, palliative or hospice care services, level 1 or level 2 trauma services, and a technology index score that itself comprises 9 measures, including facilities for angioplasty, cardiac catheterization, and open heart surgery, as well as radiologic tools such as magnetic resonance imaging.
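The exact scoring formula is proprietary, but the equal weighting described above implies a composite of the following general form; the symbols R_h (reputation), M_h (mortality), and I_h (infrastructure) for hospital h are our own illustrative notation, not the magazine's:

$$S_h = \tfrac{1}{3}\,(R_h + M_h + I_h),$$

where each component would presumably be standardized (for example, to a z score) before averaging so that no single measure dominates by scale alone.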

For our study, we classified all institutions in U.S. News & World Report’s 2003 “Heart and Heart Surgery” list as ranked hospitals. We used the 2003 ranking8 because it corresponded to the most recent period for which Medicare data were available. Each ranked institution was identified by a single Medicare provider identification number; if more than 1 number was associated with a ranked institution, we chose the number that represented the majority of the institution's total patient beds and annual revenue. We compared ranked hospitals with a cohort of nonranked hospitals for which Medicare data were available during 2003. Hospital characteristics collected included status as a Council of Teaching Hospitals member or affiliation with a medical school and on-site facilities for coronary artery bypass graft (CABG) surgery. This information was obtained from the American Hospital Association database.11 A subset analysis compared ranked hospitals with nonranked hospitals that had either (1) Council of Teaching Hospitals membership or affiliation with a medical school or (2) on-site facilities for CABG.

PATIENT SAMPLE AND CHARACTERISTICS

The sample included Medicare patients older than 65 years with a principal discharge diagnosis of AMI from January 1, 2003, to December 31, 2003 (codes 410.00-410.91, excluding 410.02; International Classification of Diseases, Ninth Revision, Clinical Modification12). Data were obtained from Medicare Provider Analysis and Review files10 that included patient variable information (Table 1), principal discharge and secondary diagnosis codes, and procedure codes for each hospitalization for Medicare patients enrolled in the fee-for-service plan.7
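As a concrete illustration of the cohort selection, a minimal pandas sketch follows; the file and column names (medpar_2003.csv, dx_principal) are hypothetical stand-ins, not actual MEDPAR field names:

```python
import pandas as pd

# Hypothetical claims extract; ICD-9-CM codes are stored without the
# decimal point (e.g., "41001" for 410.01).
claims = pd.read_csv("medpar_2003.csv", dtype={"dx_principal": str})

# Codes 410.00 through 410.91, excluding 410.02, per the study definition.
all_410 = [f"410{i}{j}" for i in range(10) for j in range(3)]  # 41000-41092
ami_codes = {c for c in all_410 if "41000" <= c <= "41091" and c != "41002"}

ami = claims[claims["dx_principal"].isin(ami_codes)]
```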

Table 1. Baseline Patient Characteristics

To measure patient variables, additional data were obtained from diagnosis codes recorded in Medicare Part A and Part B data during the 12 months before the index admission. Patients transferred from one facility to another were included if their principal discharge diagnosis was AMI at both hospitals. For these patients, both hospitalizations were linked into a single episode of care, with outcomes attributed to the first hospital and comorbid conditions selected from the first hospitalization only, to avoid misclassifying complications as comorbid conditions during the linked episode.
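The linking rule can be sketched as follows, continuing the example above; bene_id and admit_date are hypothetical column names, and the simplification of at most one episode per beneficiary is ours (the study's actual linkage logic is more involved):

```python
# Sort each beneficiary's AMI admissions chronologically and collapse them
# into a single episode of care. The first row per beneficiary carries the
# admitting hospital's provider number, so outcomes are attributed to that
# hospital and comorbidities are drawn from that first hospitalization only.
ami = ami.sort_values(["bene_id", "admit_date"])
episodes = ami.groupby("bene_id", as_index=False).first()
```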

Patients discharged alive within 1 day of admission, other than against medical advice, were excluded because such patients were unlikely to have experienced an AMI. Because the regression model incorporates utilization information from the 12 months before the index hospitalization, only patients enrolled in the Medicare fee-for-service plan for at least 1 year were included.
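Continuing the sketch, the exclusions might be applied as follows, again with hypothetical column names (discharge_date, discharged_alive, against_medical_advice, ffs_months_prior):

```python
# Compute length of stay in whole days.
episodes["admit_date"] = pd.to_datetime(episodes["admit_date"])
episodes["discharge_date"] = pd.to_datetime(episodes["discharge_date"])
los_days = (episodes["discharge_date"] - episodes["admit_date"]).dt.days

# Drop live, non-AMA discharges with a stay under 1 day, and require
# 12 months of fee-for-service enrollment before the index admission.
early_live_discharge = (
    episodes["discharged_alive"]
    & ~episodes["against_medical_advice"]
    & (los_days < 1)
)
cohort = episodes[~early_live_discharge & (episodes["ffs_months_prior"] >= 12)]
```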

OUTCOME AND RISK-ADJUSTMENT VARIABLES

The primary outcome was the hospital-specific RSMR, based on death from any cause within 30 days of the index admission date among patients with a principal discharge diagnosis of AMI. We obtained mortality information from Medicare Enrollment files by linking unique patient identifiers. The patient variables used in risk adjustment included demographic, cardiovascular, and comorbidity variables (Table 1).

STATISTICAL ANALYSIS

We compared baseline characteristics of patients by hospital type using the χ2 test for categorical variables and analysis of variance for continuous variables. Observed in-hospital and 30-day mortality rates for patients with AMI were calculated and compared between the 2 hospital groups. We used hierarchical generalized linear models, accounting for patient characteristics and within- and between-hospital variances,13 to calculate a hospital-specific RSMR for each hospital; the RSMR essentially transforms a hospital's mortality ratio into a rate.
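The cited model7,13 is a random-intercept (hierarchical) logistic regression; in our notation, for patient j in hospital i, with Y_ij = 1 indicating death within 30 days and x_ij the risk-adjustment variables,

$$\operatorname{logit}\Pr(Y_{ij}=1) = \alpha_i + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij}, \qquad \alpha_i \sim N(\mu, \tau^2).$$

The RSMR for hospital i is then the ratio of "predicted" deaths (computed with the hospital's own intercept) to "expected" deaths (computed with the population mean intercept), scaled by the overall unadjusted 30-day mortality rate:

$$\mathrm{RSMR}_i = \frac{\sum_j \hat{p}_{ij}(\hat{\alpha}_i)}{\sum_j \hat{p}_{ij}(\hat{\mu})} \times \bar{y}.$$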

We calculated standardized mortality ratios (SMRs) for ranked hospitals. An SMR greater than 1 indicates that a hospital had higher than expected mortality; an SMR less than 1 indicates lower than expected mortality. For each hospital, we used bootstrapping to generate 1000 iterations of the hierarchical generalized linear model and thereby a 95% confidence interval (CI) for the SMR. We identified hospitals with an SMR significantly less than 1.0 (defined by a 95% CI with an upper limit less than 1.0) and then cross-classified hospitals by this variable and by whether U.S. News & World Report ranked the hospital. We then calculated the relative risk of having a significantly low SMR for nonranked vs ranked hospitals. In a subgroup analysis, we restricted the comparison to nonranked hospitals that were either teaching hospitals or had on-site CABG facilities and recalculated the relative risk for this subset vs ranked hospitals.
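The study generated each CI by refitting the hierarchical model in all 1000 bootstrap iterations. A minimal Python sketch of the flagging logic, simplified to resample patients within a single hospital given precomputed expected death probabilities (function and variable names are ours):

```python
import numpy as np

rng = np.random.default_rng(0)

def smr_ci(death_flags, expected_probs, n_boot=1000, alpha=0.05):
    """Percentile bootstrap CI for one hospital's SMR (observed/expected).

    death_flags: 0/1 per patient (1 = death within 30 days).
    expected_probs: model-based expected death probability per patient.
    """
    death_flags = np.asarray(death_flags, dtype=float)
    expected_probs = np.asarray(expected_probs, dtype=float)
    n = len(death_flags)
    idx = rng.integers(0, n, size=(n_boot, n))   # resample patients with replacement
    smrs = death_flags[idx].sum(axis=1) / expected_probs[idx].sum(axis=1)
    lo, hi = np.quantile(smrs, [alpha / 2, 1 - alpha / 2])
    return lo, hi

def significantly_low(death_flags, expected_probs):
    # Flag a hospital whose 95% CI upper limit falls below 1.0.
    _, hi = smr_ci(death_flags, expected_probs)
    return hi < 1.0
```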

To compare individual hospital performance between ranked and nonranked hospitals nationwide, we categorized each hospital based on RSMR into 1 of 3 groups (an RSMR lower than the 25th percentile, an RSMR in the 25th to 75th percentiles, or an RSMR higher than the 75th percentile). We calculated the numbers of ranked and nonranked hospitals within the 3 categories of hospitals. We conducted the analyses using SAS statistical software (version 8.02; SAS Institute Inc, Cary, North Carolina).
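The published analyses were run in SAS; purely as an illustration, the percentile grouping could be expressed in pandas as follows, with synthetic demo data standing in for the study's hospital-level results:

```python
import numpy as np
import pandas as pd

# Hypothetical demo inputs: one RSMR per hospital and a ranked-status flag.
rng = np.random.default_rng(0)
rsmr = pd.Series(rng.normal(17.9, 1.5, size=3863))
is_ranked = pd.Series(np.arange(3863) < 50)

# Categorize each hospital by its position in the national RSMR distribution.
q25, q75 = rsmr.quantile([0.25, 0.75])
category = pd.cut(
    rsmr,
    bins=[float("-inf"), q25, q75, float("inf")],
    labels=["<25th percentile", "25th-75th percentile", ">75th percentile"],
)
table = pd.crosstab(is_ranked, category)  # hospital counts per cell
```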

CHARACTERISTICS OF PATIENTS AND HOSPITALS

A total of 268 569 patients were treated in the 3863 hospitals studied. The 13 662 patients admitted to the 50 ranked institutions were similar to the 254 907 patients admitted to the 3813 nonranked hospitals in age, male-female ratio, and the prevalence of many comorbidities (Table 1). Patients admitted to ranked hospitals were more likely to have a secondary diagnosis of hypertension (60.4% vs 49.8%, P<.001), unstable angina (17.3% vs 12.4%, P<.001), and AMI (17.0% vs 14.7%, P<.001). Patients admitted to nonranked hospitals were more likely to have a history of CABG (7.1% vs 5.8%, P<.001).

All ranked institutions were, by definition, affiliated with teaching institutions and had on-site facilities for CABG. Among the nonranked hospitals, 22.7% were affiliated with teaching institutions and 24.1% had on-site facilities for CABG. The mean (SD) volume of patients treated at ranked hospitals was larger than that for nonranked hospitals (273.2 [24.2] vs 66.9 [1.3]).

MORTALITY

The mean observed in-hospital mortality rate was significantly lower in ranked vs nonranked hospitals (10.6% vs 12.2%, P<.001). Similarly, 30-day mortality rates were significantly lower in ranked vs nonranked hospitals (14.4% vs 18.0%, P<.001) (Table 1). After adjustment for patient characteristics, mean RSMRs (Table 2) were significantly lower in ranked vs nonranked hospitals (16.0% vs 17.9%, P<.001). However, the RSMR varied widely for both groups, ranging from 11.4% to 20.0% for ranked hospitals vs 13.1% to 23.3% for nonranked hospitals. For ranked hospitals, the 25th to 75th percentile RSMR range was 17.2% to 18.6%. For nonranked hospitals, the 25th to 75th percentile range was 17.0% to 18.7%. The distributions are represented in a side-by-side comparison, with the nonranked hospital range expressed by deciles (Figure 1). Of the 3813 nonranked hospitals in our study, 1260 (33.0%) had on-site facilities for CABG and/or were designated as a teaching hospital. The mean (SD) RSMR for these nonranked teaching or CABG hospitals was 17.6% (1.7%) vs 16.0% (1.9%) for ranked hospitals (P<.001).

Figure 1. Decile distribution of nonranked hospitals vs ranked hospitals by risk-standardized mortality rate. The horizontal bar within each box indicates the decile mean, the lower and upper box edges mark the 25th and 75th percentiles, and the error bars span the entire range except where black dots represent outliers.
Table 2. Risk-Standardized Mortality Rates (RSMRs) of Ranked and Nonranked Hospitals by Percentile Groups

When all ranked and nonranked hospitals were stratified into quartiles, 35 (70%) of the ranked hospitals were in the lowest quartile, 11 (22%) were in the middle 2 quartiles, and 4 (8%) were in the highest quartile (Table 2). The SMRs for ranked hospitals revealed 11 institutions (22%) with SMRs significantly lower than the average expected mortality for all nonranked and ranked hospitals (defined by a 95% CI with an upper limit excluding 1.0). Of the remaining ranked hospitals, 29 had an SMR less than 1.0 and 10 had an SMR greater than 1.0, but the 95% CIs for these comparisons included 1.0. The mean (SE) SMR for ranked hospitals was 0.90 (0.11), with a range of 0.64 to 1.13 (Figure 2). In comparison, 28 nonranked hospitals (0.73%) had an SMR significantly less than 1.0. The resulting relative risk of having an SMR significantly less than 1.0 for nonranked vs ranked hospitals was 0.04 (95% CI, 0.02-0.07). In the subset of nonranked teaching and CABG hospitals, 24 (1.9%) had an SMR significantly less than 1.0, for a relative risk of 0.1 (95% CI, 0.05-1.73) vs ranked hospitals.
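As an arithmetic check, the subset relative risk can be reproduced directly from the counts reported above (the CI itself comes from the study's bootstrap procedure):

$$\mathrm{RR} = \frac{24/1260}{11/50} = \frac{0.019}{0.220} \approx 0.09,$$

which rounds to the reported 0.1.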

Figure 2. Standardized mortality ratios of ranked hospitals with 95% confidence intervals. The individual hospital identification numbers bear no relationship to the numeric U.S. News & World Report ranking. The black diamonds indicate mean standardized mortality ratios, with error bars defining the 95% confidence intervals.

COMMENT

As a group, the U.S. News & World Report top heart hospitals had significantly lower mortality rates for the care of patients with an AMI (16.0% vs 17.9%, P<.001). In addition, when individual hospital performance was measured, ranked hospitals were much more likely than nonranked hospitals to have an SMR significantly less than 1.0 (defined by a 95% CI with an upper limit excluding 1.0). However, nonranked hospitals with SMRs significantly less than 1.0 outnumbered ranked hospitals with similar performance by nearly 3 to 1. As a result, the U.S. News & World Report ranking list omits many hospitals with outstanding performance in the care of patients with AMI.

The methods employed by U.S. News & World Report may explain why the rankings do not discriminate more accurately on AMI mortality rates. The reputation component, which accounts for one-third of the overall ranking score, is based on cardiologists' opinions of hospitals that provide the best treatment for the most serious and difficult medical conditions.9 Citations by cardiologists likely favor tertiary centers with strong subspecialty care for the most critically ill patients while not necessarily reflecting the perceived care for the overwhelming majority of admissions for more common diagnoses, which have a more substantial impact on overall hospital outcomes. The 10 hospitals ranked highest by U.S. News & World Report accounted for more than 75% of the aggregate reputation points for all ranked hospitals. The distribution of reputation points among these very highest ranked hospitals is relatively consistent from year to year and reflects an enduring perception among surveyed physicians of superior cardiac care at these institutions.14 Given the disproportionate influence of reputation, even relatively large mortality differences in this higher tier of ranked hospitals may be a less important factor in determining the overall ranking. In contrast, the lack of high reputation scores in the lower tier of ranked hospitals likely results in increased annual turnover and instability of ranking position from year to year.

Notwithstanding the effect of reputation on the U.S. News & World Report rankings, challenges remain in validating the remaining measures used to evaluate hospitals. The proprietary U.S. News & World Report mortality model used in the rankings incorporates in-hospital mortality rates for a wide range of cardiovascular diagnoses. Unlike the 30-day time frame used in our study, in-hospital mortality does not reflect a standardized period of treatment, nor does it account for patients who are transferred to another facility or discharged prematurely. The cardiovascular hospital infrastructure score employs factors, such as the presence of hospice facilities or imaging technology, that may correlate little with outcomes in cardiac care. Finally, the composite score used to combine multiple hospital characteristics, while simplifying a vast amount of information, yields rankings that vary with the subjective weighting of each factor.15 As a result, the inability of the U.S. News & World Report rankings to consistently classify hospitals based on AMI mortality rates is not entirely unexpected and is consistent with the analysis of another hospital profiling system.16

The 50 hospitals ranked by U.S. News & World Report in 2003 represent slightly more than 1% of our nationwide cohort and are distributed across 18 states and the District of Columbia, with 40% located in North Carolina, Ohio, Pennsylvania, and California. In contrast, nonranked hospitals in the decile with the lowest RSMRs were distributed among 46 states. Because their RSMRs are comparable or superior to those of ranked institutions, increasing awareness of these nonranked institutions could expand access to regional centers of excellence (Figure 3).

Figure 3. Geographic distribution of all ranked hospitals and the top decile of nonranked hospitals by risk-standardized mortality rate. Not all nonranked institutions were represented in areas of high concentration. Maps courtesy of http://www.theodora.com; used with permission.

There are several issues to consider in the interpretation of this study. First, our risk-standardized model focuses on AMI mortality, in contrast to the U.S. News & World Report rankings' composite mortality rate for a wide range of cardiovascular diagnoses and cardiothoracic procedures. We focused on AMI treatment because of its prevalence and the potential for improved outcomes with accepted interventions. However, certain ranked hospitals may perform disproportionately better or worse on measures outside of AMI care, which would not be reflected in our mortality model. Although it is the only mortality measure currently endorsed by the National Quality Forum,3 the 30-day mortality rate for AMI treatment does not capture all aspects of hospital performance for patients with cardiac disease. The advantage of 30-day mortality over in-hospital mortality is that it uses a standardized period of assessment, avoiding differences based on variations in practices related to length of stay. Longer time periods may provide an interesting perspective, but the longer the follow-up, the more likely that factors unrelated to the hospitalization affect the outcome. Although other factors should be included in a more comprehensive assessment of hospital quality, 30-day mortality can be considered a minimum standard for identifying superior cardiac care. The U.S. News & World Report rankings also take into account structural or nonclinical measures, such as the presence of hospice facilities and imaging technology.

Second, because each ranked institution was identified by a single Medicare provider number, the nonranked cohort may include a small number of hospitals affiliated with ranked institutions. In addition, U.S. News & World Report applies inclusion criteria that restrict its candidate list to larger teaching hospitals, so our nonranked cohort may include some hospitals not evaluated by its methods.

Third, our findings are based on data from Medicare patients 65 years or older and may not be relevant to the care provided to younger patients in these hospitals. Nevertheless, Medicare patients typically constitute most patients admitted to the hospital with AMI. Finally, our RSMR model employs Medicare administrative data that may not reflect all of the factors contributing to the variation in mortality rates. However, the model demonstrates acceptable validity when compared with a chart-based model and has the attributes recently recommended by an expert panel.6

Our study evaluated the performance of hospitals ranked by U.S. News & World Report against a large nationwide sample of institutions using a validated RSMR model. The U.S. News & World Report ranking, which includes many of the nation's most prestigious hospitals, did identify a group of hospitals that was much more likely than nonranked hospitals to have superb performance on 30-day mortality after AMI. However, our study also revealed that not all ranked hospitals had outstanding performance and that many nonranked hospitals performed well. Consequently, although the U.S. News & World Report rankings provide some guidance about performance on outcomes, they fall short of identifying all the top hospitals with respect to 30-day survival after admission for AMI and include a few hospitals that are actually in the lowest quartile of performance.

Correspondence: Harlan M. Krumholz, MD, SM, Yale University School of Medicine, 333 Cedar St, PO Box 208088, New Haven, CT 06520-8088 (harlan.krumholz@yale.edu).

Accepted for Publication: February 2, 2007.

Author Contributions: Dr O. J. Wang had full access to all the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis. Study concept and design: O. J. Wang, Normand, and Krumholz. Acquisition of data: O. J. Wang, Y. Wang, and Krumholz. Analysis and interpretation of data: O. J. Wang, Y. Wang, Lichtman, Bradley, Normand, and Krumholz. Drafting of the manuscript: O. J. Wang. Critical revision of the manuscript for important intellectual content: O. J. Wang, Y. Wang, Lichtman, Bradley, Normand, and Krumholz. Statistical analysis: Y. Wang and Normand. Obtained funding: Krumholz. Administrative, technical, and material support: O. J. Wang and Normand. Study supervision: Krumholz.

Financial Disclosure: None reported.

Funding/Support: Dr Lichtman is supported by grant No. 1 K01 DP000085-03 from the Centers for Disease Control and Prevention (CDC). Dr Bradley is supported by a Patrick and Catherine Weldon Donaghue Medical Research Foundation Investigator Award. The analyses upon which this publication is based were performed under Contract Number HHSM-500-2005-CO001C, titled “Utilization and Quality Control Quality Improvement Organization for the State (commonwealth) of Colorado,” sponsored by the Centers for Medicare & Medicaid Services, an agency of the U.S. Department of Health and Human Services.

Role of the Sponsor: The Patrick and Catherine Weldon Donaghue Medical Research Foundation had no role in the design or conduct of the study; collection, management, analysis, or interpretation of the data; or preparation, review, or approval of the manuscript.

Disclaimer: The content of this publication does not necessarily reflect the views or policies of the Department of Health and Human Services, nor does mention of trade names, commercial products, or organizations imply endorsement by the US government. The authors assume full responsibility for the accuracy and completeness of the ideas presented. The contents of this article are solely the responsibility of the authors and do not necessarily represent the official views of the CDC.

References

1. Audit Bureau of Circulations. Circulation averages for the six months ended 12/31/2005. http://abcas3.accessabc.com/ecirc/magtitlesearch.asp. Accessed August 1, 2006.
2. Larson RJ, Schwartz LM, Woloshin S, Welch HG. Advertising by academic medical centers. Arch Intern Med. 2005;165(6):645-651.
3. Bradley EH, Herrin J, Elbel B, et al. Hospital quality for acute myocardial infarction: correlation among process measures and relationship with short-term mortality. JAMA. 2006;296(1):72-78.
4. Jha AK. Measuring hospital quality: what physicians do? how patients fare? or both? JAMA. 2006;296(1):95-97.
5. Donabedian A. Evaluating the quality of medical care. Milbank Mem Fund Q. 1966;44(3)(suppl):166-203.
6. Krumholz HM, Brindis RG, Brush JE, et al. Standards for statistical models used for public reporting of health outcomes: an American Heart Association Scientific Statement from the Quality of Care and Outcomes Research Interdisciplinary Writing Group: cosponsored by the Council on Epidemiology and Prevention and the Stroke Council: endorsed by the American College of Cardiology Foundation. Circulation. 2006;113(3):456-462.
7. Krumholz HM, Wang Y, Mattera JA, et al. An administrative claims model suitable for profiling hospital performance based on 30-day mortality rates among patients with an acute myocardial infarction. Circulation. 2006;113(13):1683-1692.
8. America's Best Hospitals. U.S. News & World Report. 2003;135(3, theme issue):102.
9. O'Muircheartaigh CO, Burke A, Murphy W. The 2003 index of hospital quality. http://www.usnews.com/usnews/health/best-hospitals/methodology/ABH_Methodology_2003.pdf. Accessed August 1, 2006.
10. Centers for Medicare & Medicaid Services. MEDPAR Limited Data Set (LDS)–Hospital (National). http://www.cms.hhs.gov/LimitedDataSets/02_MEDPARLDSHospitalNational.asp. Accessed February 2006.
11. American Hospital Association. The AHA Annual Survey Database: Fiscal Year 2003 Documentation. Chicago, IL: Health Forum, American Hospital Association; 2005.
12. US Department of Health and Human Services. International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM). Hyattsville, MD: National Center for Health Statistics; 1979.
13. Normand SL, Zou KH. Sample size considerations in observational health care quality studies. Stat Med. 2002;21(3):331-345.
14. Green J, Wintfeld N, Krasner M, Wells C. In search of America's best hospitals: the promise and reality of quality assessment. JAMA. 1997;277(14):1152-1155.
15. Jacobs R, Goddard M, Smith PC. How robust are hospital ranks based on composite performance measures? Med Care. 2005;43(12):1177-1184.
16. Krumholz HM, Rathore SS, Chen J, Wang Y, Radford MJ. Evaluation of a consumer-oriented internet health care report card: the risk of quality ratings based on mortality data. JAMA. 2002;287(10):1277-1287.

