In 2001, the Institute of Medicine (IOM) released “Crossing the Quality Chasm: A New Health System for the 21st Century,” a report exposing poor-quality health care. In response to mounting public demand for freely available hospital quality information, Congress enacted the 2003 Medicare Modernization Act, which contains provisions that press hospitals to report clinical care performance data. In 2003, the Centers for Medicare and Medicaid Services (CMS) and The Joint Commission (TJC) aligned common measures, which are published in the Specifications Manual for National Hospital Quality Measures (see Tables 1 & 2). Individual hospital data are posted publicly on Hospital Compare, a consumer-oriented web site maintained by the U.S. Department of Health and Human Services (DHHS). The program is endorsed by the National Quality Forum (NQF) and the Hospital Quality Alliance (HQA), a public-private collaborative of healthcare-related organizations. CMS will advertise Hospital Compare in scores of media markets this year.
Quality of care became a hot-button issue for hospital administrators once health insurers and the federal government began linking quality measures to payment. Over 4,000 hospitals now “voluntarily” participate in Hospital Compare because, since 2007, non-participating hospitals have been penalized with a 2% reduction in Medicare payments. In addition, CMS continues to expand its list of no-pay hospital conditions (e.g., nosocomial infections and other complications), a tactic frequently replicated by private insurers.
Hospital Compare attempts to formulate evidence-based, meaningful, and quantifiable quality-of-care measures. While many are reasonable and represent the standard of care for ED patients, some are scientifically conflicted at best and likely just plain wrong. This may explain, in part, the surprising variation of measures across hospitals, such that reputable institutions are labeled as poor performers. Naivety in wording a medical record to produce a favorable result is hardly evidence of improper care. For instance, it is far more plausible that an emergency physician would neglect to document that a cardiac patient self-administered an aspirin prior to arrival than that the physician is unacquainted with the fact that aspirin improves AMI outcomes.
Specific to measure AMI-6, there is strong evidence that a subset of AMI patients do worse when given a beta-blocker on arrival. The COMMIT/CCS-2 trial evaluated the use of metoprolol for STEMI in China and was the second-largest AMI trial ever conducted. The findings suggest that beta-blockers are ill-advised in patients with a heart rate <60 bpm, low systolic blood pressure, a PR interval >0.24 seconds, second- or third-degree heart block, or active asthma/reactive airway disease. As a result, CMS, TJC, and the Agency for Healthcare Research and Quality (AHRQ) made it clear that beta-blocker use must be based upon clinical judgment and that documenting a reason for not prescribing is acceptable.

In a University of California study of patients meeting TJC/CMS eligibility criteria for PN-5b, one-third failed to receive antibiotics within 4 hours of arrival, and the majority of these cases did not have a final ED diagnosis of pneumonia. By comparison, of the two-thirds that met the 4-hour benchmark, 95% had pneumonia correctly diagnosed in the ED. The authors claim this measure may promote unnecessary antibiotic use and contribute to the growing problem of antibiotic resistance; they recommend the TJC/CMS criteria be adjusted so that only patients with a final ED diagnosis of pneumonia are included.

The AHRQ believes that obtaining blood cultures before administering antibiotics improves survival in hospitalized pneumonia patients. In actuality, the evidence supporting PN-3b is confined to the critically ill and the elderly. Blood cultures identify an organism in only about 8% of patients, and since most isolates are pneumococcal, management is rarely influenced (i.e., antibiotics are seldom changed for resistant or atypical organisms). Furthermore, unnecessary and falsely positive blood cultures may provoke inappropriate hospitalizations and, paradoxically, iatrogenic complications.
Related to measures PN-6, PN-6a, and PN-6b, it is common for an emergency physician, often in consultation with the primary care physician, pulmonologist, or infectious disease specialist, to tailor the antibiotic regimen to the individual patient instead of adopting a “cookie cutter” approach. The pressure to meet certain Hospital Compare goals may perversely lead to suboptimal care. Hospital leaders may pressure emergency physicians to give antibiotics at triage to every patient with a cough and fever, a practice that leads to adverse drug reactions and antibiotic resistance. Mortality from cardiogenic shock may increase as AMI patients are uniformly given beta-blockers. Moreover, if the emergency team is being “graded” on certain patients, time and resources may be diverted from other serious cases. Indeed, a Michigan study demonstrated poor concordance between hospitals listed as best performers and their Hospital Compare scores.

There are other potential pitfalls accompanying the intense public scrutiny of hospitals provoked by the growing awareness of Hospital Compare. Many of the measures relate to processes that ought to happen at some point during a patient’s hospitalization, without reference to timing. To improve performance, some hospital administrators may insist that the ED become the preferred setting for non-urgent Hospital Compare processes (e.g., initiation of ACE inhibitors, smoking cessation counseling, influenza/pneumococcal vaccination, or pre-surgery DVT prophylaxis). This may be especially tempting when private physicians (often with other hospital affiliations) are indifferent toward compliance with these measures. While benchmarking emergency departments is an effective way to drive quality improvement, some of the performance measures in Hospital Compare will paradoxically trigger unforeseen and negative consequences.
Until the Hospital Compare rating scales are optimized, any exaggerated attention to a limited group of diagnoses may compromise the care of other emergent patients in the ED. CMS, TJC, and HQA should work with established EM professional organizations (e.g., AAEM, ACEP, ACOEP, and SAEM) to develop meaningful benchmarks that truly promote excellence in emergency medicine.

Mark Reiter, MD, MBA, is CEO of Emergency Excellence (EmEx) and a faculty member at the emergency medicine residency program at St. Luke’s Hospital in Bethlehem, PA.
Tom Scaletta, MD, is President of EmEx, and emergency department Chair at Edward Hospital in Naperville, IL. Dr. Scaletta is the Immediate Past President of AAEM.
ACEP. Clinical policy for the management and risk stratification of community-acquired pneumonia in adults in the emergency department. Ann Emerg Med. 2001;38:107-113.
AHRQ website. Commitment to respond to COMMIT/CCS-2 trial: beta blocker use for myocardial infarction (MI) within 24 hours of hospital arrival. Practice advisory. 2005. www.ahrq.gov/clinic/commitadvisory.htm.
AHRQ website. National Quality Measures Clearinghouse. www.qualitymeasures.ahrq.gov.
Benenson RS, Kepner AM, Pyle DN 2nd, Cavanaugh S, et al. Selective use of blood cultures in emergency department pneumonia patients. J Emerg Med. 2007;33(1):1-8.
DHHS website. Hospital Compare. www.hospitalcompare.hhs.gov.
Fee C, Weber E. Identification of 90% of patients ultimately diagnosed with community-acquired pneumonia within four hours of emergency department arrival may not be feasible. Ann Emerg Med. 2007;49(5):553-559.
Halasyamani LK, Davis MM. Conflicting measures of hospital quality: ratings from “Hospital Compare” versus “Best Hospitals”. J Hosp Med. 2007;2(3):128-134.
HQA website. Specifications Manual for National Hospital Quality Measures. www.qualitynet.org.
TJC website. Facts about ORYX® for Hospitals (National Hospital Quality Measures). www.jointcommission.org/AccreditationPrograms/Hospitals/ORYX.