Looking Beyond the Stars


Understanding the ED’s Impact on Non-CMS Hospital Quality Rankings

Dear Director,


I have seen a variety of hospitals advertise different awards.  How do they differ from the CMS Star Quality Rating?

CEOs are no different from us when it comes to awards.  We all have a section on our CV that lists our awards and accomplishments.  While my CV contains a few awards I consider truly meaningful, it also lists a handful of junk awards that may look good on paper, though I know the truth.  Awards help CEOs validate the work the hospital is doing to the board of directors and to the community.  There are numerous awards at the national, regional and local levels, with categories ranging from quality, safety and patient satisfaction to employee satisfaction and being "most wired."

I did residency at Johns Hopkins when U.S. News and World Report was consistently naming it the best hospital in the world.  My irrational fear as an intern was that I would negatively impact the ranking through my ignorance or through a mistake I might make with a patient. For better or worse, these awards tend to be much more complex than the individual performance of one intern. More recently, I've talked to C-suite executives who are laser focused on achieving one of the awards listed below.  Usually this has come as a result of having to explain to the hospital board why a competitor has a particular status that "we" don't.  To illustrate the importance of these awards, Saint Anthony Hospital in Chicago sued the Leapfrog Group last November over a C grade that the hospital believed was in error. The lawsuit emphasized the irreparable harm the publication could cause if the grade became known to patients, media, health plans and other health systems, perhaps impacting the hospital's ability to partner with Northwestern and the University of Chicago.1


One of my management philosophies has always been that while I can't necessarily change the rules of the game, I can know the rules so well that it's easier to win.  All of the following awards are based on specific, very detailed data, so understanding how ED metrics contribute positively or negatively can help you manage priorities and frame discussions with your bosses.

It's interesting to note that there is very little overlap among these lists; few hospitals appear on more than one.  There have always been a few of these awards that I really respect.  However, there are also a few where I know far too many insider details about the hospitals at the top of the list, which makes me question the validity of the measures.  Below is a list of the better-known, and perhaps higher-status, awards that hospitals like to keep on their mantle (and their letterhead).  It is not meant to be an exhaustive list, but rather to provide some background on the role the ED may play in key awards.

CMS Star Rankings

Although I've written about CMS star rankings before, I'll provide a brief review. First released in July 2016, Star Rankings are a fairly new way for the public to gauge a hospital's quality. Hospitals are assigned between one and five stars based on a complicated formula that aggregates performance on up to 57 hospital quality measures.  There are seven broad categories, and the ED significantly impacts two of them: timeliness of care and effectiveness of care (each 4% of the overall score).  ED measures comprise four of the 10 effectiveness of care metrics and all seven timeliness of care metrics. SEP-1 quality data is just now being made public and, not surprisingly, it has been added to the effectiveness of care category.   Since these rankings are posted on the Hospital Compare website, CEOs may take this rating very seriously.  Ratings appear every July and December, and a recent change in the distribution of star rankings increased the share of hospitals receiving five stars from 2% to 9%.

Watson Health (formerly Truven Health)

Truven Analytics started this award 25 years ago, and unlike some other awards, there is no charge for winners to use it in marketing.  Watson Health annually releases its ranking system to create the "Top 100" hospital list.  There are five categories of hospitals, each with about 20 spots: major teaching hospital, teaching hospital, large community hospital, medium community hospital and small community hospital. The methodology for this award includes CMS quality measure performance, mortality, readmissions, inpatient LOS, Medicare cost data, inpatient HCAHPS scores, and hospital operating margin.  ED metrics make up 10% of the overall score via ED-1b (length of stay, admitted patients) and OP-18b (length of stay, discharged patients), each of which comprises 5% of the total score. It's interesting to note that ED-1b is being considered for elimination as a core measure in the future.  There is also a strong correlation between ED patient satisfaction data and inpatient HCAHPS data, so we may contribute to that category indirectly as well.  CMS promises more to come on the ever-changing list of core measures.


Leapfrog Group

Leapfrog is a non-profit founded in 2000 and has been producing a "Top Hospitals" list since 2006.  It started awarding hospitals safety score grades in 2012.  The Top Hospitals list typically includes about 100 hospitals and is released once a year (December).  The safety score grades are released twice a year (typically in April and October).  Grading is relative to other hospitals: A covers the 68th to 100th percentile; B the 43rd to 68th; C the 6th to 43rd; D the 1st to 6th; and F the 0 to 1st percentile.  I wish they had used this grading scale back in organic chemistry.

Leapfrog uses 27 metrics divided across two categories and incorporates CMS quality data, HCAHPS data, AHA data and an optional Leapfrog safety survey completed by the hospital.  There are currently no ED-specific quality measures in the methodology; however, as mentioned above, the ED does indirectly impact HCAHPS data.  Interestingly enough, to be named a Leapfrog Top Hospital, a hospital must meet a variety of criteria, including receiving an "A" safety grade, being fully compliant with CPOE and ICU staffing recommendations, having a Never Events policy, reporting data in at least 50% of the categories in the optional Leapfrog safety survey, and performing better than average in all CMS mortality measure categories.


Healthgrades

Created in 1998, Healthgrades is a for-profit group that charges hospitals for marketing rights.  Published annually in February, Healthgrades has a Distinguished Hospital Award for Clinical Excellence that recognizes its top 5% of hospitals each year.  To make the Top 100 list, a hospital needs to be in the top 5% for five years in a row, and seven consecutive years are required to make the Top 50 list.  That's pretty impressive to me.  Scores are tallied using CMS mortality and complication data for 34 common conditions or procedures, weighted 2/3 on mortality and 1/3 on complications.  Currently, there are no ED quality measures in the Healthgrades methodology.

U.S. News and World Report

For many hospitals, and certainly for large academic centers, being ranked on this list may be the granddaddy of all recognition.  Every August since 1990, USNWR has issued its top hospitals list with the goal of identifying hospitals that are "best equipped to treat patients who need specialty care."  Lists are generated for 16 specialties with rankings from 1 to 50 (152 hospitals in total were ranked in the most recent rankings), and there is an Honor Roll for the top 20 hospitals with multiple highly ranked specialties.  The hospital ranked No. 1 on the Honor Roll is considered the best.  Of note, there are no rankings for the specialty of emergency medicine.  A category called "Best Regional Hospital" was added in 2011 and most recently included 535 hospitals. The USNWR methodology is similar to other lists in that CMS mortality and patient safety data are utilized, but it differs in that almost 30% of the score is determined by reputation, via an annual survey sent to about 100,000 physicians.   There are no ED-specific quality measures in the USNWR methodology.


Hospitals are ranked by multiple organizations, each with its own methodology. Hospital performance in these rankings is very important to hospital leadership. CMS quality measures are a component of every ranking system. Beyond the CMS Star Quality Rating program, ED-specific quality measures are currently included only in the Watson Health (formerly Truven) rankings, though we likely have a hand in determining the mortality, readmission, and inpatient HCAHPS scores included in several ranking methodologies.  Also, expect SEP-1 performance to be added to the methodology of one or more of the other ranking systems now that this data is being made public.

Continue to focus on your CMS quality measure performance and keep an eye on the methodologies used by the major ranking systems.  Although the ED may not have a large impact in helping the hospital win these awards, aligning your efforts with your CEO's goals will certainly keep the C-suite happy and your job more secure.


1. In Era of Increased Options, Hospitals Fret Over Ratings, The Washington Post.


EXECUTIVE EDITOR Dr. Silverman is Chair of Emergency Medicine at VHC Health. He also taught a leadership development course for over a decade. Dr. Silverman’s practical wisdom is available in an easy-to-use reference guide, available on Amazon. Follow on Twitter @drmikesilverman

Dr. Sverha is a Regional Quality Director for US Acute Care Solutions and Vice-Chair of the Emergency Department at VHC Health in Arlington, VA.
