Hospitals that care for the sickest patients found to be unfairly ranked
When consumers make what can be life-or-death decisions about where to seek health care, they often look to the numerous online hospital rating services that are available, including the government-sponsored Hospital Compare, Consumer Reports and Leapfrog.
Rating hospital quality is not as simple as rating the quality of a car, though. One reason is that patients vary in terms of how sick they are when they arrive at the hospital or undergo surgery. Older patients with multiple health problems or advanced diseases are more likely to develop an infection or complication than younger, healthier patients — even though they receive the same quality of care.
The best hospital rating systems try to account for severity of illness. However, a Rush data expert’s recent analysis of U.S. News & World Report’s “Best Hospitals” rankings shows that these rating systems often rely on flawed data sets and performance metrics.
Something didn’t add up
Rush University Medical Center has consistently scored high on patient safety measures in most rankings. In fact, Rush has received nine consecutive "A" grades for safety from the Leapfrog Group, putting it among fewer than 100 hospitals in the country that have received the highest possible grade every time since Leapfrog began issuing the semi-annual ratings in 2012.
But last year U.S. News' “Best Hospitals” issue gave Rush the publication’s lowest ranking for patient safety (one out of five). To learn what was behind the discrepancy, leaders of Rush’s quality team — Tom Webb, Bala Hota, MD, MPH, and Omar Lateef, DO — dug into the U.S. News data and methodology.
“When we compared the data that U.S. News used for our hospital to our own internal data, we found big differences,” says Hota, Rush’s chief research informatics officer. “U.S. News showed many more patient safety events than our actual rates.”
For example, U.S. News data showed that 25 patients at Rush had developed pressure ulcers, or bed sores, over a particular time period. In fact, only one patient had developed a pressure ulcer while at Rush.
Suspecting a broader problem, they analyzed data on a sample of hospitals and found that Rush was not the only organization inaccurately and unfairly ranked. Hota reported the results of his study in the October issue of the Joint Commission Journal on Quality and Patient Safety.
High demand for hospitals’ services skews safety scores down
Two hospital characteristics tended to skew U.S. News patient safety scores downward: caring for large numbers of patients (i.e., high volumes) and admitting a high percentage of patients transferred from other hospitals.
Webb, Hota and Lateef found that the U.S. News data set was missing key information, such as whether patients with pressure ulcers already had these sores when they were admitted to Rush. If the conditions were present before arrival, Rush should not have been penalized for them.
Large referral hospitals such as Rush, which provide highly specialized care for the sickest patients and most complex conditions, also tend to have high volumes and high transfer rates. “Rush is the biggest transfer center in Illinois,” Hota says. “If somebody is sick in a community hospital and not getting better, that hospital knows Rush is a reliable place to transfer the patient to.”
‘Data is not perfect’
Rush notified U.S. News about the Rush findings in 2015, and U.S. News made some revisions to its data methodology. As a result, Rush’s patient safety score increased to four out of a possible five in the 2016-2017 U.S. News ratings released in August.
“U.S. News was very responsive to our findings, and the Rush community should take pride in the fact that we take care of very sick patients very well,” says Lateef, Rush’s chief medical officer. “When you adjust for patient acuity (severity of patients’ conditions), our outcomes data are among the best in the country.”
U.S. News' own head of health analysis, Ben Harder, acknowledged the report's significance in a commentary published on the U.S. News website. Harder called the report “an important contribution to the healthcare community's understanding of the Patient Safety Indicators, or PSIs, a group of quality measures used by U.S. News, the Centers for Medicare and Medicaid Services and other organizations that report on hospital performance.”
The Rush analysis of the U.S. News ratings confirms what many experts have suspected. “People have argued and theorized that if you take care of a lot of very sick people or have a lot of transfers, then the data could be biased,” Hota says. “This study proves that. It shows that data is not perfect, and misunderstandings in the data can negatively portray a hospital.”
Acknowledging that hospital rankings have room for improvement, U.S. News is convening a panel of experts in November to discuss how to better evaluate hospital quality, safety and performance, according to a press release.
Transparency is a two-way street
Lateef, Hota and the rest of the quality team believe that the rules used by hospital rating systems are currently too complex for most consumers to navigate. That’s why they are urging U.S. News and other hospital rating services to be more transparent about how they rate hospital quality, so that individual hospitals can validate the accuracy of their scores and work with the rating organization to adjust them as appropriate.
“The companies that are producing these ratings have to be more transparent,” Hota says. “Their methodology should be reproducible and easily understood.”
In the meantime, Rush will continue to rigorously measure, track and share patient safety data, and use that data to drive improvements that make Rush’s care even safer. “We consider quality to be the most important part of patient care,” Lateef says. “We believe in being transparent about quality so we can continue marching towards improvement.”
For media inquiries, please contact john_m_pontarelli@rush.edu.