There’s good news for hospitals scrambling to meet The Joint Commission’s culture of safety standard: The government is here to help—and the tools and guidance are free.
Standard LD.3.10 calls on organizations to assess and establish a culture of safety and quality. But if hospitals stop at assessment and don’t make meaningful changes, they may be doing more harm than good.
“It’s really important for an organization to use the information from the assessment to take action,” says James Battles, PhD, a senior service fellow for patient safety at the Agency for Healthcare Research and Quality’s (AHRQ) Center for Quality Improvement & Patient Safety. “Because if you just do it as an exercise, to check something off, and you’re not going to do anything about it, you may do more damage than if you hadn’t done anything in the first place.”
Simply going through the motions, he adds, will only “frustrate staff and probably make matters worse.”
The AHRQ has a free culture of safety assessment on its Web site, a tool that took years to create. Part of what Battles and the AHRQ looked at when designing the instrument was the questionnaire the Veterans Health Administration (VA) pioneered to assess its culture of safety.
Staff worries about humiliation
The VA began looking at its culture in 1998, one year before the Institute of Medicine’s (IOM) groundbreaking report To Err Is Human. The questions unearthed some surprises, as well as some expected news.
“The higher you were in the pecking order, the more you thought communication was great; the further down you were, the more you said it stunk,” says James Bagian, MD, PE, chief patient safety officer for the VA and director of the VA National Center for Patient Safety. “That’s not unique to medicine. Most people thought that was going to be the case.”
What Bagian, a former NASA astronaut who investigated the Challenger and Columbia disasters, didn’t expect were the responses from staff members about the consequences of making a mistake.
“One thing we thought we would find, but didn’t, was that one of the top things people worried about was being punished if they made a mistake,” Bagian says. “That was what we thought. That was the common wisdom. Turned out, that really wasn’t the case.”
Even though several people acknowledged that punishment was an issue, they didn’t rank it very highly on the 1–5 scale. But 49% of the staff ranked humiliation as a 5 when asked about their fears of making a mistake.
“This influenced a lot of what we did,” Bagian says. “You can change your personnel policies to say, ‘If you make a mistake, we won’t downgrade you, we won’t fire you, we won’t suspend you.’ However, that has nothing to do with shame.”
VA focuses on systems, not individual errors
Consequently, the VA decided early on that if a safety investigation found that a clinician had simply made an honest mistake, it wouldn’t reveal the clinician’s identity but would instead look at systems-related solutions. It focused on prevention, not punishment.
Before the VA set out to improve patient safety in the late 1990s, a medication error that caused harm or death typically resulted in some sort of reprimand for the nurse.
Bagian talked to department heads and nurse managers and provided them with tools to do safety investigations that focused not on one person, but on the process that led to the error. The investigation might probe aspects such as whether confusion stemming from look-alike/sound-alike drug names or possible mislabeling contributed to the medication error.
“Instead of punishing the last person to touch the patient, which typically happened before, now they put a countermeasure in to prevent this from happening again, not just to nurse X, but to anybody,” says Bagian.
When staff members saw that a serious error led to a change that would help all of them make fewer mistakes—instead of a nurse being fired—they not only bought into the program, they became more likely to report events and near misses, he adds.
Clinicians learn from close calls
“You can’t fix what you don’t know about,” Bagian says. “If someone’s not willing to tell you about a problem, there’s no way you can correct it. Secondly, you want to learn from close calls. Wouldn’t you rather learn from something that didn’t happen but could have, rather than wait until you kill somebody?”
If the results of the question on shame caught Bagian off guard, the responses to the query about the importance of safety to patient care absolutely floored him. Again, this predated the IOM’s 1999 report.
The question asked: Do you think patient safety is important to good patient care? Staff members could respond on a 1–5 scale, with 1 being “strongly disagree” and 5 being “strongly agree.”
“We thought that everybody would say 5, strongly agree,” Bagian says. “Well, that wasn’t the case. Twenty-four percent said 1, strongly disagree. We were all flabbergasted about that.”
After talking to the staff, Bagian and his colleagues learned that clinicians didn’t necessarily think patient safety wasn’t important; it just wasn’t a priority to them because they thought they had it all figured out.
“We had to show them that no matter how good they are, given the right set of circumstances, they’re capable of actions that can hurt patients,” he says. “It doesn’t mean they’re bad people; it means they’re human. Mistakes happen and we need to look at what will make it less likely to happen.”
The answer, Bagian says, is to design systems and protocols that make it easy to do the right thing and, when possible, impossible to do the wrong thing.
Effective systems also require built-in redundancies. Bagian points to the airline industry, which has a stellar safety record. Commercial airliners have at least two engines; even though engines are built not to fail, if one does, the aircraft doesn’t crash and kill all the passengers on board.
“Whereas in medicine, and in many industries, people say, ‘Let’s be perfect, let’s try hard,’ ” Bagian says. “The metaphor in aviation would be that we’re flying single-engine airplanes. And on a day that the engine does fail, we all just crash.”
AHRQ pushes nonpunitive culture
Like Bagian, Battles stresses that the culture must be nonpunitive if an organization truly wants to advance patient safety. An AHRQ benchmarking study, Hospital Survey on Patient Safety Culture: 2008 Comparative Database Report, analyzed data from 519 hospitals whose culture of safety surveys reached 160,000 staff members nationwide. It found the following:
64% worried that the mistakes they made would be kept in their personnel file
49% felt mistakes were held against them
55% felt like the person, rather than the mistake, was being reported
Battles says he hopes the creation of patient safety organizations, mandated by the Patient Safety and Quality Improvement Act of 2005, will go a long way toward changing those perceptions. Under the legislation, clinicians can sue their hospitals if they are punished or fired for reporting a mistake.
“As part of a framework of culture, the hospital and healthcare environments have been pretty punitive in this respect, and that’s one reason the provision is in the law,” says Battles.
Survey asks about feedback, teamwork
The AHRQ’s survey includes questions in 12 domains that assess everything from communication and openness to teamwork across and within units to feedback and communication about errors. Within each domain, staff members answer questions about their particular unit as well as the hospital overall.
“Depending on your organization, you may have multiple different cultures going on,” Battles says. “There might be different cultures in medicine, surgery, and the ED.”
The literature, he adds, shows that the unit manager has tremendous influence on that unit’s culture. Smaller hospitals tend to have fewer variations in culture in different units.
Hospitals start with early wins
After assessing the organization’s culture of safety, hospitals should tackle the areas where they can most easily see early success.
“Go after the things that are winnable and fixable and get started, because you want to show improvement as quickly as possible,” Battles says. “It’s very difficult to try to fix everything at once.” The AHRQ survey engages everyone who works at the hospital in solving the problem, he adds.
“One of the typical approaches, which is usually a disaster for hospitals, is that we appoint someone to come up with a solution,” Battles says. “It’s now up to the infection control person or the wound care nurse to solve all the problems. And if they don’t fix the problem, we fire that person.”
Instead, hospitals should share the risk so that everybody says: We’ve got a problem, let’s figure out how to solve it together.
“Changing culture takes time,” Battles says. “Culture follows action. To say, ‘I’m going to change the culture and just focus on culture’ is probably not going to help very much.” Instead, hospitals should examine what is causing the culture to be negative.
“There’s the old adage ‘The beatings will stop as soon as morale improves,’ ” Battles says. “You’ve got to begin to take this information and drill down to determine the problems and what you need to fix them. It’s the actions an organization takes that speak much louder than what it says.”