Humans make mistakes. Even in health care. It’s a fact. The question is, what do we do about them when they happen? And perhaps most importantly, what are the best ways to prevent them from happening in the first place?
By April Frawley Birdwell
A nurse mislabels a specimen.
A doctor overlooks test results.
A surgical team forgets to remove gauze during a procedure.
At UF&Shands, an entire department exists specifically to reduce the risk of medical errors. The Clinical Risk Management department was formed in 2011 to investigate adverse events and near misses, develop action plans and lead efforts to redesign and improve how health care is delivered. And to uncover the underlying problems within the system and improve care for patients, clinical risk managers have brought a new set of tools to the table, culled, surprisingly, from the world of engineering.
For UF&Shands, it’s a new approach to risk management, a field that has traditionally concentrated more on handling claims than on driving change and improvement, said Susan Keating, who joined UF&Shands in September to lead the new program, part of the Sebastian Ferrero Office of Clinical Quality and Patient Safety. The goal, Keating says, is to change the culture so employees feel safe reporting their own mistakes and the mistakes of their peers, allowing problems to be evaluated and fixed on a systemwide basis before patients are harmed.
“The Institute for Healthcare Improvement estimates that only 10 percent of medical errors are reported,” Keating said. “Encouraging reporting is really important. We want to create a culture in the organization where people feel safe reporting errors; otherwise problems go underground.
“You cannot fix what you don’t know.”
Reporting an error is only the first step, but it is the most critical one. Once an error or problem is known, Keating and her three clinical risk managers (Brad Green, Gregg Koff and Robert Kelly) assemble teams charged with applying different forms of engineering-derived analysis to solve the problem.
For example, if something goes wrong in a department and there’s an indication that a process may be to blame, the team will lead what is known as a root-cause analysis, or RCA, to get to the bottom of it. For an RCA, clinical risk managers invite everyone who has a stake in the case, from executives to staff members working in the unit. (To see how these techniques work in a real case, see Lessons Learned).
“A root-cause analysis is a retrospective examination of an event,” Keating said. “We look at the process and ask how can we redesign our procedures to avoid similar events ever happening again. We have increased the number of root-cause analyses (in recent months). I am asking to do them not only for major harm events but also near misses and good catches because these events are clearly indicators of a system concern.”
Another option is the failure modes and effects analysis, or FMEA. Risk managers employ this tactic when a specific process does not appear to be working or when multiple patient safety reports are submitted for the same issue. The team takes a hard look at the process and pinpoints where it fails, assigning a “hazard score” to each of these failures. The ultimate goal is to find ways to reduce the hazard score and improve the process.
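The arithmetic behind a hazard score can be illustrated with a short sketch. One common convention in health care FMEA rates each failure mode for severity and probability of occurrence and multiplies the two; the scales, failure modes and names below are illustrative assumptions, not a description of UF&Shands’ actual scoring tools.

```python
# Illustrative FMEA hazard scoring (hypothetical failure modes).
# Convention assumed here: hazard score = severity x probability,
# each rated on a 1-4 scale, as in some health care FMEA variants.
failure_modes = [
    # (point where the process can fail, severity 1-4, probability 1-4)
    ("specimen labeled away from the bedside", 4, 3),
    ("test result filed before physician review", 4, 2),
    ("sponge count skipped during shift change", 4, 1),
]

def hazard_score(severity, probability):
    """Score a failure mode; a higher score means fix it first."""
    return severity * probability

# Rank failure modes so the team addresses the riskiest first.
ranked = sorted(
    ((name, hazard_score(s, p)) for name, s, p in failure_modes),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, score in ranked:
    print(f"{score:2d}  {name}")
```

Reducing either factor, by making the failure less likely or less harmful, lowers the score, which is how a team measures whether a redesigned process is actually safer.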
“These are tools that have been used by engineers for decades and we have been applying them in health care over the last 10 years,” Keating said.
Once problems have been identified and action plans have been developed, clinical risk managers work closely with colleagues in the department of quality and accreditation, who take recommendations and work with departments to implement them.
“The patients are always at the center of our focus,” said Debbie Lynn, director of quality and accreditation. “Whether we are addressing an issue that did happen or something that might happen that we can prevent, we work to mitigate risk in both ways to ensure our patients get the best care possible.”
There has been good news so far. In recent months, there has already been a reduction in the number of major harm events, when errors actually affect a patient, and the number of patient safety reports coming in from employees has increased. But the number of reports is still relatively low, in part because many employees believe they will be punished if they admit mistakes, a belief that is the opposite of the culture change leaders hope to implement. (For more information, see Just Culture.)
Changes are underway to make the reporting process easier for staff members, said Linda Allen, M.H.A., manager of quality systems. Risk managers also periodically hold risk “huddles” with departments to talk about issues uncovered both systemwide and within specific units, to encourage reporting and to spread the value of learning from past mistakes.
“In order to improve systems we have got to communicate effectively. That means coming together as collaborative teams and examining issues, and that is what the FMEA and RCA process does,” Keating said. “It brings together departments that could have been at odds. All of a sudden they are human beings around a table looking at a problem and all those differences are gone. It’s really team-building.”