The cost and truths of human error



Editor’s note: The following is the third in an occasional series about human error and its role in medical error. This month, author Robert J. Latino, executive vice president of the Reliability Center, Inc., in Hopewell, VA, discusses human error and how it is viewed by those involved, those on the outside, and those investigating the error.

Approximately 5.7 million workers are injured annually in the United States. In the healthcare field alone, the Institute of Medicine (IOM) reported in 1999 that medical error was responsible for between 44,000 and 98,000 deaths per year.

Human error is almost certainly a contributor to such undesirable outcomes, because decision-making determines behavior. Poor decision-making results in preventable deaths, costly equipment downtime, poor product quality, and, hence, reduced profitability.

In 2004, 91% of fatal work injuries occurred in private industry. Of those deaths, 47% occurred in service-providing industries and 44% in goods-producing industries. Also, 2004 statistics from the Bureau of Labor Statistics revealed that construction industry deaths were up 8% to 1,234, compared to 1,131 in 2003. These statistics indicate an upward trend in fatal accidents.

But think about all of the accidents that do not rise to the level of a fatality. Some of these incidents result in financial losses of varying degrees; others simply leave behind varying degrees of risk.

In 1969, Frank E. Bird analyzed 1,753,498 accidents reported by 297 companies and derived a ratio of 600:30:10:1. This means that for every 600 near misses, there will be 30 property-damage incidents, 10 minor injuries, and one major injury.
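To see how the ratio scales in practice, here is a minimal sketch (in Python, using a made-up injury count for illustration; this is not data from Bird's study) that projects the rest of the pyramid from an observed number of major injuries:

```python
# Bird's 1969 accident ratio: 600 near misses : 30 property-damage
# incidents : 10 minor injuries : 1 major injury
BIRD_RATIO = {
    "near misses": 600,
    "property-damage incidents": 30,
    "minor injuries": 10,
    "major injuries": 1,
}

def project_pyramid(major_injuries):
    """Scale every tier of the Bird pyramid from a count of major injuries."""
    return {tier: count * major_injuries for tier, count in BIRD_RATIO.items()}

# Hypothetical example: a facility that recorded 3 major injuries has likely
# accumulated on the order of 1,800 near misses over the same period.
print(project_pyramid(3))
```

The point of the pyramid is that near misses come first and in far greater numbers; they are the warnings available to us long before a major injury occurs.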

So when is good performance good enough? At what point do we rest on our laurels and relax our defenses? As you would imagine, never. Many would say this happened at NASA prior to Challenger (and again prior to Columbia).

When we start to think that there is not much room for improvement, we should remind ourselves that even if we were 99.9% accurate, we would still experience the following (a short calculation after this list shows the arithmetic):

Two unsafe plane landings per day at O’Hare Airport

500 incorrect surgical operations each week

50 newborns dropped at birth by doctors every day

22,000 checks deducted from the wrong bank account each hour

32,000 missed heartbeats per person, per year

114,500 mismatched pairs of shoes shipped each year

200,000 documents lost by the IRS this year
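The arithmetic behind figures like these is simple: multiply the volume of an activity by the residual error rate. The sketch below reproduces the heartbeat figure from an assumed resting rate of 60 beats per minute; the procedure volume at the end is a made-up placeholder, not a figure from the list above.

```python
ACCURACY = 0.999            # "99.9% accurate"
ERROR_RATE = 1 - ACCURACY   # 0.1% of opportunities still go wrong

def expected_errors(opportunities):
    """Expected number of failures for a given volume of opportunities."""
    return opportunities * ERROR_RATE

# Heartbeats in a year at an assumed resting rate of 60 beats per minute
beats_per_year = 60 * 60 * 24 * 365    # about 31.5 million
print(expected_errors(beats_per_year)) # ~31,500 -- close to the 32,000 figure above

# Hypothetical volume: 50,000 procedures a year at the same accuracy
print(expected_errors(50_000))         # ~50 procedures gone wrong
```

Even a very small error rate, applied to the enormous volumes of everyday work, produces failure counts no one would accept.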

The truths of human error

A decision error triggers existing conditions in our work environment, setting off a series of physical consequences. If this sequence is permitted to continue, an undesirable event will eventually occur that we have no choice but to address, and it will be deemed a failure of a certain severity and magnitude. With this error chain in mind, it becomes clear that many people hold preconceived beliefs about failure that affect their decision-making. The facts are:

Good people make honest mistakes

A fast-paced, ever-changing world sometimes outsmarts us

We can never work error-free, but if we lower human error, we will lower failure rates

Systems are not inherently safe; people must design safety into them, because systems do not come that way

Most people come to work prepared and have the relevant knowledge to be successful

Having the knowledge to be successful and applying that knowledge successfully are two different things. We must possess the knowledge in the first place, organize it in a way that makes it usable, and then know when to apply it. Many believe that to ensure people “get it right,” we should enforce more rules and people will follow them. Unfortunately, this is not always true. Adding more rules and tightening procedures adds complexity to our working environments. As a result, the gap between procedure and practice widens rather than narrows, and when that happens, our risk of safety incidents increases.

Practice should not be expected to equal procedure; a narrow gap between the two is acceptable. No procedure can possibly encompass every conceivable event and condition that could occur. Therefore, a certain amount of judgment must be afforded in our procedures. Without such room for judgment, we handcuff our workers and do not allow them to apply their knowledge to situations the procedure does not account for. This may result in a “work slowdown,” which is simply when staff members follow procedures exactly, nothing more and nothing less. This will drop overall productivity quickly for the reasons described above.

The views of human error

In the old school of human error research, human error was believed to be a cause of accidents and incidents. Investigators would focus on the people involved and seek to explain the failure through the inaccurate assessments those people had made. As a result, the investigators themselves made bad judgments and, from them, wrong decisions.

The new research tries to understand why people make the decisions they do by examining deeper problems in the organizational systems around them. When undesirable outcomes occur, there is usually a poor decision somewhere in the error chain, but we must assume that the person who made it did not intend the outcome. Human error is not random: decision-making patterns and trends can be traced to previous behavior. Human error is not the ending point of the analysis; it is the starting point. Rule-based errors typically occur for one of three reasons, according to the Generic Error Modeling System (GEMS):

1. The rule itself was not correct, and we followed it

2. The rule was correct, but we applied it incorrectly

3. The rule and the information regarding it were correct; we had a problem complying with it

Knowledge-based errors occur when situations arise and we have not been prepared to address them (i.e., no rules exist). In these cases, we must rely on our basic knowledge and apply it to the new situation. To put the GEMS model into proper perspective, we can draw the following general conclusions:

New hires are more prone to knowledge-based errors

As we gain more years of experience, we are more prone to becoming complacent with our jobs

Highly experienced employees typically are less prone to skill-based errors; however, they may become complacent and overconfident and cause such errors

Editor’s note: Go to www.proactforhealthcare.com to visit the Reliability Center’s Web site.

References and acknowledgements

Latino, Robert J. and Kenneth C. Latino, Root Cause Analysis: Improving Performance for Bottom-Line Results

Dekker, Sidney, The Field Guide to Human Error Investigations

Eisenhart, Steve, Human Error Reduction for Supervisors

Reason, James, Human Error

OSHA Fatal Facts, www.osha.gov/OshDoc/toc_FatalFacts.html

U.S. Department of Labor, Bureau of Labor Statistics, Census of Fatal Occupational Injuries, 2003

National Center for Statistics and Analysis

Wogalter, et al., 1989, Human Factors on Administrative Procedure Compliance

North American Rockwell Corporation, Defect Error Rate

Hargrave, Jan, Let Me See Your Body Talk

Honeywell International, Inc., Honeywell High Spec Solutions