Human Error
Sep. 3rd, 2005 11:09 am

I haven't been posting this week, mostly out of a feeling that anything I said about New Orleans would be redundant and anything I said about anything else would be frivolous. But I had a conversation with Nameseeker that I want to elaborate on here.
I'd been reading the paper, and following the discussion on Making Light, and getting more and more frustrated. I called N. from work and said this is making me sick. What's the point of being a continent-wide nation that supposedly shares its resources if this is the best we can do? We should have been doing X, Y, and Z by now; we should have planned A, B, and C years ago. And Nameseeker said Ashni, I'm always saying to you that people need to be more aware of the cost of their mistakes. Even while you're driving, you make all these snap decisions, and just one wrong one might kill somebody. And you, Ashni, always tell me that preventing every one of these errors is impossible, and that focusing on the cost just adds to stress, which makes the errors worse. And you always tell me that understanding human problem-solving makes you more forgiving of human error, and as an EMT I honestly believe that the people on the ground right now are doing their best with limited data. And I had to admit that she had a point.
Some background: I've acted, for several years, as a consultant on a medical error project. I talk to doctors about why humans make errors, and why the doctors are not immune, and why the errors are not going to go away entirely, and what they might be able to do to minimize their number and impact. The short version is that humans often make decisions using rules of thumb that are geared toward situations where there is not enough information available and not enough time. We have to do this--the prototypical situation is that you're out hunting mammoth and a sabertooth jumps out at you. If you are the sort of person to stop and analyze every aspect of the situation, you will not be passing on your genes any time soon. So you build up a database of past dangerous situations, and you do the thing now that has worked most of the time. This means that in an atypical situation, you're more likely to make mistakes. The doctor who sees chest pain is going to save more lives by assuming it's a heart attack at first, every time. The occasional person with a weird syndrome may die because of this. But if the doctor looked for weird syndromes every time, a lot of people with heart attacks would suffer.
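A toy calculation makes the tradeoff concrete. Every number below is invented for illustration (these are not clinical figures); only the shape of the result matters:

```python
# Invented numbers, purely illustrative: suppose 95% of chest-pain
# patients are having heart attacks and 5% have some rarer condition.
p_heart_attack, p_rare = 0.95, 0.05

# Policy A: assume heart attack and treat immediately.
# Fast treatment saves most heart-attack patients; rare cases fare worse.
saved_assume = p_heart_attack * 0.90 + p_rare * 0.30

# Policy B: hunt for rare syndromes first, every time.
# Rare cases do better, but the delay costs some heart-attack patients.
saved_investigate = p_heart_attack * 0.70 + p_rare * 0.80

print(f"assume heart attack: {saved_assume:.3f}")       # ~0.870
print(f"investigate first:   {saved_investigate:.3f}")  # ~0.705
```

Change the made-up numbers and the ranking can flip; the point is only that under skewed base rates, playing the odds every time saves more people on average, and the occasional atypical patient is the cost of that average.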
In our medical study, we talked about cognitive errors (individual doctors and their imperfect human reasoning) and system errors (e.g., the radiologist forgot to send the X-ray up to the doctor; the doctor forgot to ask where the heck it was). You can address both of these at either the cognitive level (long-term changes in the way you train individuals) or the system level (have resources in place for someone to be checking into alternate possibilities while the primary doctor is working on the most likely diagnosis, or have a computer system in place that says by the way, you ordered this X-ray and it never arrived). The fact that not all errors can be prevented doesn't mean that you can't or shouldn't minimize them. And system solutions to cognitive errors make a big difference. You can accept that your organization involves humans (something medical institutions are occasionally bad about), and then you can set up the organization to catch errors made by every one of them.
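To make that last example concrete, here's a minimal sketch in Python of the kind of system-level catch I mean. Everything in it (the Order class, the four-hour deadline) is made up for illustration; the point is that the reminder lives in the system instead of in anyone's memory:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# A hypothetical sketch, not any real hospital system: track every
# test that gets ordered, and flag any order whose result never came back.

@dataclass
class Order:
    test: str                              # e.g. "chest X-ray"
    patient: str
    ordered_at: datetime
    result_at: Optional[datetime] = None   # stays None until a result arrives

class OrderTracker:
    def __init__(self, deadline: timedelta = timedelta(hours=4)):
        self.deadline = deadline
        self.orders: list[Order] = []

    def place(self, test: str, patient: str) -> Order:
        order = Order(test, patient, datetime.now())
        self.orders.append(order)
        return order

    def record_result(self, order: Order) -> None:
        order.result_at = datetime.now()

    def overdue(self, now: Optional[datetime] = None) -> list[Order]:
        """The 'by the way, that X-ray never arrived' check: every order
        past its deadline with no result, regardless of who forgot what."""
        now = now or datetime.now()
        return [o for o in self.orders
                if o.result_at is None and now - o.ordered_at > self.deadline]
```

The doctor and the radiologist can both forget, and the overdue check still fires--that's the whole value of a system solution to a cognitive error.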
Back to New Orleans: this conversation was Thursday; since then it's become pretty clear that while the people on the ground are indeed doing their best, there are some major problems at the system level. In addition, the people not on the ground, who are giving orders to the people on the ground, are not doing their best. Remember what I said about operating with imperfect information? It's supposed to be the job of the people at the top to give people further down the information that will help them the most; it's not happening. This sort of major effort can't depend on individuals not to make errors; you need the system-level organizational structure to minimize their impact. We know how to make these structures. Jim MacDonald describes one here.
So I am feeling more forgiving of the people on the ground. They really are doing what they can. But I'm feeling less forgiving, by the second, of the people who set up the system (or failed to).
And OMG, when I went to check that link, I saw in a new post that the National Guard higher-ups are not permitting the Red Cross to enter New Orleans. Apparently, they are worried that it might make the people still there resist evacuation.
This is not my opinion as a psychologist, just as a human. There is such a thing as a criminal level of error.