Through our education and culture we have internalized assurances such as the following:
We are the result of our dedication and effort. We are the product of our will to be and to do. Good people achieve the best results and do not make mistakes. We cannot fail. We have free will and we do things our own way.
These statements are not entirely true. The context in which we operate shapes our results as well as our will and dedication. Yet we feel the weight of responsibility as if everything depended on us alone; this is a feature of our individualistic culture. Understanding that context will help us improve medical care.
In patient safety, health professionals develop by understanding their relationship with that context.
The operation of a hospital is a complex system, one that depends heavily on people's abilities to run its processes.
Complex systems have a non-linear relationship between causes and effects. Several factors influence a result, which in turn feeds back on other factors. There are inhibitors, catalysts that trigger actions, and attractors that generate their own gravitational field. Drivers feed back on one another, positively or negatively. Human beings tend to look for linear causal relationships and to think serially (one problem after another). We normally do not see the whole picture, only the part that corresponds to us, and so paradoxical (unexpected) results appear. In this context, with limited resources, incomplete information, production pressure to deliver, fatigue and multitasking, we are expected to provide a service without errors or transgressions. It seems an unequal fight.
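As a rough illustration of this non-linearity, consider a toy sketch (our own, with invented numbers and relations, not a model from the text): errors generate rework, rework raises workload, and a small change in starting conditions produces a disproportionate change in outcomes.

```python
# Toy positive-feedback loop (all coefficients are illustrative assumptions):
# higher workload produces more errors, and each error adds rework,
# which raises the workload further.

def accumulated_errors(initial_workload: float, steps: int = 10) -> float:
    workload = initial_workload
    errors = 0.0
    for _ in range(steps):
        new_errors = 0.01 * workload ** 2   # assumed non-linear error rate
        errors += new_errors
        workload += 0.5 * new_errors        # feedback: errors create rework
    return errors

for w in (1.0, 2.0, 3.0):
    print(f"initial workload {w:.1f} -> accumulated errors {accumulated_errors(w):.2f}")
```

Tripling the initial workload roughly multiplies the accumulated errors by nine, not by three: causes and effects are no longer proportional.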
Patient safety is continually compared with aviation because of the improvements aviation has achieved. The world of medicine is much broader than that of aviation, and many comparisons are not valid. Even so, we can learn from them.
Most studies of accidents or adverse events focus on human error as the trigger of the disaster. Sometimes we regard the professional as a hero. More often, however, he or she is seen as the one who solved a hundred problems but could not deal with one.
Recent studies of pilots' deviations from operating standards indicated that a very high number of procedural violations were in fact necessary for the safety of the aircraft.
We are not proposing that rules be violated, but that we understand why many rules are violated. Some are broken simply because it is easier that way; others because the rule is not clear.
Making mistakes is part of our human condition. Some errors cannot be avoided, but they can be anticipated and resolved in time. “We cannot change the human condition, but we can change the conditions in which we work, so that there are fewer mistakes and recovery is easier.” (James Reason, The Human Contribution, 2008).
Errors happen at three levels of control: automatic, mixed and conscious. These operate, respectively, in routine situations, in problems we have been trained to solve, and in new situations.
James Reason distinguishes three corresponding performance levels, shown below: skill-based, rule-based and knowledge-based.
1. Skill-based errors. These are committed when operating at the automatic (non-conscious) level and are usually called slips or lapses.
2. Rule-based errors. These occur when a rule is misapplied or violated.
3. Knowledge-based errors. Faced with a new situation, we fail to apply the appropriate solution because of a memory error or a lack of knowledge.
Operating consciously all the time would not be possible. The brain uses well-worn paths and shortcuts to save energy. Our attention is limited: if we receive new information in the middle of a task, we may lose track of an operation we know well. When the mind searches memory for information (what we call an information packet), the search is guided by similarity and by the most frequent or recent use. We compare incoming information with stored information following these two criteria: similarity and frequency. This can fool us very easily, because the mind, to save energy, does not analyze all the possibilities. If you are working at the knowledge-based level and memory returns erroneous data, the chosen activity or solution can go wrong.
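A minimal sketch of this retrieval bias (our own illustration; the packets, weights and dose scenario are invented for the example): stored packets are scored by similarity to the cue plus a bonus for frequency of past use, so a familiar but wrong packet can outscore the correct, rarer one.

```python
from difflib import SequenceMatcher

# Hypothetical memory store: (information packet, times used before).
memory = [
    ("administer 5 mg dose", 40),   # frequent, routine packet
    ("administer 0.5 mg dose", 2),  # rare packet, correct for this case
]

def recall(cue: str) -> str:
    """Return the stored packet with the best similarity + frequency score."""
    def score(item):
        packet, frequency = item
        similarity = SequenceMatcher(None, cue, packet).ratio()
        return similarity + 0.01 * frequency  # assumed frequency weight
    return max(memory, key=score)[0]

# The cue matches the rare packet exactly, yet the frequent packet wins
# the combined score: the mind's economical search strategy fools us.
print(recall("administer 0.5 mg dose"))  # -> administer 5 mg dose
```

The point is not the numbers but the mechanism: a search biased toward the similar and the frequent is cheap, usually right, and occasionally, silently, wrong.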
The only way to reduce our risk is to build multiple layers of barriers that prevent errors or, when an error does occur, soften its consequences.
Barriers can be personal habits or systemic procedures.
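A worked example of why layered barriers help (the numbers are illustrative assumptions, not data from the text): if the barriers are independent and each barrier i lets an error through with probability p_i, an error causes harm only when it defeats every layer.

```latex
P(\text{error defeats all barriers}) = \prod_{i=1}^{n} p_i
```

With three independent barriers, each stopping 90% of errors (p_i = 0.1), only 0.1 × 0.1 × 0.1 = 0.001, that is 0.1% of errors, would reach the patient. In practice the layers are rarely fully independent, so the real residual risk is higher, which is why the holes in the layers must also be kept from lining up.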
It is necessary to understand the difference between the conditions and the causes of errors. Conditions are present both in cases with poor outcomes and in others where nothing serious happened. A condition for failure, also called a pathogenic condition or latent failure, causes no problems until the cause appears. The cause is the trigger of a fault that already exists, asleep within the organization or process.
The model below is an evolution of the Swiss cheese model proposed by James Reason, an author who dedicated more than 40 years to the study of accidents and the conditions of error.
In this model, the organizational culture of blame and the work environment are treated separately. Although both are part of the context, the first is more generic and cultural, while the second is more physical and tied to the moment of the incident.
We can also obtain important information from the safety strategies of high reliability organizations (HROs).
In all of them the consequences of an error are very serious; for that reason, these industries and organizations generally follow very strict guidelines, from which we can extract some safety lessons.
CONCLUSION
Everybody makes mistakes; it is part of our human condition.
Accepting mistakes, and being prepared to avoid or correct them without causing damage, is what the champions of safety do. This holds both individually and collectively.
- The best organizations adopt a culture of transparency and mutual trust.
- Errors allow us to learn and improve. Near misses teach a great deal at a low cost in damage.
- Strongly hierarchical systems (based on power) do not help develop safety. Responsibility should be given to those best able to solve safety problems (those with experience and expertise).
- People and organizations operate under the law of least effort. Established rules will be violated if the procedure is not easy to follow.
- The solution to a complex problem must have a higher level of complexity than the problem it solves. A simple answer to a complex problem is a methodological error, regardless of the content of the answer.
- Each person and each organization has its own “dormant errors or resident pathogens” waiting for a trigger to materialize.
- We must avoid superficial responses that rely only on experience or common-sense criteria. We must deepen both the questions and the answers for every problem.
- The way the mind searches memory for a pattern, based on what is similar and frequent rather than on all the information available, makes us permanently prone to error. Developing habits that prevent these mistakes is our goal as professionals.
- In high reliability organizations (HROs), the main process is controlled by computers (up to 320 applications on an Airbus), while one person or supervisor oversees the operation of the whole system, because computers do not make mistakes but the people who program them or feed them information do. This is the principle of redundancy.
In short, there are recommendations and methodologies for reducing adverse events. We have not compiled a list of procedures here; we have concentrated on the conceptual and attitudinal.
Understanding the psychological dimension of an error or a violation leads us to replace punitive behavior with deeper investigation.