Sailing is a combination of science and art. For those who want to know more about it, Albert De Nijs, instructor at De Zeezeilers van Marken, writes a weekly column with tips & tricks from the Royal Yachting Association.
When dealing with human errors, are we looking to fix the blame on a person, or should we be looking to fix the underlying issue and thereby improve the system?
After a maritime accident or incident, the Dutch Board for Transport Safety has extensive powers to investigate and gather information. If the crew is found to have acted contrary to the rules of good seamanship, they must appear before the Maritime Disciplinary Tribunal.
Putting blame on someone who made a mistake may sound logical, but it is not the most effective way to improve the system that person was working in. Professionals seldom set out to cause unsafe situations on purpose. Most accidents are the consequence of the tools and tasks the organization provided. It could very well be a symptom of trouble deeper inside the system.
The outcome of an incident (the seriousness of the result) often determines how the mistake itself is classified and, more importantly, the consequences for the crew involved: a classic case of hindsight bias.
In a system approach, you look at the human in the system instead of the human versus the system. To make the system safer and more robust, you want your professionals to report potentially unsafe situations, even if they themselves were involved. Most incidents are only reported when there are undeniable consequences, such as external witnesses or damage. That means we cannot learn from all those mistakes without consequences.
The airline industry uses a 'just culture', in which learning safety lessons is paramount. Only in cases of willful misconduct or gross negligence can legal action be taken. If you want your employees to report safety issues, everyone must accept that people make errors, and employees must not be punished simply for making one. The aim is to find out what went wrong and why it happened, and to act in order to prevent it from happening again. If one person makes a mistake, the same thing could almost certainly happen to someone else. People should feel safe to report issues that may implicate themselves, in order to improve the system.
For example, suppose a watchkeeper falls asleep during his watch. If nothing happens, this will most likely not be reported. If an incident occurs during the watchkeeper's nap, there will be serious (personal) consequences. Yet the situation is the same: the watchkeeper was not alert. The underlying reason could be almost anything, but if it is not reported, the system cannot learn and similar situations cannot be avoided.
The punitive approach used in the maritime environment does not help to solve these problems, because frequently the system itself is (also) at fault. Let's fix the issue, not the person!
Albert de Nijs, Dutch Offshore Sailing Academy