Human error is frequently cited as a contributing factor in accidents, but what is it really? Although the idea of human error has been around since the early 1900s, it was not until the Three Mile Island nuclear accident in 1979 that human error became a real target of scientific study and intervention in the broader safety sciences. Following Three Mile Island, the leading safety and error researchers of the twentieth century considered the issue of human error across a number of conferences.
Two different schools of thought emerged from this period: the Joint Cognitive Systems school and the Cognitive Psychological school. James Reason, a professor of psychology, is arguably the best-known representative of the cognitive psychological school. Back in the 1980s, following a number of personal observations, Reason researched the errors people make in their daily lives. By studying diary entries in which people reported their blunders, he established a theory of 'absent-minded slips'. After the Three Mile Island catastrophe, Reason became interested in the mistakes made by operators of high-risk systems, and he began to analyse these mistakes using much of the theory he had already developed for everyday slips.
The cognitive psychological school (CPS) views error as a fact of life. In his 1990 book Human Error, Reason established four forms of errors or risky acts: slips, which are failures of attention; lapses, which are memory failures; mistakes, which in Reason's view can be knowledge-based or rule-based; and violations, which are deliberate deviations from a rule or procedure. According to Reason and his human error taxonomy for high-risk operations, understanding and classifying human error can provide explanations for accidents: a cause can be identified as human error. The slip, lapse, mistake, and violation thus become cognitive psychological models employed to explain behavior.
At the Risø nuclear research facility, a different school of thought emerged: Professor Jens Rasmussen, together with colleagues Erik Hollnagel and David Woods, developed the joint cognitive systems school (JCSS). While both the CPS and the JCSS aimed at managing complicated, high-risk work processes, they arrived at radically different conclusions.
The Risø team did not begin with everyday human mistakes or laboratory trials. Instead, they created a naturalistic school that studied high-risk work as it actually unfolds. Rather than investigating error as a psychological or cognitive construct, they looked at errors as the result of intricate interactions between actors in space and time.
Their inquiries included questions such as: How do people and machines communicate? What constrains human behavior? What guidelines should be followed when designing technical interfaces so that people can comprehend the state of the system?
Rasmussen, Hollnagel, and Woods arrived at a startling conclusion: human error is not the root cause of an accident but a symptom of larger systemic issues intrinsic to the work process. Hollnagel considered the very concept of human error an analytical dead end. They contended that invoking human error after an accident was too simplistic, and perhaps not even amenable to rigorous academic analysis. This analytical choice has important ramifications.
Consider, for instance, an unintended injury to a patient after surgery. The two schools will offer different ways to improve the system, depending on how we describe the error.
The cognitive psychological school will concentrate on system interventions at the cognitive level and will emphasise motivation, selection, proceduralisation, and functional allocation, that is, which jobs should be carried out by humans and which by technology. The joint cognitive systems school will analyse the accident in terms of the factors that ultimately led to what appears to have been a human error, both in timing and in hierarchy. It considers the gap between work-as-imagined and work-as-done, and how we have configured humans and technology in their working environments. Its adherents will also consider intrinsic and extrinsic motivators and their interplay throughout the phases of the event.
In order to improve our systems, we will need to decide which accounts of human error we find trustworthy and consider to be true. The perspectives expressed in the stories told by various stakeholders, including accident investigation teams, safety managers, unions, and the media, will vary. As a result, it becomes important to consider who is telling the tale, how it is told, and perhaps why it is being told in a particular way.