Human Success: Old wine in new bottles, or a shift of mindset for HRA in an automated world?
A really interesting conference paper from Andreas Bye, discussing whether shifting Human Reliability Analysis (HRA) terminology from human error to human success would help alleviate some of the blame connotations attached to the field.
Also discussed is the human role in automated systems.
It was meant to be a mini-post with a few dot-points and a couple of images, but was so interesting that it blew out to an article…
Worth checking out, but some points:
· “Words matter. [HRA] has been hampered by a negative connotation around the concept of human error. Many people feel that this concept is related to blaming humans. Why doesn’t the HRA community redefine the vocabulary and call it human success?”
· “Most HRA practitioners would say that the term human error is neutral and has nothing to do with blaming”
· That is, “HRA analyzes the context and predicts the way in which humans are set up for error and quantifies its probability”
· “It is a fallacy to believe that the human role can be excluded from the safe operation of an automated, complex industrial plant, or an airplane”
· “The human will be given a role, and we have to design that role to be adapted to the human strengths and weaknesses”
· Many people, including HF/E practitioners, “view human error as a blaming term. However, it is not. The definition of human error is failure of a Human Failure Event (HFE). An HFE is any event in which humans are involved in a scenario”
· “So the choice to call it human error is a choice to focus on the failure branch of a binary tree. In the analysis itself, of course the success branch is as important”
· Using human error as a term may “give the impression that we are studying weaknesses of operators, and the reliability of human actions decoupled from the environment that they are in. This is clearly a misconception”
· “HRA analyzes the operators’ environment and context and how this sets them up for success or failure”
· HRA has relied on demarcating Errors of Omission (EOO) and Errors of Commission (EOC)
· EOO is said to be the classic normative view of error – “If the human follows the procedures everything will be fine and it is the fault of the human to do something else than prescribed”
· HRA analyses tasks and performance shaping factors, e.g. “Adequate time to react is always an important factor … It is also a good example of why human error can be a wrong term in such cases. If the time available to mitigate an accident by the procedures at hand is far less than the time required, their chance to succeed is almost zero. However, it is still called human error”
· “We could as well call this case human success, and analyze whether the context (in this case the available time) gives the human any chance whatsoever to succeed. Does the context, the environment and the situation, set the human up for success?”
· He argues that a human error event and a human success event can both end in success or failure, and that HRA simply calculates the probability of each – e.g. “if the Human Error Probability (HEP) is 0.1, the Human Success Probability is 0.9” (illustrated in the sketch after this list)
· So practically it makes little difference what they’re called, but changing the term may shift focus away from error, a term which carries baggage
· Hence, human error could become human success, which may “give a more positive view on what we are doing. Most importantly, we could get rid of the impression that we are studying actions and situations in which humans are blamed for failures that are happening”
· Also, it would highlight the role of people as “part of saving the day, by using the tools at hand (safety systems and procedures)”
· Retaining the error term may nevertheless be useful in HRA, since it “may reduce the search space by looking for ways to make errors compared to a guideline or procedure”; e.g. searching and counting thousands of potential successful performances is more challenging than enumerating a few ways of failure
· “many human errors are unavoidable, and we should learn to live with them, and build joint cognitive systems that can handle variability both in human behaviour and in system behaviour”
· “HRA actually searches for the variability in performance that is discussed in e.g., Resilience Engineering (Hollnagel et al., 2006), and evaluates the context for human events and thus their probability for error, or success”
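Bye’s complementarity point is easy to make concrete. Below is a minimal sketch (my own illustration, not code from the paper) of the binary tree he describes: each Human Failure Event splits into a failure branch with probability HEP and a success branch with the complementary probability.

```python
# Minimal sketch (my illustration, not from the paper) of the binary
# success/failure tree: for each Human Failure Event (HFE), the failure
# branch has probability HEP and the success branch probability 1 - HEP.

def human_success_probability(hep: float) -> float:
    """Complement of the Human Error Probability, e.g. HEP 0.1 -> HSP 0.9."""
    if not 0.0 <= hep <= 1.0:
        raise ValueError("HEP must be a probability in [0, 1]")
    return 1.0 - hep

def sequence_success_probability(heps: list[float]) -> float:
    """Success probability across several HFEs, assuming (simplistically)
    independent events; real HRA models dependence between events."""
    p = 1.0
    for hep in heps:
        p *= human_success_probability(hep)
    return p

print(human_success_probability(0.1))             # 0.9
print(sequence_success_probability([0.1, 0.01]))  # 0.9 * 0.99 = 0.891
```

Whether we report 0.1 as an error probability or 0.9 as a success probability, the analysis of context and performance shaping factors behind the number is identical; only the framing changes.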
Next, the role of people in automated systems is briefly discussed. Some points:
· The role of operators in modern automated systems is “described to be more monitoring, and less action oriented”
· It’s proposed: “If it is possible to predict all things that are going to happen, one might as well make a completely autonomous system. However, if this cannot be guaranteed, one must include a human to collaborate in some way with the automatic system”
· There are several important questions about the joint system – some being “Should the operator just monitor the automatic system, and should there be some sort of self-reporting of problems from the system?”
· Another is whether the operator should be more “on-line collaborating with the automation and do some tasks? There are a number of questions here, including long-term skill retention”
· But importantly, design should avoid “making the human a pure backup for the automation”
· A potential issue is emerging in automated vehicles, where the autopilot, not designed for a particular situation, “simply hands over the control to the driver. One must avoid situations in which this is done at critical points in time” (see the sketch below)
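To make the handover concern concrete, here is a hypothetical guard (my own sketch; the function name and threshold are illustrative assumptions, not from the paper) tying the vehicle example back to the “adequate time to react” factor: control is only returned to the human when the remaining time exceeds a plausible takeover time.

```python
# Hypothetical sketch: refuse to hand control back to the driver at
# critical points in time, i.e. when the remaining time budget is
# smaller than an assumed human takeover time.

ASSUMED_TAKEOVER_TIME_S = 10.0  # illustrative figure, not from the paper

def may_hand_over(time_to_critical_event_s: float) -> bool:
    """Allow handover only when the human has adequate time to react."""
    return time_to_critical_event_s > ASSUMED_TAKEOVER_TIME_S

print(may_hand_over(30.0))  # True: the margin sets the human up for success
print(may_hand_over(2.0))   # False: handover now would set them up to fail
```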
Ref: Bye, A. (2024). Human Success: Old wine in new bottles, or a shift of mindset for HRA in an automated world? 17th International Conference on Probabilistic Safety Assessment and Management & Asian Symposium on Risk Assessment and Management (PSAM17&ASRAM2024), 7–11 October 2024, Sendai International Center, Sendai, Miyagi, Japan.
Health & Safety - Expert Witness and Consultant | FIIRSM, FRSPH, MISTR, EurOSHM
3 hours ago: A really nice paper Ben, one that makes a clear point about the nature of reliability and the human component in systems. I will source the original on this one, and may well end up quoting some of the nice pithy statements in my work. There is far too much blame in the follow-up to accidents, and this paper appears to make it clear that people are 'set up for failure' by bad workplace settings (machines, environments etc.). Something I find myself having to explain to lawyers, repeatedly.
Certified Work Health and Safety Professional and leader ~ Here to support and learn // my comments are mine, based on my professional experience, and not reflective of my employment
4 hours ago: Love the title on this one.
Solver of Complex Challenges & Driver of Innovative Solutions | Team Player with a Creative Edge | plus a little joie de vivre.
7 hours ago: I enjoyed reading the paper, but I didn’t find profit and time constraints weighted in any acceptable manner. Distractions are human error but can be caused by the environment, which I felt was present in the paper but not nearly strongly enough. I would bet that in a word cloud of man-made events, “pressure” would be near the top.
CEO at vPSI Group, LLC where we're making the world a better place, one company at a time.
13 hours ago: Ben Hutchinson IMHO the snippet you have quoted mixes up two entirely different things. Human error (if it is actually present) is causal towards an unplanned event. In contrast, to mitigate something means to make it less severe, intense, or unpleasant, i.e. it's about reducing the consequences of an unplanned event: cause(s) -> unplanned event -> consequences
HSE Leader / PhD Candidate
13 hours ago: Study link: https://www.iapsam.org/PSAM17/program/Papers/PSAM17&ASRAM2024-1100.pdf My site with more reviews: https://safety177496371.wordpress.com Shout me a coffee: https://buymeacoffee.com/benhutchinson