Human-Systems Integration is essential for Safe Automation
Robin Burgess-Limerick
Professorial Research Fellow seconded to Think and Act Differently, BHP
The National Robotics Strategy discussion paper highlights the potential benefits of robotics and automation for Australian industry. As well as increasing productivity, industrial automation also has great potential to reduce safety risks by removing people from hazardous situations.
However, automation does not remove people from the system; it changes the tasks they undertake. Automation can also introduce new types of errors and, potentially, new hazards. An overriding focus on safety is therefore critical to the successful introduction of automation.
The choice of functions to be automated requires consideration of the capabilities and limitations of humans. People are good at perceiving patterns, and they adapt, improvise and accommodate quickly to unexpected variability; they are not good at precise repetition of actions or at vigilance tasks. However, system design requires more than allocating functions to person and machine. The challenge is to identify how the operator/supervisor and automated components can collaborate to perform the functions required and achieve both safety and health benefits, as well as improved productivity. Interactions between system maintainers and automated components also require detailed consideration.
Potential adverse safety outcomes of automation
The introduction of automated components to a system frequently changes the role played by people from continuous active control to passive supervision. In an extreme case, the human may be assigned the role of “superhero”, with responsibility for taking back control of the system at short notice to avert disaster in the case of automation shortcomings or failure. An assumption that this arrangement is satisfactory, even temporarily, betrays a lack of understanding of human limitations.
The introduction of automation that places humans into supervisory roles can lead to degradation of manual control skills. Introducing automation can also change the type and extent of information available to equipment operators by removing them from direct contact with the process being controlled. Locating the system supervisor (control room operator) remotely from the automated components may reduce the sources of information that can be used to monitor the system, and in particular, to detect and diagnose the causes of departures from normal operation.
Both the change from manual control and the reduction in directly available information potentially lead to loss of situation awareness, and consequent delays in responding when a human is required to act because the system has been perturbed beyond its normal operating range. The need to maintain situation awareness is increased, rather than diminished, by the addition of automation, because supervisors must maintain awareness of the functioning of the automated components as well as information about the base system.
The designer’s challenge is to assist the control room operator responsible for supervising the system to maintain situation awareness by determining what information is required by the operator, and how this may be provided without overwhelming the operator with data. The design of the interfaces by which information is conveyed to people within the system becomes a critical concern.
Combining data into meaningful information through the design of visual interfaces with emergent properties that correspond to system-relevant parameters is one approach which may be helpful, as is placing information in a meaningful context and/or integrating automation-related information with traditional displays. Other options are to create interfaces that predict future states of the system and/or to provide information through multiple sensory channels.
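As an illustration of the predictive-display idea, the sketch below linearly extrapolates a monitored process variable to estimate the time until it crosses an alarm threshold. The tank-level scenario, function name and numbers are hypothetical, not drawn from any particular system.

```python
def predicted_time_to_threshold(current_value, rate_of_change, threshold):
    """Seconds until the variable crosses the threshold at the current
    rate of change, or None if the trend is moving away from it."""
    if rate_of_change == 0:
        return None  # steady state: no crossing predicted
    time_remaining = (threshold - current_value) / rate_of_change
    return time_remaining if time_remaining > 0 else None

# Hypothetical example: tank level at 75 units, rising 0.5 units/s
# toward a 90-unit alarm limit.
print(predicted_time_to_threshold(75.0, 0.5, 90.0))  # 30.0 (seconds)
```

Displaying such a projection alongside the raw value gives the supervisor advance warning rather than requiring them to infer the trend themselves.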
One interface design strategy typically employed is the provision of auditory and/or visual alarms which signal the supervisor to direct their attention to a potentially abnormal situation. However, if the system frequently alarms when action is not actually required, then it is predictable that such nuisance alarms will increase the probability that abnormal states will be ignored, with potential safety consequences.
Interfaces are also used by human supervisors to provide input to direct the actions of the automated components of the system. Errors in these inputs have potential to lead to adverse safety outcomes if not detected and corrected. Errors could include inaccurate information about roadway or dump location, for example. Timely validation of supervisory input is an important, albeit non-trivial, aspect of interface design. Input errors may also be caused by a control room operator’s confusion between different operational, or control, modes.
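One minimal form of such input validation is to check a proposed dump location against a permitted zone and against active equipment before the automation acts on it. The sketch below is hypothetical: the `Zone` class, the 50 m exclusion radius and the coordinate scheme are all assumptions, not a real system’s rules.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    """Axis-aligned rectangular permitted zone (hypothetical model)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x, y):
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def validate_dump_location(x, y, permitted_zone, active_equipment):
    """Return a list of problems; an empty list means the input is accepted.

    active_equipment is a list of (x, y) positions of other machines.
    The 50 m exclusion radius is an assumed, illustrative value.
    """
    problems = []
    if not permitted_zone.contains(x, y):
        problems.append("dump location outside permitted zone")
    for ex, ey in active_equipment:
        if abs(ex - x) < 50 and abs(ey - y) < 50:
            problems.append("dump location too close to active equipment")
    return problems
```

The point is not the geometry but the pattern: supervisory inputs are checked, and the operator receives specific, timely feedback on what is wrong rather than a silent acceptance or a generic error.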
At the same time, the span of control of an individual placed in a supervisory role within an automated system is likely to increase. Delays in receiving feedback from actions, including errors, may also increase; combined with a reduction in the number of operators, this potentially reduces the probability that errors will be detected and corrected.
Another potential issue associated with the introduction of automation is so-called ‘clumsy automation’, in which easy tasks are automated while complex tasks are left for a human operator, sometimes because they are too difficult to automate. This can mean that workload is reduced during already low-workload phases of work, while remaining unchanged, or even increased, during high-workload operations because of the cognitive overhead associated with engaging and disengaging automation. Control room operator workload requires careful consideration and evaluation, and again, the design of interfaces is a critical part of ensuring the operators’ cognitive workloads remain manageable.
The behaviour of people will be changed by the introduction of automation, potentially with unanticipated consequences. One dimension of the human response relates to the trust people have in the automation technology. People in the system may come to over-trust the automation, either failing to note and respond to automation failures (particularly when such failures are rare) or altering behaviour in ways that reduce the intended safety benefits of automation.
These examples highlight the importance for system safety of ensuring that human characteristics and limitations are considered during the implementation of automation. Introducing additional automation will only result in improved safety if the joint system that emerges from the combination of human and automated components is designed to function as a whole. The impact of automation on current and potential future employees also requires examination to ensure the change is managed for optimal safety and health outcomes. This objective may be achieved through utilising human-systems integration processes during system design and introduction.
Human-Systems Integration
Human-systems integration (HSI) refers to a set of systems engineering processes, originally developed by the US defence industry, that ensure human-related issues are adequately considered during system planning, design, development, and evaluation.
There are six core domains of human-systems integration that are relevant to the introduction of industrial automation:
“Staffing” concerns decisions regarding the number, and characteristics, of the roles that will be required to operate and maintain the joint human-automation system.
The “personnel” and “training” domains concern, respectively, the characteristics of the personnel who will be selected to fill those roles, and the extent and methods of training and competency assessment involved in preparing personnel to obtain and maintain the competencies (knowledge, skills, and abilities) required for safe operation and maintenance of the joint human-automation system. Training requirements for operators interacting with highly autonomous systems are likely to increase rather than decrease, because the operation of the automation must be fully understood. For example, automated system controllers need to understand the system hazards and logic, the reasons behind safety-critical procedures, the potential results of overriding controls, and how to interpret feedback. Skills for solving problems and dealing with unanticipated events are also required, and emergency procedures must be over-learned and frequently practised.
The design of training should encompass a structured process incorporating a training needs analysis leading to the definition of the functional specifications; an iterative design component incorporating usability testing; and evaluation. The use of simulation is a promising method for allowing trainees to be exposed to rare events, and for competency assessment.
“Human-factors engineering” encompasses the consideration of human capabilities and limitations in system design, development, and evaluation. In the automation context, this is particularly important in the design of interfaces between people and automated components. Methods employed in human factors engineering include task analyses and human performance measures (e.g., workload, usability, situation awareness) as well as participatory human-centred design techniques.
ISO 9241-210 provides the following principles for human-centred design of computer-based interactive systems, which are relevant to many automation projects:
“a) The design is based on an explicit understanding of users, tasks and environments;
b) users are involved throughout design and development;
c) the design is driven and refined by user-centred evaluation;
d) the process is iterative;
e) the design addresses the whole user experience;
f) the design team includes multidisciplinary skills and perspectives”.
The “safety” domain includes consideration of safety risks. Relevant methods include traditional risk analysis and evaluation techniques such as hazard and operability studies (HAZOP), layers of protection analysis (LOPA), failure modes and effects analysis (FMEA), as well as functional safety analyses, and systems-theoretic process analysis.
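To make one of these techniques concrete: FMEA commonly ranks failure modes by a risk priority number (RPN), the product of severity, occurrence and detection ratings, each scored from 1 (best) to 10 (worst). The failure modes and ratings below are hypothetical examples for an automated haulage context, not real assessments.

```python
def risk_priority_number(severity, occurrence, detection):
    """FMEA risk priority number; each factor is rated 1 (best) to 10 (worst)."""
    for rating in (severity, occurrence, detection):
        if not 1 <= rating <= 10:
            raise ValueError("ratings must be between 1 and 10")
    return severity * occurrence * detection

# Hypothetical failure modes for an automated haulage system:
# (name, severity, occurrence, detection)
modes = [
    ("GPS position drift", 8, 4, 6),
    ("Mode-confusion input error", 7, 5, 7),
    ("Nuisance proximity alarm", 4, 8, 2),
]

# Rank failure modes from highest to lowest RPN to prioritise controls.
for name, s, o, d in sorted(modes, key=lambda m: -risk_priority_number(*m[1:])):
    print(f"{name}: RPN = {risk_priority_number(s, o, d)}")
```

As the surrounding text notes, this style of analysis assumes largely linear, component-failure causality; STPA, discussed next, drops that assumption.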
Systems-theoretic process analysis (STPA) in particular may be useful for analysis of complex systems involving automated components because both software and human operators are included in the analysis. STPA is a proactive analysis method that identifies potential unsafe conditions during development and avoids the simplistic linear causality assumptions inherent in HAZOP, LOPA, and FMEA. Safety is treated as a control problem rather than a failure prevention problem. Unsafe conditions are viewed as a consequence of complex dynamic processes that may operate concurrently. STPA also includes consideration of the wider, dynamic, organisational context in which the automated system is situated.
The “health” domain encompasses the use of risk management techniques, and task-based risk assessment in particular, to ensure that the system design minimises risks of adverse health consequences to system operators and maintainers, and indeed, anyone else potentially impacted by the system activities. These analyses should encompass all operational and maintenance activities associated with the autonomous component or system.
An overall focus on human-systems integration includes consideration of interactions and potential trade-offs between decisions made in different domains. For example, decisions regarding automation and interface complexity may influence personnel characteristics and training requirements, as well as the anticipated number of people required for system operation and maintenance.
Systems engineering involves three stages: (i) analysis; (ii) design and development; and (iii) testing and evaluation. Human-systems integration incorporates human-centred analysis, design and evaluation within the broader systems-engineering process. That is, human-systems integration is a continuous process that should begin during the definition of requirements for any automation project, continue throughout system design, and throughout commissioning and operation to verify that safety goals have been achieved.
Any introduction of automation should, at a minimum, include a human-centred design process of the kind described in NASA standard 3001.
Conclusion
It is important that the National Robotics Strategy emphasises human-systems integration and human-centred design to ensure that the resulting joint human-automation systems operate safely and effectively, while also promoting diversity, inclusion and overall societal well-being. This will require a regulatory framework that includes the assessment and management of human-system interaction risks across the whole system life-cycle.