Operator Staffing Analysis Using the AIM Workload Measurement Tool
Stephen Maddox
I help engineers improve situation awareness in the control room so operators perform at the highest level when all hell breaks loose.
UCDS (User Centered Design Services, www.mycontrolroom.com) was founded in 2000 by Ian Nimmo. From his experience as founder and chairman of the ASM Consortium, Mr. Nimmo saw a need in the industry for leadership in applying a human factors approach to operator performance. He combined the knowledge and expertise gained over many years in industry with ASM Consortium research to help clients understand and apply best practices in the control room.
The staffing / workload measurement methodology began when a large US refiner came to UCDS for help making sense of a staffing study they had commissioned themselves.
Their method had assigned different workload scores to three near-identical units. How could this be? The equipment was the same on all three units, so they had expected similar workload scores, yet one unit scored significantly higher than the other two. The data suggested that the two lesser-loaded units be consolidated into a single operator position.
UCDS reviewed the method and found that a traditional time-and-motion study had been used. During the study, one unit was undergoing a significant and rare upset. Based on the number of tasks the operator performed during the upset, the team assigned that unit a much higher workload score. The upset skewed the analysis, and UCDS concluded that time and motion is not an appropriate basis for scoring workload.
Research revealed that workload is related to the amount of "attention" an operator must pay to the equipment under their control, so an "equipment responsibility" analysis would provide a much more accurate score. Some equipment requires little to no interaction when things are normal, while other equipment requires a high level of attention, intervention, and management during the shift. Regardless, every piece of equipment carries some level of workload. Operator equipment can be divided into five categories:
- Storage – tanks, spheres, bullets, silos, etc.
- Transfer – moving material from one place to another, e.g. pumps
- Ancillary processing – e.g. compressors
- Supplemental transformation – e.g. distillation columns
- Material transformation – e.g. reactors
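To make the category idea concrete, here is a minimal sketch of how an "equipment responsibility" tally could work. All the numeric weights, equipment tags, and the scoring function itself are hypothetical illustrations; UCDS's actual scoring matrix is proprietary and not shown in this article.

```python
# Hypothetical base attention weights per equipment category
# (higher = more operator attention required). Values are invented.
CATEGORY_WEIGHTS = {
    "storage": 1,                      # tanks, spheres, bullets, silos
    "transfer": 2,                     # pumps
    "ancillary_processing": 3,         # compressors
    "supplemental_transformation": 4,  # distillation columns
    "material_transformation": 5,      # reactors
}

def equipment_complexity(equipment_list):
    """Sum the category weight for each piece of equipment on a console."""
    return sum(CATEGORY_WEIGHTS[category] for _, category in equipment_list)

# Hypothetical equipment list for one operator position.
unit = [
    ("TK-101", "storage"),
    ("P-101", "transfer"),
    ("C-201", "ancillary_processing"),
    ("T-301", "supplemental_transformation"),
    ("R-401", "material_transformation"),
]
print(equipment_complexity(unit))  # 1+2+3+4+5 = 15
```

The point of the sketch is that the score comes from what the operator is responsible for, not from how busy the operator happened to be during the observation window.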
With the help of a global refiner we developed a scoring matrix that assigns a workload complexity score to each piece of equipment within each of the five categories. This became the "equipment complexity" score.
Ian knew the workload could not be measured without also adding "control complexity". Loop count alone is not a useful measure, but combined with equipment complexity it makes more sense. An operator may have 250 loops or 1,000; what matters is how the operator is impacted by the process and the effectiveness of the controls:
- Are controllers in automatic?
- Do operators have to make a lot of manual moves?
- Is alarm management under control?
Ian combines loop information with data from the control system to determine a "control complexity" score. With the equipment complexity and control complexity scores in hand, Ian identified a third factor: "interface complexity".
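The three questions above suggest how control-system data could feed such a score. The sketch below blends the fraction of loops left in manual, operator moves per hour, and alarm rate into a single number. The weightings and the formula are invented for illustration; they are not UCDS's actual calculation.

```python
def control_complexity(total_loops, loops_in_manual,
                       manual_moves_per_hr, alarms_per_hr):
    """Hypothetical control-complexity score from control-system data."""
    pct_manual = loops_in_manual / total_loops  # loops needing hands-on control
    # Illustrative weighted blend: manual loops dominate, then operator
    # moves, then alarm load.
    return round(100 * pct_manual + 2 * manual_moves_per_hr + alarms_per_hr, 1)

# A well-automated console: 250 loops, few in manual, quiet alarm system.
print(control_complexity(250, 10, 3, 6))    # 4.0 + 6 + 6 = 16.0
# A poorly automated console scores far higher despite the same loop count.
print(control_complexity(250, 80, 20, 30))  # 32.0 + 40 + 30 = 102.0
```

Note that both consoles have the same loop count; the score difference comes entirely from how the loops are run, which is the article's point about loop count alone being insufficient.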
If operators have their own feed tanks, processing equipment, and product tanks they are, in general, masters of their own destiny. If they are taking feed from another console, sending product to one, or relying on utilities controlled by others, they will have to communicate more with others to coordinate activities. This matters even more when the systems are dynamic, such as taking hot feed directly from another console's column. These feed interactions are given a score. All three scores (equipment complexity, control complexity, and interface complexity) are combined to calculate a weighted score for each position.
Even with this complex analysis we still need to consider "other duties". What if a console operator must coordinate call-ins, take external telephone calls, write work permits, or do an outside round? These duties are also taken into account when calculating the overall loading score.
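A weighted combination of the three complexity scores plus an "other duties" adder could look like the sketch below. The weights, duty adders, and input scores are all hypothetical; AIM's actual weighting is not disclosed in this article.

```python
# Hypothetical relative weights for the three complexity dimensions.
WEIGHTS = {"equipment": 0.4, "control": 0.4, "interface": 0.2}

# Hypothetical workload adders for common non-console duties.
OTHER_DUTIES = {"work_permits": 5, "external_calls": 3, "outside_round": 8}

def position_workload(equipment, control, interface, duties=()):
    """Weighted blend of the three scores plus any extra-duty adders."""
    base = (WEIGHTS["equipment"] * equipment
            + WEIGHTS["control"] * control
            + WEIGHTS["interface"] * interface)
    return base + sum(OTHER_DUTIES[d] for d in duties)

score = position_workload(60, 45, 20, duties=("work_permits", "outside_round"))
print(score)  # 0.4*60 + 0.4*45 + 0.2*20 + 5 + 8 = 59.0
```

Separating the duty adders from the weighted base makes it easy to see how much of a position's load comes from the console itself versus everything bolted on around it.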
How is the data combined and calculated? Ian developed a tool called the Automation Index Model (AIM). AIM houses the information gathered from the equipment list, the operator event/alarm data, and the feed data, and calculates the workload score for each operator job. With AIM you can also run "what if" scenarios, such as "what if we combined these two jobs into a single position?" AIM recalculates the score so you can see how the workload is impacted before you make the change. It also shows how the workload score would change if you improved control/automation, improved alarms, or added or removed equipment.
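The "what if" idea can be sketched as recomputing a score over merged position data. The scoring function here is a stand-in (a simple sum of the per-position component scores), the 10% control-overlap adjustment is invented, and none of this reflects AIM's internal model.

```python
def workload(position):
    # Stand-in score: sum of the three complexity components.
    return position["equipment"] + position["control"] + position["interface"]

def combine_positions(a, b):
    """Hypothetical 'what if' case: merge two positions. Equipment and
    interfaces add directly; assume shared control-system overlap trims
    10% of the combined control score (an invented adjustment)."""
    return {
        "equipment": a["equipment"] + b["equipment"],
        "control": round(0.9 * (a["control"] + b["control"]), 1),
        "interface": a["interface"] + b["interface"],
    }

unit_a = {"equipment": 30, "control": 20, "interface": 10}
unit_b = {"equipment": 25, "control": 15, "interface": 5}

merged = combine_positions(unit_a, unit_b)
print(workload(unit_a), workload(unit_b))  # 60 45
print(workload(merged))                    # 55 + 31.5 + 15 = 101.5
```

The value of this kind of case study is that the merged score can be compared against a benchmark before anyone's job actually changes.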
Once the data is received, we come to the site to conduct interviews with all the operators. This time is extremely valuable for identifying "other duties" as well as confirming the accuracy of the data. The interview process is not required but is recommended. The information captured during the interviews provides great insight into how the process, operation, management systems, interfaces, and environment impact the workload. It also allows us to identify gaps that may exist within the operation and have a direct impact on operator performance.
In the workload report we show a bar chart of every operator position and its score, allowing you to compare each position across the operation. We include two industry benchmarks in the chart so you can see the operator scores alongside the industry average score, which is based on the cumulative average of past studies.
We also show the pacesetter score alongside the operator score. The pacesetter score is based on the "loading achievable" by companies recognized as top-quartile performers in operational management systems such as training, management of change, documentation, hiring, and fatigue risk management. It can be beneficial to compare your workload scores against your competition.
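A minimal benchmarking comparison could look like the following. The benchmark values and position scores are invented for illustration; the real industry-average and pacesetter figures come from UCDS's study history.

```python
# Hypothetical benchmark values (real figures come from past studies).
INDUSTRY_AVG = 55.0  # cumulative average of prior studies (invented)
PACESETTER = 45.0    # top-quartile achievable loading (invented)

# Hypothetical position scores for one site.
positions = {"Crude console": 62.0, "FCC console": 48.0, "Utilities": 39.0}

for name, score in positions.items():
    vs_avg = "above" if score > INDUSTRY_AVG else "at/below"
    vs_pace = "above" if score > PACESETTER else "at/below"
    print(f"{name}: {score} ({vs_avg} industry avg, {vs_pace} pacesetter)")
```

Reading each position against both benchmarks makes it immediately clear which consoles are overloaded relative to peers and which have headroom.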
In some cases we may have already provided this study for other sites within a client's corporation. When we do a study for a site that is part of a large company with multiple sites, we can provide the enterprise average to allow direct fleetwide comparisons of operator loading.
All the data entered into the AIM tool is based on normal operations only. Companies staff for normal operations, not abnormal ones; however, it is important to make sure your staffing design is also effective during abnormal situations. We use a separate method for assessing workload during abnormal situations such as loss of power, equipment failure, or fire.
Suppose you want to make a change to a job, for example combining two positions into a single job. With AIM you may prove that the workload score is manageable, but how do you know the operator can still safely manage the workload during an upset or emergency? Just as you would not make an equipment change without an MOC, you should not make a staffing change without making sure it is safe.
Hence the development of an extended Management of Organization Change (MOOC) methodology to ensure abnormal situations can be safely and effectively managed. This is a recommended practice to perform before any staffing change is made. We use a ladder assessment tool to identify a list of predefined probable abnormal situations, and then we assess every task or step from detection to safe state to determine whether the operator can safely and effectively manage those situations.
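The ladder assessment described above can be sketched as walking each scenario's task list from detection to safe state and checking it against the time the operator has available. The scenarios, task durations, and time budget below are all invented for illustration; the actual ladder tool and its criteria are UCDS's.

```python
# Hypothetical scenarios, each a list of (task, minutes-needed) steps
# from detection through to verified safe state.
scenarios = {
    "loss of power": [
        ("detect loss of power", 1),
        ("trip feed pumps", 2),
        ("verify safe state", 5),
    ],
    "fire near reactor": [
        ("detect fire alarm", 1),
        ("initiate emergency shutdown", 3),
        ("notify emergency response", 2),
        ("verify safe state", 10),
    ],
}

TIME_BUDGET_MIN = 12  # hypothetical time available to reach safe state

for name, tasks in scenarios.items():
    total = sum(minutes for _, minutes in tasks)
    verdict = "manageable" if total <= TIME_BUDGET_MIN else "needs support"
    print(f"{name}: {total} min -> {verdict}")
```

A scenario whose task ladder overruns the time budget flags that the proposed staffing design needs extra support, which is exactly the check the MOOC step exists to force before a change is made.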
During a staffing study we always provide at least one MOOC study on a unit the client chooses, and we provide training as we go through the steps so the client can perform them in the future. We also provide a MOOC procedure document that can be incorporated into their MOC standard.
How do we kick off a project? Once we receive the go-ahead, we provide a full data collection plan. It covers alarm and operator event data as well as P&IDs for equipment counts, PFDs, and org charts. Next, we schedule times for all the interviews. Our team is onsite usually for one week, depending on the size of the facility. We then go offsite to configure the AIM tool, calculate the workload scores, and develop the report. Finally, we schedule an online presentation to review the scores, discuss the findings, and cover any recommendations. This summary of the report provides great insight into your staffing design and options.
Measuring operator workload can be done scientifically, mathematically, and effectively when you use a proven method and tool. It is critical to have a process engineer support the effort and ensure the data is accurate. We have been invited by unions, corporations, and Solomon to apply our method. We have developed the workload industry average score for many industries and pride ourselves on pioneering this methodology. Please visit www.mycontrolroom.com to see our client testimonials.