Creation of Cognitive Human-Machine Interfaces and Interactions for Industry 4.0 Applications - a research proposal
Sathish Kasilingam
Product Manager | Avid Learner | PhD Candidate | Manufacturing SW | Manufacturing AI
Abstract
Industry 4.0 calls for a high level of automation and connectivity across all systems. At present, automation is prevalent mainly in large organizations. However, a growing number of companies working in the Industry 4.0 space are bringing its components to manufacturing organizations of all shapes and sizes.
A key unsolved challenge of highly automated manufacturing and fabrication plants lies in the delicate balance between automation and the human operator’s involvement: increasingly automated plants tend to shift the human “out-of-the-loop”, a hazardous paradigm because it denies the human the ability to intervene and avoid incidents and accidents. New technological solutions should help establish the “human-on-the-loop” and “human-machine teaming” paradigms, which are inherently safer and allow the human to contribute to maximizing overall operational performance. Human-machine interfaces and interactions are where these challenging paradigms will largely be realized.
This research project aims to design and experimentally evaluate a suite of sensors that measure various physiological parameters of the human operator, including blink rate, fixations, saccades, heart/respiratory rate, and brain activity, to allow inferring her/his cognitive state. The specific sensors will be selected to minimize intrusiveness and cost. Custom Adaptive Neuro-Fuzzy Inference System (ANFIS) classification techniques will be used to infer cognitive states including mental workload, fatigue, stress, and attention.
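As a minimal illustration of how an ANFIS maps raw physiological readings to a cognitive-state estimate, the sketch below runs the forward pass of a two-rule first-order Sugeno system. All membership centres, widths, and consequent weights are invented placeholders; in the actual project they would be learned from labelled sensor data.

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership degree of x for a fuzzy set centred at c with width s."""
    return np.exp(-((x - c) ** 2) / (2 * s ** 2))

def anfis_forward(blink_rate, heart_rate):
    """Forward pass of a two-input, two-rule first-order Sugeno ANFIS.

    Rule 1: IF blink rate is LOW  AND heart rate is HIGH THEN workload = f1
    Rule 2: IF blink rate is HIGH AND heart rate is LOW  THEN workload = f2
    All numeric parameters are illustrative placeholders, not validated values.
    """
    # Layer 1: fuzzification (membership degrees)
    mu_blink_low, mu_blink_high = gauss(blink_rate, 8, 4), gauss(blink_rate, 25, 6)
    mu_hr_low, mu_hr_high = gauss(heart_rate, 65, 8), gauss(heart_rate, 95, 10)

    # Layer 2: rule firing strengths (product T-norm)
    w1 = mu_blink_low * mu_blink_high * 0 + mu_blink_low * mu_hr_high  # low blink + high HR
    w2 = mu_blink_high * mu_hr_low                                     # high blink + low HR

    # Layer 3: normalisation of firing strengths
    total = w1 + w2
    w1n, w2n = w1 / total, w2 / total

    # Layers 4-5: linear consequents, weighted sum -> workload score roughly in [0, 1]
    f1 = 0.005 * blink_rate + 0.004 * heart_rate + 0.3   # "high workload" rule
    f2 = 0.002 * blink_rate + 0.001 * heart_rate + 0.05  # "low workload" rule
    return w1n * f1 + w2n * f2

high = anfis_forward(blink_rate=6, heart_rate=100)   # few blinks, elevated heart rate
low = anfis_forward(blink_rate=28, heart_rate=62)    # frequent blinks, resting heart rate
print(high, low)
```

In a trained ANFIS the same five-layer structure is kept, but the membership and consequent parameters are fitted by hybrid least-squares/backpropagation against the labelled dataset.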
These states will be used to generate aural warnings, prompt a specific action from the human operator, and/or send alerts to supervisors and operational control centers. The level of automation and the human-machine interactions and interfaces will also be adapted to maximize the human operator’s involvement, enhancing safety, situational awareness, and operational performance.
In addition to generating alerts and aural warnings, the system will be interfaced with the automation to adjust operational parameters as necessary, as well as with aural and visual indicator control units to generate visual alerts that can be detected by other operators.
The system will be an essential complement to increased automation support in promoting a modulation of the mental workload associated with perception, decision-making, and execution. Continuous optimal modulation of the task load, in turn, will be essential to mitigate the onset of mental fatigue, which is a cumulative parameter depending on task load and task complexity with extremely negative impacts on human cognitive performance.
This cognitive human-machine interfaces and interactions (CHMII) system will differ from existing systems in its consideration of usability principles, the user’s physiological parameters, the machine’s functional parameters, the HMII’s usage statistics, environmental conditions including the availability of tools, gadgets (VR/AR goggles), analytics (IIoT data, Big Data, Business Intelligence data), specific industry regulations, and the characteristics of the tasks in the workflow. The CHMII will automatically learn and adapt itself using all of this data, and hence it will perform best as a team alongside the operator as he/she utilizes all capabilities available on an Industry 4.0 enabled shop floor. The CHMII systems reviewed in this research consider at most one or a few of these inputs, not all of them.
Literature Review:
Every $1 invested in user experience brings $100 in return, according to the report ‘The Six Steps For Justifying Better UX’ [101]. Industry 4.0 is a collection of technologies and tools that will enable the current manufacturing and production industry to become more efficient and competitive in financial and service terms. When completely implemented, Industry 4.0 will radically disrupt how products are made and sold. Anyone with a 3D printer will be able to download the design for the part they need, optimize it for their use case, and print it, bringing the design into reality. To discuss how Cognitive Human-Machine Interfaces and Interactions (CHMII) will impact Industry 4.0, let us review its components [75]:
Industry 4.0 enabling constituents:
· Cyber-Physical Systems integrate physical processes with computation capabilities and can operate in changing environments, maintaining robust behavior against unexpected conditions and failures.
· The Industrial Internet of Things (IIoT) integrates various devices equipped with sensing, identification, processing, communication, and networking capabilities. It bridges the physical and virtual worlds.
· Big data analytics is characterized by volume, variety, velocity, and veracity, and it requires new techniques of data processing and analysis. Data lakes need appropriate structuring for analytics.
· Cloud-based technologies are a basis for infrastructure that allows ubiquitous and secure access to data from different devices. AWS, Microsoft, Google, and a multitude of other organizations offer cloud services at economical prices. Technology built for native cloud use can be introduced anywhere with an internet connection.
· Cybersecurity is the set of technologies, tools, and processes that guarantee the security of networks, devices, and the large amount of data collected, stored, and received from machines and users.
· Virtual reality is a computer interface that allows the user to be fully immersed in an experimental situation, i.e., a virtual environment, enabling users to look, move, and interact in a world that is like the real one.
· Augmented reality allows the creation of a virtual environment on top of the reality we exist in, in which humans can interact with machines using special devices.
· Smart sensors are traditional sensors embedded with intelligence capabilities, i.e., on-board microprocessors, which can be used for processing, conversion, calculation, and interfacing functions. Raspberry Pi and Arduino boards with configurable sensors are widely available.
· Simulation provides a digital representation of products and processes to identify potential issues in advance, avoiding cost and resource waste in production.
· Additive Manufacturing (AM) consists of a cluster of technologies that enable producing small batches of products with a high degree of customization by adding rather than removing material from a solid block. A wide variety of materials, techniques, and tools are used.
· Advanced robotics: the evolution of traditional robots opened the way to new collaborative robot solutions (i.e., co-bots) that can work together with humans safely and efficiently. Moreover, embedded intelligence can allow robots to learn from human activities, improving their autonomy and flexibility. CHMII would align with this component.
· Energy-saving technologies include the monitoring and optimization systems that allow reducing energy consumption in manufacturing.
Horizontal integration refers to the creation of a global value network through the integration and optimization of the flow of information and goods between companies, suppliers, and customers. Vertical integration, by contrast, is the integration of functions and departments across the hierarchical levels of a single company, creating a consistent flow of information and data.
Multi-agent systems (MAS) are an organized set of agents that represent the behavior of objects in a system and are capable of interacting and negotiating among themselves to achieve individual goals.
In the tables below, the kind of support that these systems provide to humans in the manufacturing industry is listed.
Table 1: Industry 4.0 technologies in relation to physical and cognitive support [75]
Table 2: Enabling technologies for human physical and cognitive support [75]
The need for cognitive human interfaces and interactions is immediate, as these technologies have already begun to be implemented in organizations around the world. How these technologies can support humans, whether through predictive or reactive support, is an important subject to study. This study will result in better cognitive human interfaces and interactions for operators who need to use the Internet of Things, additive manufacturing, extended reality, or any of the other components of Industry 4.0.
Industry 4.0 involves complete digitization and connectivity between machines, processes, tools, and humans in the system. In the following diagram, the different scenarios of integration are visualized. Scenario #5 is the epitome of the connectivity that Industry 4.0 prescribes. Human operators can benefit from a central information system; moreover, information from other human-machine systems can be consulted and examined in real time thanks to a cloud-based architecture, enhancing their situational awareness. To avoid slowing down the response of the centralized system, decentralized computational capabilities are embedded into the machines, which can pre-process data. At the same time, similarly to Scenario #4, cognitive human interfaces and interactions can take advantage of a hybrid control structure, in which decisions can be taken at a decentralized level or through central optimization strategies that consider the real-time state of all the elements of the system.
Figure 1: Different scenarios of communication between humans and machines [75]
The human-machine team performing the tasks needs to be monitored so that the interface and the interactions between them can be adapted to the unique situation they are both in. To describe the state of the humans, the machines, and the team together, metrics that define the performance of these entities need to be measured and analyzed. These metrics can form the foundation for cognitive human interfaces and interactions. Once these metrics are calculated and provided as input to the custom adaptive neuro-fuzzy inference systems, these systems can intelligently adapt the cognitive human interfaces and interactions.
Figure 2: Metrics for monitoring human, machine, and team performance [13].
The suite of sensors that measure various physiological parameters of the human operator, including blink rate, fixations, saccades, heart/respiratory rate, and brain activity, is to be designed and tested so that the inputs for the cognitive human interfaces and interactions are available for computation of the different scenarios that will result in an adapted user interface.
Table 3: Mapping of physiological parameters to metrics [13].
The neural network needs to account for the fact that these sensors alone cannot directly quantify the state the human operator is in. Once the sensors are set up and start reporting, preconditioned scenarios are needed that map a given set of parameter values to states such as fatigue, lack of situational awareness stemming from education gaps, frustration, and many more. Thus, effective sets of training and test data, containing the mapping of physiological parameter values to performance metrics and to the need to adapt the interfaces, are to be created and curated. These datasets must be continuously updated. Based on the quality of the data, the decision can be made to proceed with supervised or unsupervised learning.
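A minimal sketch of the supervised route, under assumed conditions: the feature ranges below are illustrative placeholders, not validated physiological thresholds, and a simple nearest-centroid classifier stands in for the project's eventual ANFIS-based inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic labelled dataset: each row is [blink rate (/min), heart rate (bpm),
# mean fixation duration (ms)]. The means and spreads are invented for illustration.
alert = rng.normal([15, 70, 250], [3, 5, 30], size=(100, 3))
fatigued = rng.normal([28, 62, 420], [4, 4, 50], size=(100, 3))

X = np.vstack([alert, fatigued])
y = np.array([0] * 100 + [1] * 100)  # 0 = alert, 1 = fatigued

# Standardise features so no single unit dominates the distance metric.
mean, std = X.mean(axis=0), X.std(axis=0)
Xs = (X - mean) / std

# Nearest-centroid classifier: one prototype per cognitive state.
centroids = np.array([Xs[y == c].mean(axis=0) for c in (0, 1)])

def classify(sample):
    """Return 0 (alert) or 1 (fatigued) for a raw feature vector."""
    z = (np.asarray(sample) - mean) / std
    return int(np.argmin(np.linalg.norm(centroids - z, axis=1)))

print(classify([14, 72, 240]))   # near the alert prototype
print(classify([30, 60, 450]))   # near the fatigued prototype
```

The same dataset layout (feature vectors labelled with inferred states) would feed the continuously updated training and test sets described above.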
Table 4: The mapping of metrics to categories of considerations for the adaptation of the UI/UX [13].
In any environment where productivity is paramount, the tasks need to be split between the human and the machine in the human-machine team, and Madni et al. propose a simple framework for this classification.
Figure 3: Dynamic task allocation criteria [79].
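Dynamic task allocation of this kind can be sketched as a simple rule-based allocator. The attribute names and thresholds below are illustrative assumptions, not the criteria of Madni et al.; in the project they would be replaced by the validated allocation framework.

```python
def allocate(task, operator_workload):
    """Return 'robot', 'human', or 'team' for a task described as a dict.

    Thresholds and attributes are hypothetical placeholders for illustration.
    """
    if task["payload_kg"] > 10 or task["repetitive"]:
        return "robot"   # strength and endurance favour the machine
    if task["requires_judgement"] and operator_workload < 0.7:
        return "human"   # judgement-heavy work stays with the operator if workload allows
    return "team"        # otherwise share the task collaboratively

tasks = [
    {"name": "lift housing", "payload_kg": 25, "repetitive": False, "requires_judgement": False},
    {"name": "visual defect check", "payload_kg": 0.2, "repetitive": False, "requires_judgement": True},
    {"name": "insert fasteners", "payload_kg": 0.5, "repetitive": False, "requires_judgement": False},
]
plan = {t["name"]: allocate(t, operator_workload=0.4) for t in tasks}
print(plan)
```

In the CHMII, the `operator_workload` input would come from the physiological inference described earlier, making the allocation dynamic rather than fixed at planning time.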
To create an Industry 4.0 environment for generating data from sensors measuring physiological parameters, the actual set of machines and technologies need not be purchased [84]. Using a combination of open-source and paid tools, a system equivalent to an Industry 4.0 environment can be set up using digital twins [107]. Test subjects can interact with this environment to generate data for the CHMII. Devices such as eye-tracking goggles and EEG systems can both support the user and report back on their interactions with this virtual environment [4]. The software for this work was developed using the application Tecnomatix; Robot Construction Kit, Siemens PLM Teamcenter, and a multitude of other applications can be used as well.
Figure 4: Digital twin analysis [84].
In summary, the collaborative skills of the robot are based on the seamless integration of:
· Novel multimodal sensor-based person tracking using RGBD and laser scanner data.
· Real-time identification and tracking of point clouds in the workspace to be used for collision avoidance.
· Real-time dual-arm self-collision avoidance and dynamic collision avoidance with external objects.
· Robot speed and compliance automatically adjusted depending on the current working mode and real-time environment data.
· Intention recognition and recognition of simple human gestures.
According to the ISO 10218 standard and the new ISO/TS 15066 Technical Specification, there are currently four modes allowed for human-robot collaboration:
1. Safety-rated monitored stop, in which there is a separation between human and robot and no robot motion is allowed when the operator is in the collaborative workspace,
2. Hand guiding, in which the robot motion is only happening through direct input of the operator (by, for instance, holding a device on the end-effector of the robot),
3. Speed and separation monitoring, in which an external sensor (usually a camera or laser scanner) ensures that there is always a minimum separation between human and robot. The human can work close to the robot and without a fence, but if the minimum distance is reached, the robot stops,
4. Power and force limiting by inherent design or by control, in which the maximum forces (or power) that the robot can exert are limited by either mechatronic design or by control software. This is the only mode in which physical contact between humans and a moving robot is permitted.
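As a rough illustration of mode 3, a protective separation distance can be computed in the spirit of the ISO/TS 15066 speed-and-separation-monitoring formula (human travel during reaction and stopping, plus robot travel, plus stopping distance, plus intrusion and uncertainty margins). The parameter values below are placeholders, not figures from a real risk assessment.

```python
def protective_separation(v_h, v_r, t_r, t_s, s_stop, c=0.2, z_d=0.1, z_r=0.05):
    """Simplified protective separation distance (metres).

    v_h    : human approach speed (m/s)
    v_r    : robot speed towards the human (m/s)
    t_r    : controller reaction time (s)
    t_s    : robot stopping time (s)
    s_stop : robot stopping distance (m)
    c, z_d, z_r : intrusion distance and sensor/robot position uncertainties (m),
                  placeholder values that in practice come from the risk assessment.
    """
    s_h = v_h * (t_r + t_s)  # distance the human covers while the system reacts and stops
    s_r = v_r * t_r          # distance the robot covers before braking begins
    return s_h + s_r + s_stop + c + z_d + z_r

d = protective_separation(v_h=1.6, v_r=0.5, t_r=0.1, t_s=0.3, s_stop=0.15)
print(round(d, 3))
```

If the measured human-robot distance falls below this value, the monitoring system must command a protective stop before contact can occur.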
As Industry 4.0 is a new paradigm that will be implemented on existing shop floors, the education of the personnel in the current system is important. The medium of education can affect their collaborative tendencies and influence how they absorb new information and systems. Considering these learning styles will greatly enhance the adaptability of the cognitive human-machine interfaces and interactions according to the persona of the user.
Table 5: Learning styles used for Industry 4.0 focus areas [99].
Research Gaps and Questions:
In Figure 1, scenarios 1-5 are presented; how is the Cognitive Human-Machine Interfaces and Interactions (CHMII) system going to adapt to these different scenarios?
Based on the skillset of the operator or other defined personas, how is the CHMII system going to adapt to these different users?
While using this CHMII system, when would a third party’s help be needed, based on the level of automation?
How are retroactively automated machines going to affect the CHMII? What interactions, and at what level, will the machine and humans have?
When robots that do not possess heightened levels of Industry 4.0 implementation (including VR, AR, IIoT, 3D printing, and functional design and manufacturing) are in the mix of operations on the shop floor, how are they going to be considered in the CHMII?
What are the criteria for tasks in incoming inspection, manufacturing (milling, turning, coating, honing), assembly, inspection, and shipping to be divided between the human-machine combination?
Will machine learning for assembly tasks (e.g., riveting at a given pitch, driving screws in a circular pattern, or assembling a part after a human demonstrates one) help collaboration between human and machine?
How does Internet of Things data coming from the shop floor affect adaptation of the CHMII?
Will additive manufacturing need collaborative human-machine interfaces and, if so, how would data from the machine lead to adaptation of the CHMII?
Extended reality could expand the capability of humans, but at what point are tasks better completely automated rather than done by humans through extended-reality instruction?
What are the market problems that need to be addressed? For instance, are tasks in the assembly and inspection departments automated to a greater extent than milling, turning, finishing, and other such tasks?
How can usability heuristics such as Nielsen’s 10 heuristics and Lund’s 37 maxims be incorporated into the cognitive human-machine interface and interaction system?
Considering other devices (e.g., electric screwdrivers, electrical clamping devices) in the same work cell is a principal consideration that should not be ignored.
Research Objectives:
Once the user scenarios for testing, and thus for evaluating the adaptability of the cognitive human-machine interfaces and interactions, are agreed upon, they must be set up physically or virtually using a digital twin environment. Setting up a user testing environment and devices for all personas, user journey maps, use cases, and scenarios is the starting point for the task.
The set of tasks chosen will help to categorize the industry standards that need to be followed. For example, if first article inspection needs to be performed at an aerospace manufacturer, then the AS9102 standard must be followed. Thus, how the cognitive human-machine interface and interaction system adapts to these tasks can be better defined when such standards are used as constraints.
User journey map, scenario, and case documentation will define the triggers, needs, motivations, and expectations for these tasks. These documents can be the basis for defining criteria for quantification of the results in terms of quality, performance, safety, and more.
Digital twins will be created and tested using multiple prototypes and tools such as Figma, AWS Sumerian, Lex, Polly, and more. A digital twin can be created from a lidar scan, followed by using CAD to import the 3D model into Unity or Sumerian. AWS Lex or Polly can be used to interact with the application.
Because a collaborative workspace involves not only the human and the robot but also other auxiliary devices (e.g., electric screwdrivers, electrical clamping devices), each cell presents unique risks that need to be handled safely.
Machine learning will be applied to assembly tasks such as riveting at a given pitch, driving screws in a circular pattern, or assembling a part after a human demonstrates one, as well as to assessing the weight or volume of objects whose handling could be delegated to the robot.
The custom adaptive neuro-fuzzy inference systems, which are the core of the cognitive human-machine interfaces and interactions (CHMII) system, will draw on usability principles, the user’s physiological parameters, the machine’s functional parameters, the HMII’s usage statistics, environmental conditions including the availability of tools, gadgets (VR/AR goggles), analytics (IIoT data), specific industry regulations, and the characteristics of the tasks in the workflow.
Proposed Research Methodology:
Depending on the availability of a testing station, I will build one following ISO/TS 15066:2016, which specifies safety requirements for collaborative industrial robot systems and the work environment and supplements the requirements and guidance on collaborative industrial robot operation given in ISO 10218-1 and ISO 10218-2.
Failure Modes and Effects Analysis (FMEA), Monte Carlo analysis, and other risk management tools will be used to assess the risk associated with tasks in the user journey. The risk information, together with simulation-based and history-based information, will be considered in the evaluation of any new user experience that the CHMII creates.
In addition to wearing the sensors, the operators will think aloud as they work through each task and share how they feel completing it. They will also prioritize fixes to the CHMII through a ‘buy a feature’ exercise using play money. With the feedback generated from sensors on the human, functional parameter values from the machine, and an iterative process that addresses stress points prioritized using frameworks (Risk Priority Number and more), the CHMII can be rebuilt into improved versions. This can be supported by design ROI calculations when necessary.
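The Risk Priority Number prioritization follows standard FMEA practice: RPN = severity × occurrence × detection, with each factor rated 1-10. The failure modes and ratings below are invented purely for illustration.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: each factor is rated 1-10 per standard FMEA practice."""
    return severity * occurrence * detection

# Illustrative failure modes for a collaborative work cell (ratings are assumptions).
failure_modes = [
    {"mode": "missed aural warning", "s": 8, "o": 4, "d": 6},
    {"mode": "eye tracker drop-out", "s": 5, "o": 6, "d": 3},
    {"mode": "robot stop not triggered", "s": 10, "o": 2, "d": 5},
]
for fm in failure_modes:
    fm["rpn"] = rpn(fm["s"], fm["o"], fm["d"])

# Highest RPN first: these are the stress points to fix in the next CHMII iteration.
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
print([(fm["mode"], fm["rpn"]) for fm in ranked])
```

Ranking by RPN gives the iterative rebuild a defensible order of work, which can then be weighed against the design ROI calculations mentioned above.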
Usability testing reports will be generated automatically to substantiate the need for iterative CHMII improvements that reduce the user’s cognitive workload and prevent overload. The improvements made will be shown to the user to enhance synergy and rapport with the testing user.
Digital usability and accessibility tools such as Pa11y (automated accessibility testing), Google Analytics, Heap, FullStory, and Usabilla will be incorporated into the software application that the user interacts with when performing tasks as a human-machine team. Hypothesis testing for the improvement of the HMI will be done with multiple hypotheses.
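Since several interface-improvement hypotheses will be tested on the same experimental sessions, a family-wise error correction is advisable. Below is a minimal sketch of the Holm-Bonferroni step-down procedure; the p-values are illustrative, not results from any experiment.

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm-Bonferroni step-down correction.

    Returns a list of booleans (reject null / keep null) in the original order,
    controlling the family-wise error rate across multiple hypothesis tests.
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])  # indices sorted by p-value
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # once one test fails, all larger p-values are also kept
    return reject

# Illustrative p-values from hypothetical A/B tests of four interface changes.
p = [0.001, 0.04, 0.03, 0.20]
print(holm_bonferroni(p))
```

Here only the first hypothesis survives correction, even though three of the raw p-values fall below 0.05, which is exactly the overclaiming the correction guards against.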
The custom adaptive neuro-fuzzy inference systems at the core of the cognitive human-machine interfaces and interactions (CHMII) system will be iteratively built using data from these experimental sessions. Once a sufficient data base is established, a supervised learning approach will be implemented so the system can improve itself.
Expected Outcomes:
The results of this research project will include the following:
1. Criteria for adaptability of the cognitive human-machine interactions and interfaces system (CHMIIS) based on
a. varying levels of automation and connectivity
b. different personas of the user
c. the need for third-party intervention if human or machine cannot solve a problem
d. the situation when working with retroactively connected legacy machines
e. the usage of VR, AR, IIoT, 3D printing, and functional design and manufacturing
on an Industry 4.0 enabled shop floor.
2. Dynamic allocation of tasks between the user, the machine, or the team, with consideration of norms and rules in varied manufacturing environments, once the criteria are validated.
3. The machine learning capabilities and principles for certain tasks so that the machine can handle those tasks after instruction from the human.
4. As a multitude of other devices such as jigs, tools, and job trackers will be present in the work cell, the logic behind including them in the CHMIIS will be discussed.
5. The incorporation of usability and accessibility tools, heuristics, maxims, and principles into the CHMIIS.
6. For a chosen industry standard, documentation of how the CHMIIS can help.
7. Any suggestions for the CHMIIS will draw on physiological data, machine data, team data, environment data, and virtual and aural response data from the users, so the supervised learning approach will be based on relevant data.
8. The steps for creating a digital twin for usability and accessibility testing will be elucidated.
9. Risk assessment criteria for dynamic allocation of tasks using VR, AR, IIoT, 3D printing, and functional design and manufacturing on an Industry 4.0 enabled shop floor will be cataloged.
References:
1. Jiang, Jianjun, Yiqun Wang, Li Zhang, Daqing Wu, Min Li, Tian Xie, Pengcheng Li, et al. "A cognitive reliability model research for complex digital human-computer interface of industrial system." Safety Science 108 (2018): 196-202.
2. Ansari, Fazel, Philipp Hold, and Marjan Khobreh. "A knowledge-based approach for representing jobholder profile toward optimal human–machine collaboration in cyber physical production systems." CIRP Journal of Manufacturing Science and Technology (2020).
3. Lim, Yixiang, Alessandro Gardi, Subramanian Ramasamy, Julian Vince, Helen Pongracic, Trevor Kistan, and Roberto Sabatini. "A novel simulation environment for cognitive human factors engineering research." In 2017 IEEE/AIAA 36th Digital Avionics Systems Conference (DASC), pp. 1-8. IEEE, 2017.
4. Lim, Yixiang, Alessandro Gardi, Roberto Sabatini, Subramanian Ramasamy, Trevor Kistan, Neta Ezer, Julian Vince, and Robert Bolia. "Avionics human-machine interfaces and interactions for manned and unmanned aircraft." Progress in Aerospace Sciences 102 (2018): 1-46.
5. Gavriushenko, Mariia, Olena Kaikova, and Vagan Terziyan. "Bridging human and machine learning for the needs of collective intelligence development." Procedia Manufacturing 42 (2020): 302-306.
6. Schwarz, Juergen. "Code of Practice for development, validation and market introduction of ADAS." In 2. Tagung Aktive Sicherheit durch Fahrerassistenz. 2006.
7. Pongsakornsathien, Nichakorn, Alessandro Gardi, Roberto Sabatini, and Trevor Kistan. "Cognitive Human-Machine Interfaces and Interaction for Terminal Manoeuvring Area Traffic Management." optimization 7: 8.
8. Lim, Yixiang, Jing Liu, Subramanian Ramasamy, and Roberto Sabatini. "Cognitive Remote Pilot-Aircraft Interface for UAS Operations." In Proceedings of the 2016 International Conference on Intelligent Unmanned Systems (ICIUS 2016), Xi’an, China, pp. 23-25. 2016.
9. Liu, Jing, Alessandro Gardi, Subramanian Ramasamy, Yixiang Lim, and Roberto Sabatini. "Cognitive pilot-aircraft interface for single-pilot operations." Knowledge-Based Systems 112 (2016): 37-53.
11. Lim, Yixiang, Subramanian Ramasamy, Alessandro Gardi, Trevor Kistan, and Roberto Sabatini. "Cognitive human-machine interfaces and interactions for unmanned aircraft." Journal of Intelligent & Robotic Systems 91, no. 3-4 (2018): 755-774.
12. Lim, Yixiang, Vincent Bassien-Capsa, Subramanian Ramasamy, Jing Liu, and Roberto Sabatini. "Commercial airline single-pilot operations: System design and pathways to certification." IEEE Aerospace and Electronic Systems Magazine 32, no. 7 (2017): 4-21.
13. Damacharla, Praveen, Ahmad Y. Javaid, Jennie J. Gallimore, and Vijay K. Devabhaktuni. "Common metrics to benchmark human-machine teams (HMT): A review." IEEE Access 6 (2018): 38637-38655.
14. Navarro-Cerdan, J. Ramon, Rafael Llobet, Joaquim Arlandis, and Juan-Carlos Perez-Cortes. "Composition of Constraint, Hypothesis and Error Models to improve interaction in Human–Machine Interfaces." Information Fusion 29 (2016): 1-13.
15. Debernard, Serge, C. Chauvin, R. Pokam, and Sabine Langlois. "Designing human-machine interface for autonomous vehicles." IFAC-PapersOnLine 49, no. 19 (2016): 609-614.
16. Pacaux-Lemoine, Marie-Pierre, Damien Trentesaux, Gabriel Zambrano Rey, and Patrick Millot. "Designing intelligent manufacturing systems through Human-Machine Cooperation principles: A human-centered approach." Computers & Industrial Engineering 111 (2017): 581-595.
17. Pacaux-Lemoine, Marie-Pierre, and Damien Trentesaux. "Ethical risks of human-machine symbiosis in industry 4.0: insights from the human-machine cooperation approach." IFAC-PapersOnLine 52, no. 19 (2019): 19-24.
18. Lim, Yixiang, Alessandro Gardi, Neta Ezer, Trevor Kistan, and Roberto Sabatini. "Eye-tracking sensors for adaptive aerospace human-machine interfaces and interactions." In 2018 5th IEEE International Workshop on Metrology for AeroSpace (MetroAeroSpace), pp. 311-316. IEEE, 2018.
19. Behymer, Kyle J., and John M. Flach. "From autonomous systems to sociotechnical systems: Designing effective collaborations." She Ji: The Journal of Design, Economics, and Innovation 2, no. 2 (2016): 105-114.
20. Bellet, Thierry, Martin Cunneen, Martin Mullins, Finbarr Murphy, Fabian Pütz, Florian Spickermann, Claudia Braendle, and Martina Felicitas Baumann. "From semi to fully autonomous vehicles: New emerging risks and ethico-legal challenges for human-machine interactions." Transportation Research Part F: Traffic Psychology and Behaviour 63 (2019): 153-164.
21. Min, Byoung-Kyong, Ricardo Chavarriaga, and José del R. Millán. "Harnessing prefrontal cognitive signals for brain–machine interfaces." Trends in Biotechnology 35, no. 7 (2017): 585-597.
22. Ouedraogo, Kiswendsida Abel, Simon Enjalbert, and Frédéric Vanderhaegen. "How to learn from the resilience of Human–Machine Systems?" Engineering Applications of Artificial Intelligence 26, no. 1 (2013): 24-34.
23. Langan-Fox, Janice, James M. Canty, and Michael J. Sankey. "Human–automation teams and adaptable control for future air traffic management." International Journal of Industrial Ergonomics 39, no. 5 (2009): 894-903.
24. Lim, Yixiang, Kavindu Ranasinghe, Alessandro Gardi, Neta Ezer, and Roberto Sabatini. "Human-machine interfaces and interactions for multi UAS operations." In Proceedings of the 31st Congress of the International Council of the Aeronautical Sciences (ICAS 2018), Belo Horizonte, Brazil, pp. 9-14. 2018.
25. Kemény, Zsolt, Richárd Beregi, János Nacsa, Csaba Kardos, and Dániel Horváth. "Human–robot collaboration in the MTA SZTAKI learning factory facility at Győr." Procedia Manufacturing 23 (2018): 105-110.
26. Malone, T., P. Savage-Knepshield, and L. Avery. "Human-systems integration: Human factors in a systems context." Human Factors and Ergonomics Society Bulletin 50, no. 12 (2007): 1-3.
27. Gardecki, Arkadiusz, Michal Podpora, and Aleksandra Kawala-Janik. "Innovative Internet of Things-reinforced Human Recognition for Human-Machine Interaction Purposes." IFAC-PapersOnLine 51, no. 6 (2018): 138-143.
28. Shi, Yu, Ronnie Taib, Natalie Ruiz, Eric Choi, and Fang Chen. "Multimodal human-machine interface and user cognitive load measurement." IFAC Proceedings Volumes 40, no. 16 (2007): 200-205.
29. Ansari, Fazel, Selim Erol, and Wilfried Sihn. "Rethinking human-machine learning in Industry 4.0: How does the paradigm shift treat the role of human learning?" Procedia Manufacturing 23 (2018): 117-122.
30. Pak, Richard, Ewart J. de Visser, and Ericka Rovira, eds. Living with Robots: Emerging Issues on the Psychological and Social Implications of Robotics. Academic Press, 2019.
31. Borst, Clark. "Shared mental models in human-machine systems." IFAC-PapersOnLine 49, no. 19 (2016): 195-200.
32. Zolotová, Iveta, Peter Papcun, Erik Kajáti, Martin Miškuf, and Jozef Mocnej. "Smart and cognitive solutions for Operator 4.0: Laboratory H-CPPS case studies." Computers & Industrial Engineering 139 (2020): 105471.
33. Longo, Francesco, Letizia Nicoletti, and Antonio Padovano. "Smart operators in industry 4.0: A human-centered approach to enhance operators’ capabilities and competencies within the new smart factory context." Computers & Industrial Engineering 113 (2017): 144-159.
34. Wiltshire, Travis J., and Stephen M. Fiore. "Social cognitive and affective neuroscience in human–machine systems: a roadmap for improving training, Human–Robot interaction, and team performance." IEEE Transactions on Human-Machine Systems 44, no. 6 (2014): 779-787.
35. Levesque, Laurent, and Marc Doumit. "Study of Human-Machine Physical Interface for Wearable Mobility Assist Devices." Medical Engineering & Physics (2020).
36. Villani, Valeria, Giulia Lotti, Nicola Battilani, and Cesare Fantuzzi. "Survey on usability assessment for industrial user interfaces." IFAC-PapersOnLine 52, no. 19 (2019): 25-30.
37. Van Diggelen, Jurriaan, Mark Neerincx, Marieke Peeters, and Jan Maarten Schraagen. "Developing effective and resilient human-agent teamwork using team design patterns." IEEE Intelligent Systems 34, no. 2 (2018): 15-24.
38. Guzman, Andrea L. "The messages of mute machines: Human-machine communication with industrial technologies." Communication +1 5, no. 1 (2016): 1-30.
39. Mason, Cindy. "The Multi-Disciplinary Case for Human Sciences in Technology Design." In 2014 AAAI Fall Symposium Series. 2014.
40. van der Vecht, Bob, Jurriaan van Diggelen, Marieke Peeters, Wessel van Staal, and Jasper van der Waa. "The SAIL framework for implementing human-machine teaming concepts." In International Conference on Practical Applications of Agents and Multi-Agent Systems, pp. 361-365. Springer, Cham, 2018.
41. Ranasinghe, Kavindu, Yixiang Lim, Cholsanan Chantaraviwat, Neta Ezer, Alessandro Gardi, and Roberto Sabatini. "Time and energy management for descent operations: Human-machine teaming considerations." In AIAC18: 18th Australian International Aerospace Congress (2019): HUMS-11th Defence Science and Technology (DST) International Conference on Health and Usage Monitoring (HUMS 2019): ISSFD-27th International Symposium on Space Flight Dynamics (ISSFD), p. 362. Engineers Australia, Royal Aeronautical Society, 2019.
42. Hernández-Orallo, José, David L. Dowe, and M. Victoria Hernández-Lloreda. "Universal psychometrics: Measuring cognitive abilities in the machine kingdom." Cognitive Systems Research 27 (2014): 50-74.
43. Zieba, Stéphane, Philippe Polet, and Frédéric Vanderhaegen. "Using adjustable autonomy and human–machine cooperation to make a human–machine system resilient–Application to a ground robotic system." Information Sciences 181, no. 3 (2011): 379-397.
44. Savino, Matteo Mario, Daria Battini, and Carlo Riccio. "Visual management and artificial intelligence integrated in a new fuzzy-based full body postural assessment." Computers & Industrial Engineering 111 (2017): 596-608.
45. Pérez, Luis, Eduardo Diez, Rubén Usamentiaga, and Daniel F. García. "Industrial robot control and operator training using virtual reality interfaces." Computers in Industry 109 (2019): 114-120.
46. Zhang, Yunbo, and Tsz-Ho Kwok. "Design and interaction interface using augmented reality for smart manufacturing." Procedia Manufacturing 26 (2018): 1278-1286.
47. Matsas, Elias, George-Christopher Vosniakos, and Dimitris Batras. "Prototyping proactive and adaptive techniques for human-robot collaboration in manufacturing using virtual reality." Robotics and Computer-Integrated Manufacturing 50 (2018): 168-180.
48. Nikolakis, Nikolaos, Niki Kousi, George Michalos, and Sotiris Makris. "Dynamic scheduling of shared human-robot manufacturing operations." Procedia CIRP 72 (2018): 9-14.
49. Large, David R., Leigh Clark, Annie Quandt, Gary Burnett, and Lee Skrypchuk. "Steering the conversation: a linguistic exploration of natural language interactions with a digital assistant during simulated driving." Applied Ergonomics 63 (2017): 53-61.
50. Wang, Fangju, Shaidah Jusoh, and Simon X. Yang. "A collaborative behavior-based approach for handling ambiguity, uncertainty, and vagueness in robot natural language interfaces." Engineering Applications of Artificial Intelligence 19, no. 8 (2006): 939-951.
51. Vuletic, Tijana, Alex Duffy, Laura Hay, Chris McTeague, Gerard Campbell, and Madeleine Grealy. "Systematic literature review of hand gestures used in human computer interaction interfaces." International Journal of Human-Computer Studies 129 (2019): 74-94.
52. Voinescu, Alexandra, Phillip L. Morgan, Chris Alford, and Praminda Caleb-Solly. "The utility of psychological measures in evaluating perceived usability of automated vehicle interfaces – A study with older adults." Transportation Research Part F: Traffic Psychology and Behaviour 72 (2020): 244-263.
53. Ahn, Sewoong, Kwanghyun Lee, and Sanghoon Lee. "Visual entropy: A new framework for quantifying visual information based on human perception." In 2017 IEEE International Conference on Image Processing (ICIP), pp. 3485-3489. IEEE, 2017.
54. Tranfield, David, David Denyer, and Palminder Smart. "Towards a methodology for developing evidence-informed management knowledge by means of systematic review." British Journal of Management 14, no. 3 (2003): 207-222.
55. Sciaccaluga, Martina, and Ilaria Delponte. "Investigation on human factors and key aspects involved in Autonomous Vehicles (AVs) acceptance: new instruments and perspectives." Transportation Research Procedia 45 (2020): 708-715.
56. Nagy, Viktor, and Balázs Horváth. "The effects of autonomous buses to vehicle scheduling system." Procedia Computer Science 170 (2020): 235-240.
57. Al Maghraoui, Ouail, Reza Vosooghi, Abood Mourad, Joseph Kamel, Jakob Puchinger, Flore Vallet, and Bernard Yannou. "Shared Autonomous Vehicle Services and User Taste Variations: Survey and Model Applications." Transportation Research Procedia 47 (2020): 3-10.
58. Yao, Shengyue, Rahi Avinash Shet, and Bernhard Friedrich. "Managing Connected Automated Vehicles in Mixed Traffic Considering Communication Reliability: a Platooning Strategy." Transportation Research Procedia 47 (2020): 43-50.
59. Kurnaz, Sefer, Omer Cetin, and Okyay Kaynak. "Adaptive neuro-fuzzy inference system based autonomous flight control of unmanned air vehicles." Expert Systems with Applications 37, no. 2 (2010): 1229-1234.
60. SAE International. "Levels of driving automation are defined in new SAE International standard J3016: 2014." SAE International: Warrendale, PA, USA (2014).
61. SAE On-Road Automated Driving Committee. SAE J3016: Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles. Tech. rep., SAE International, 2016.
62. SAE On-Road Automated Driving Committee. SAE J3114: Human Factors Definitions for Automated Driving and Related Research Topics. Tech. rep., SAE International, 2016.
63. Federal Aviation Administration. "Part 23 Airworthiness Standards: Normal, Utility, Acrobatic and Commuter Category Airplanes." Washington, DC: Department of Transportation (1991).
64. SAE. ARINC 680: Aircraft Autonomous Distress Tracking (ADT). SAE International, 2019.
65. Gualtieri, Luca, Erwin Rauch, and Renato Vidoni. "Emerging research fields in safety and ergonomics in industrial collaborative robotics: A systematic literature review." Robotics and Computer-Integrated Manufacturing 67 (2021): 101998.
66. Coxon, Selby, Robbie Napper, and Mark Richardson. Urban Mobility Design. Elsevier, 2018.
67. Krug, Steve. Don't Make Me Think!: A Common Sense Approach to Web Usability. Pearson Education India, 2000.
68. Norman, Don. The Design of Everyday Things: Revised and Expanded Edition. Basic Books, 2013.
69. Cagan, Marty. Inspired: How to Create Tech Products Customers Love. John Wiley & Sons, 2017.
70. Cooper, Alan. The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity. Vol. 2. Indianapolis: Sams, 2004.
71. Gothelf, Jeff, and Josh Seiden. Lean UX: Designing Great Products with Agile Teams. O'Reilly Media, 2016.
72. Cimini, Chiara, Fabiana Pirola, Roberto Pinto, and Sergio Cavalieri. "A human-in-the-loop manufacturing control architecture for the next generation of production systems." Journal of Manufacturing Systems 54 (2020): 258-271.
73. Martinez, Pablo, Adriana Vargas-Martinez, Armando Roman-Flores, and Rafiq Ahmad. "A science mapping study on learning factories research." Procedia Manufacturing 45 (2020): 84-89.
74. Mital, Anil, and Arunkumar Pennathur. "Advanced technologies and humans in manufacturing workplaces: an interdependent relationship." International Journal of Industrial Ergonomics 33, no. 4 (2004): 295-313.
75. Karwowski, Waldemar, Stefan Trzcielinski, Beata Mrugalska, Massimo Di Nicolantonio, and Emilio Rossi. "Advances in manufacturing, production management and process control." In Conference Proceedings AHFE, p. 12. 2019.
76. Madni, Azad M., and Carla C. Madni. "Architectural Framework for Exploring Adaptive Human-Machine Teaming Options in Simulated Dynamic Environments." Systems 6, no. 4 (2018): 44.
77. Malik, Ali Ahmad, and Arne Bilberg. "Collaborative robots in assembly: A practical approach for tasks distribution." In 52nd CIRP Conference on Manufacturing Systems, vol. 81, pp. 665-670. 2019.
78. Sievers, Torsten Sebastian, Bianca Schmitt, Patrick Rückert, Maren Petersen, and Kirsten Tracht. "Concept of a Mixed-Reality Learning Environment for Collaborative Robotics." Procedia Manufacturing 45 (2020): 19-24.
79. Mourtzis, Dimitris, John Angelopoulos, and George Dimitrakopoulos. "Design and development of a flexible manufacturing cell in the concept of learning factory paradigm for the education of generation 4.0 engineers." Procedia Manufacturing 45 (2020): 361-366.
80. Michalos, George, Sotiris Makris, Panagiota Tsarouchi, Toni Guasch, Dimitris Kontovrakis, and George Chryssolouris. "Design considerations for safe human-robot collaborative workplaces." Procedia CIRP 37 (2015): 248-253.
81. Malik, Ali Ahmad, and Arne Bilberg. "Digital twins of human robot collaboration in a production setting." Procedia Manufacturing 17 (2018): 278-285.
82. Stadnicka, Dorota, Pawel Litwin, and Dario Antonelli. "Human factor in intelligent manufacturing systems – knowledge acquisition and motivation." Procedia CIRP 79 (2019): 718-723.
83. Bochmann, Lennart, Timo Bänziger, Andreas Kunz, and Konrad Wegener. "Human-robot collaboration in decentralized manufacturing systems: An approach for simulation-based evaluation of future intelligent production." Procedia CIRP 62 (2017): 624-629.
84. Bag, Surajit, Shivam Gupta, and Sameer Kumar. "Industry 4.0 adoption and 10R advance manufacturing capabilities for sustainable development." International Journal of Production Economics (2020): 107844.
85. Büth, Lennart, Max Juraschek, Kuldip Singh Sangwan, Christoph Herrmann, and Sebastian Thiede. "Integrating virtual and physical production processes in learning factories." Procedia Manufacturing 45 (2020): 121-127.
86. Thomas, C., L. Stankiewicz, A. Grötsch, S. Wischniewski, Jochen Deuse, and Bernd Kuhlenkötter. "Intuitive work assistance by reciprocal human-robot interaction in the subject area of direct human-robot collaboration." Procedia CIRP 44 (2016): 275-280.
87. Pilati, Francesco, Maurizio Faccio, Mauro Gamberi, and Alberto Regattieri. "Learning manual assembly through real-time motion capture for operator training with augmented reality." Procedia Manufacturing 45 (2020): 189-195.
88. Vathoopan, Milan, Maria Johny, Alois Zoitl, and Alois Knoll. "Modular fault ascription and corrective maintenance using a digital twin." IFAC-PapersOnLine 51, no. 11 (2018): 1041-1046.
89. de Gea Fernández, José, Dennis Mronga, Martin Günther, Tobias Knobloch, Malte Wirkus, Martin Schröer, Mathias Trampler et al. "Multimodal sensor-based whole-body control for human–robot collaboration in industrial settings." Robotics and Autonomous Systems 94 (2017): 102-119.
90. Rahman, S. M. Mizanoor, and Yue Wang. "Mutual trust-based subtask allocation for human–robot collaboration in flexible lightweight assembly in manufacturing." Mechatronics 54 (2018): 94-109.
91. Nikolakis, Nikolaos, Kostantinos Sipsas, Panagiota Tsarouchi, and Sotirios Makris. "On a shared human-robot task scheduling and online re-scheduling." Procedia CIRP 78 (2018): 237-242.
92. Michalos, George, Sotiris Makris, Jason Spiliotopoulos, Ioannis Misios, Panagiota Tsarouchi, and George Chryssolouris. "ROBO-PARTNER: Seamless human-robot cooperation for intelligent, flexible and safe operations in the assembly factories of the future." Procedia CIRP 23 (2014): 71-76.
93. Michalos, George, Niki Kousi, Panagiotis Karagiannis, Christos Gkournelos, Konstantinos Dimoulas, Spyridon Koukas, Konstantinos Mparis, Apostolis Papavasileiou, and Sotiris Makris. "Seamless human robot collaborative assembly – An automotive case study." Mechatronics 55 (2018): 194-211.
94. Mengoni, Maura, Silvia Ceccacci, Andrea Generosi, and Alma Leopardi. "Spatial augmented reality: An application for human work in smart manufacturing environment." Procedia Manufacturing 17 (2018): 476-483.
95. Schmidbauer, Christina, Titanilla Komenda, and Sebastian Schlund. "Teaching Cobots in Learning Factories – User and Usability-Driven Implications." Procedia Manufacturing 45 (2020): 398-404.
96. Louw, Louis, and Quintus Deacon. "Teaching Industrie 4.0 technologies in a learning factory through problem-based learning: case study of a semi-automated robotic cell design." Procedia Manufacturing 45 (2020): 265-270.
97. Rabah, Souad, Ahlem Assila, Elio Khouri, Florian Maier, Fakreddine Ababsa, Paul Maier, and Frédéric Mérienne. "Towards improving the future of manufacturing through digital twin and augmented reality technologies." Procedia Manufacturing 17 (2018): 460-467.
98. Hogan, A., D. Laufer, D. Truog, W. Willsea, and R. Birrell. "The Six Steps For Justifying Better UX." (2016).
99. Team Pa11y. Pa11y, automated accessibility testing pal. (2020). https://pa11y.org/
100. Nielsen, Jakob. "Ten usability heuristics." (2005).
101. Lund, Arnold M. "Expert ratings of usability maxims." Ergonomics in Design 5, no. 3 (1997): 15-20.
102. Shneiderman, Ben. "Shneiderman's eight golden rules of interface design." (2005). Retrieved July 25, 2009.
103. Usabilla. Usabilla for Apps, a tool to collect feedback from your users with ease and flexibility. (2020). https://github.com/usabilla
104. Curated Digital Twin Knowledge Repository | Digital Twin Ecosystem | Business and Marketing Intelligence. (2020). https://digitaltwin.io/
105. Tufte, Edward R. The Visual Display of Quantitative Information. Vol. 2. Cheshire, CT: Graphics Press, 2001.
106. Teiwes, Johannes, Timo Bänziger, Andreas Kunz, and Konrad Wegener. "Identifying the potential of human-robot collaboration in automotive assembly lines using a standardised work description." In 2016 22nd International Conference on Automation and Computing (ICAC), pp. 78-83. IEEE, 2016.
107. ISO. "ISO/TS 15066:2016 – Robots and robotic devices – Collaborative robots." International Organization for Standardization, Geneva, Switzerland, 2016.
108. OpenKinect. libfreenect, open-source libraries that enable the Xbox Kinect to be used with Windows, Linux, and Mac. (2020). https://github.com/OpenKinect/libfreenect
109. Jerald, Jason, Peter Giokaris, Danny Woodall, Arno Hartbolt, Anish Chandak, and Sebastien Kuntz. "Developing virtual reality applications with Unity." In 2014 IEEE Virtual Reality (VR), pp. 1-3. IEEE, 2014.
110. Salehi, Vahid, and Shirui Wang. "Web-Based Visualization of 3D Factory Layout from Hybrid Modeling of CAD and Point Cloud on Virtual Globe DTX Solution." (2019).
111. ISO. "ISO 10218:2011 – Robots and robotic devices – Safety requirements for industrial robots." International Organization for Standardization, Geneva, Switzerland, 2011.
112. Romero, David, Johan Stahre, Thorsten Wuest, Ovidiu Noran, Peter Bernus, Åsa Fast-Berglund, and Dominic Gorecky. "Towards an Operator 4.0 typology: a human-centric perspective on the fourth industrial revolution technologies." In Proceedings of the International Conference on Computers and Industrial Engineering (CIE46), Tianjin, China, pp. 29-31. 2016.
113. Papcun, Peter, Erik Kajáti, and Jiří Koziorek. "Human machine interface in concept of Industry 4.0." In 2018 World Symposium on Digital Intelligence for Systems and Machines (DISA), pp. 289-296. IEEE, 2018.
114. Gorecky, Dominic, Mathias Schmitt, Matthias Loskyll, and Detlef Zühlke. "Human-machine-interaction in the Industry 4.0 era." In 2014 12th IEEE International Conference on Industrial Informatics (INDIN), pp. 289-294. IEEE, 2014.
115. Xiao, Ningcong, Hong-Zhong Huang, Yanfeng Li, Liping He, and Tongdan Jin. "Multiple failure modes analysis and weighted risk priority number evaluation in FMEA." Engineering Failure Analysis 18, no. 4 (2011): 1162-1170.
116. Wittenberg, Carsten. "Human-CPS Interaction – requirements and human-machine interaction methods for the Industry 4.0." IFAC-PapersOnLine 49, no. 19 (2016): 420-425.
117. Ruiz Garcia, M. A., R. A. Rojas, Luca Gualtieri, Erwin Rauch, and Dominik Matt. "A human-in-the-loop cyber-physical system for collaborative assembly in smart manufacturing." In CIRP Conference on Manufacturing Systems (CIRP CMS 2019), June, pp. 12-14. 2019.
118. Fantini, Paola, Marta Pinzone, and Marco Taisch. "Placing the operator at the centre of Industry 4.0 design: Modelling and assessing human activities within cyber-physical systems." Computers & Industrial Engineering 139 (2020): 105058.
119. Guérin, C., Philippe Rauffet, C. Chauvin, and E. Martin. "Toward production operator 4.0: modelling Human-Machine Cooperation in Industry 4.0 with Cognitive Work Analysis." IFAC-PapersOnLine 52, no. 19 (2019): 73-78.
120. Cohen, Yuval, Maya Golan, Gonen Singer, and Maurizio Faccio. "Workstation–Operator Interaction in 4.0 Era: WOI 4.0." IFAC-PapersOnLine 51, no. 11 (2018): 399-404.
121. Lorenz, Ruth, Kai Lorentzen, Nicole Stricker, and Gisela Lanza. "Applying user stories for a customer-driven Industry 4.0 transformation." IFAC-PapersOnLine 51, no. 11 (2018): 1335-1340.
Please share any honest comments or suggestions.