EQ for AI: shaping tomorrow’s human-machine experience
Exploring the transformational possibilities of artificial empathy. Rebecca Johnson, Head of Research Group at Siemens Technology, discusses her pioneering work in AI and empathic computing; the potential of this work to improve our skillsets, evolve the dynamic between machines and humans, enhance the Industrial Metaverse, and help transform the industrial world.
Takeaways
Imagine the possibilities: next-level digital companions help evolve human skillsets through interactions rooted in empathy, transforming how we engage with machines on the factory floor and beyond. Not only that: AI-empowered engineers are being introduced into the industrial metaverse, radically enhancing a digital environment that is not an escape from our physical reality but a means of solving its biggest challenges. We are in a new era of human-machine collaboration, one that brings natural and artificial intelligence closer together to supercharge productivity, innovation and, ultimately, transformation within the industrial world.
At the heart of this lies a fundamental truth: the future of human-machine collaboration is not just about form and function. It is about experience too – the extent to which users can positively engage with their digital companions, communicate their intentions, and feel that a genuine connection has been established. This connection is the catalyst for successful collaboration; a shared understanding to support co-creation.
This ‘connection’ is the focus of Rebecca Johnson and her team, who are teaching machines empathy. They are training avatars across digital interfaces and immersive environments – websites, digital displays, learning applications, the industrial metaverse – to actively listen, understand and show compassion. In doing so, they are creating next-level digital companions to transform how we interact with machines. It’s a space that is moving at speed…
“We’re absolutely moving towards machines becoming our personal, digital and empathic companions, copilots or assistants, and we’re moving quickly,” says Rebecca. “Leaps in infrastructure have helped make this possible. We have more powerful computers more readily available, greater bandwidth, and better cameras to analyze human faces. We are combining this with insights from psychology; adding psychologists or negotiators to the mix, so we’re able to identify what needs to be detected and synthesized.”
Rebecca is quick to cite the universal application of these innovations. Take the factory floor, where the capture and analysis of body language and facial expression could help optimize the physical working environments of today and the future. “We can avoid, for example, too great a distance between colleagues, which results in people shouting at each other and encourages increasingly heated language. We can better identify if a worker doesn’t smile for four consecutive days, which can be a sign of depression, so that we can work with them to address it. We can run simulations where a shorter individual is taking over a shift from someone taller and consider what that means for the strain on the body.” Rebecca goes on to depict a scenario where exoskeletons populate a future factory floor, allowing people to work faster and lift heavier loads in a healthier way.
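To make the four-consecutive-days signal concrete, here is a minimal, purely illustrative sketch of how such a streak check might be implemented downstream of a facial-expression model. The `DayObservation` type, the field names, and the threshold handling are assumptions for illustration; they are not Siemens' actual system or data model.

```python
from dataclasses import dataclass

@dataclass
class DayObservation:
    """One worker-day, as aggregated from a facial-expression model (hypothetical schema)."""
    date: str
    smile_detected: bool

def flag_wellbeing_concern(observations: list[DayObservation], threshold: int = 4) -> bool:
    """Return True if there is a run of `threshold` or more consecutive
    observed days with no smile detected."""
    streak = 0
    for obs in observations:
        # Reset the streak on any smiling day; otherwise extend it.
        streak = 0 if obs.smile_detected else streak + 1
        if streak >= threshold:
            return True
    return False

# Example: four straight non-smiling days trigger the flag.
days = [DayObservation(f"day-{i}", smile_detected=False) for i in range(4)]
print(flag_wellbeing_concern(days))  # True
```

In practice such a flag would only be one weak signal among many, routed to a human rather than acted on automatically.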
This vision for a digitally optimized factory floor is forward-looking but not overly so. AI-enhanced human-machine collaboration in industry is already revealing its potential to revolutionize the way companies design, develop, manufacture, and operate. For example, the recently announced Siemens Industrial Copilot from Siemens and Microsoft is a generative AI-powered assistant that will allow users to rapidly generate, optimize and debug complex automation code, and significantly shorten simulation times. It can reduce tasks that previously took weeks to ones that take minutes, promising to boost productivity and efficiency across the industrial lifecycle. Using natural language, it can even assist maintenance staff with detailed repair instructions and give engineers quick access to simulation tools.
“Ultimately, as humans, we’re builders, we construct things,” states Rebecca. “As we evolve human-machine collaboration, we can really speed this up instead of having to jump through hoops to create something. You’ll be able to work with the system to identify goals and objectives and not get hung up on so many details.”
In discussing progress already made, Rebecca refers to the development of her team’s negotiation training application, where empathic avatars already operate in fully immersive, lifelike training scenarios to boost the negotiation skills of users. “Cloud and VR technology creates the negotiating environment. The latest AI advancements enable completely natural and engaging conversations. Empathic computing allows avatars to mimic human-like behavior and conduct real-time sentiment analysis.”
“We have taught avatars the variety of emotions that could be involved in bad and good negotiation manners. Now, Large Language Models are combined with photorealistic avatars and over 50 years of negotiation experience to present an instructive, digital counterpart to the user, one that negotiates with the user and, in doing so, analyzes and helps upskill based on the interactions.”
Rebecca considers the negotiation training application particularly relevant for senior managers in large corporations, who regularly need to negotiate for desired results; the application can deliver company-tailored training scenarios at scale. Yet the foundational R&D and innovation can help shape an even larger universe of interaction and collaboration: the industrial metaverse, the immersive digital environment that mirrors and simulates real-world systems to optimize processes and drive sustainable practices.
The industrial metaverse will be a place where engineers, workers – anybody, really – can experiment, test improvements and try out ideas. Using generative design and supported by their AI companions, people can explore new concepts and solutions just as they would in the real world, without incurring additional costs, consuming resources, or risking damage to physical objects. The empathic element is vital to realizing this vision: the more intuitive, emotionally intelligent, and easy to engage the AI companions are, the faster they can help solve problems, suggest alternative solutions and provide assistance.
Considering this artificial empathy and the broader enhancement of human-machine collaboration, Rebecca labels it as a positive disruption. “Ultimately, this is about how machines can help augment people into ‘power humans’. Things will get completely intuitive – no more mice or keyboards or monitors. It will be a bespoke interaction between user and system, empowering us to build better and deliver value at a terrific pace for the benefit of our future.”
Author: Daniel Bond. This article was initially published on Siemens Insights.