DIGITAL TWINS AND THE METAVERSE: WHAT’S THERE FOR ROBOTICS?
PROLOGUE
Let us begin with an apology. I have not paid attention to LinkedIn since… well, ever since I created my account, I guess. It is quite easy, as a CEO who is also part of the dev team, to get lost in my daily activities. Hence, I am sorry if I have not followed up on many of the requests I received, be it to answer messages or write a few recommendations. I will try to make up for that by being more present around here, starting by sharing some thoughts on the future of #robotics. Or, at least, on what many believe to be the future of robotics, myself included. Feel entirely free to step in and share your thoughts in the comments!
INTRODUCTION
For those who might not be aware of it, when Ingeniarius was founded in 2014, our value proposition was to offer internet-connected solutions to our clients. This was a hot topic back then, as #iot was finally becoming a tangible reality, with cellular and embedded technologies constantly on the rise. We then developed a series of smart solutions connected to our own cloud, ranging from #wearables for motion capture and positioning to sports-specific devices. Among these, obviously, we developed a few mobile robots, mostly for academia.
For a small company like ours (bear in mind that we were only 3 back then), developing these internet-connected solutions was a cumbersome undertaking – not only did we have to design all the mechatronics, but we also had to develop many layers of software. For internet-connected robots, for instance, we had to develop (i) the firmware dealing with the kinematics and control of the platform; (ii) the Robot Operating System (#ros) software packages dealing with path planning, simultaneous localization and mapping (SLAM), decision-making and more; (iii) the WebSocket bridge sharing data between robots and our cloud; (iv) the backend server managing all data and making it available to users; and (v) the user interfaces to interact with such robots. Furthermore, all of that had to be done while considering security, fault tolerance, usability and trustworthiness. As you have probably guessed by now, we eventually decided to concentrate our efforts on a more specific domain, field robotics, and, while we still often deal with many of the aforementioned layers, most of our current work ends up revolving around firmware and ROS software development.
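To make layer (iii) a bit more concrete, here is a minimal, present-day sketch of such a WebSocket bridge, assuming a ROS 2 node written with rclpy and the websocket-client Python package (our 2014 stack looked quite different); the topic name and cloud endpoint are hypothetical placeholders, not our actual infrastructure.

```python
# Minimal sketch of a ROS 2 -> cloud WebSocket bridge (hypothetical endpoint and topic).
import json

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry
from websocket import create_connection  # pip install websocket-client


class CloudBridge(Node):
    def __init__(self):
        super().__init__('cloud_bridge')
        # Hypothetical cloud endpoint; replace with your own backend.
        self.ws = create_connection('wss://cloud.example.com/robot/telemetry')
        self.create_subscription(Odometry, '/odom', self.on_odom, 10)

    def on_odom(self, msg: Odometry):
        # Forward a compact JSON snapshot of the robot pose to the cloud.
        pose = msg.pose.pose
        self.ws.send(json.dumps({
            'x': pose.position.x,
            'y': pose.position.y,
            'stamp': msg.header.stamp.sec,
        }))


def main():
    rclpy.init()
    node = CloudBridge()
    try:
        rclpy.spin(node)
    finally:
        node.ws.close()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```

In a production setup, the same bridge would also need authentication, reconnection logic and message batching, which is exactly where the extra layers of work mentioned above come from.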
DIGITAL TWINS
Fortunately, over the past 9 years, a lot has changed. Where before we needed to develop so much from scratch on our own, there are now plenty of services, tools and frameworks one can use when designing such internet-connected solutions. #microsoftazure, #amazonwebservices and #googlecloud are just some of the best-known platforms offering services and products that support a wide range of IoT use cases. And with such platforms, together with the advances in GPUs and internet connectivity, came the possibility of virtually mirroring elements of the real world – something NASA (National Aeronautics and Space Administration) did more than a decade ago in the aerospace field – fed by data from devices deployed in the real world. While this concept was anticipated in the 90s, it was only coined as #digitaltwin many years later.
So, where do Digital Twins meet robotics? Not so differently from the aerospace industry, having the computer-generated equivalent of a robot can be useful for its virtual deployment under challenging tasks, which allows us to develop and improve algorithms without all the real-world burden. But it does not stop there – I mean, if it did, it would be no different from running a robotic simulator, and the ROS community has its fair share of those, including the popular #gazebo from Open Robotics. The key difference is that, unlike a simulation, a digital twin is backed by real-time data and a two-way flow of information between the virtual and the real world. This bridging of the gap between physical and digital worlds increases the accuracy of the robotic models, which enables us to further mature solutions and test them under complex conditions significantly faster, and at a lower cost, than with real-world field testing.
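As a rough illustration of that two-way flow, and nothing more than an illustration, a digital twin keeps its state synchronized with telemetry streaming from the real robot, while decisions validated on the twin are pushed back as commands. The class below is a hypothetical sketch of that idea and not the API of any of the platforms mentioned next.

```python
# Conceptual sketch of the two-way flow behind a digital twin (hypothetical interfaces).
from dataclasses import dataclass


@dataclass
class RobotState:
    x: float = 0.0
    y: float = 0.0
    battery: float = 1.0


class DigitalTwin:
    """Mirrors a real robot: ingests live telemetry, returns validated commands."""

    def __init__(self):
        self.state = RobotState()

    def ingest_telemetry(self, telemetry: dict):
        # Real -> virtual: keep the twin synchronized with the physical robot.
        self.state.x = telemetry['x']
        self.state.y = telemetry['y']
        self.state.battery = telemetry['battery']

    def plan_next_command(self, goal: tuple) -> dict:
        # Virtual -> real: evaluate a candidate action on the twin before
        # sending it to the field (here, a trivial "move towards goal" step).
        if self.state.battery < 0.2:
            return {'action': 'return_to_dock'}
        dx, dy = goal[0] - self.state.x, goal[1] - self.state.y
        return {'action': 'move', 'dx': dx, 'dy': dy}


if __name__ == '__main__':
    twin = DigitalTwin()
    twin.ingest_telemetry({'x': 1.0, 'y': 2.0, 'battery': 0.8})
    print(twin.plan_next_command((5.0, 2.0)))  # {'action': 'move', 'dx': 4.0, 'dy': 0.0}
```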
Which Digital Twin platform to use in robotics really depends on your requirements, though the choices are still limited, of restricted access, or expensive. The US company Duality AI developed its own Digital Twin simulator for robotics, called Falcon, which builds on the #unrealengine ecosystem. NVIDIA has been moving at a fast pace, showcasing promising results by combining its Omniverse and Isaac Sim tools. Likewise, Microsoft has also shown interesting results by combining Azure Digital Twins with the Unreal Engine and, although that demonstration is not about robotics, it does integrate LiDAR scan data overlaid with the digital model of a building.
If you are interested in Digital Twins, you will have no trouble finding many articles out there. Still, I invite you to read the following LinkedIn article by Richard Kerris, VP of the Omniverse Developer Platform at NVIDIA.
METAVERSE
I would say that the applicability and impact of Digital Twins within the robotics domain is quite clear. Yet, when the term Digital Twin is mentioned, the term #metaverse generally comes along with it. Does that mean we will soon have robots in the Metaverse? Well, yes, in a way, though the answer is not that simple, especially because there are different perspectives on the use of robots, or #cyberphysicalsystems in general, in the Metaverse. Before sharing some of those, here is a bit of background on what the Metaverse means.
In the same way that the term “robot” (“forced labor”) was coined by the Czech playwright and novelist Karel Čapek in his fiction play Rossum's Universal Robots (RUR) in 1920, the term “metaverse” (“meta” + “universe”) was coined by the American writer Neal Stephenson in his science fiction novel Snow Crash in 1992. Out in the real world, we could say that, like many other things, it started in the entertainment industry. You might recall the multiplayer online virtual world #secondlife, from Linden Lab, which allowed users to interact with each other and even with user-created content. Many believe that Second Life was a pioneer of this genre, though I still remember spending a few hours on Active Worlds (Activeworlds Inc.) myself back in the late 90s. Be that as it may, we could argue that this was the beginning of the Metaverse for the following reasons: (i) unlike other videogames, users do not have a designated goal in these virtual worlds, with no traditional gameplay mechanics or rules in place, the focus being mostly on social interaction, user-generated content (including having your own brand and business) and general user freedom; and (ii) users have access to an in-game currency, with real exchanges of goods and services taking place under such an economy, some of which affect the real world as well (e.g., online services).
So where do these and other multiplayer online virtual worlds differ from the overall idea of a Metaverse? There are a few points of difference, such as the #vr immersion that is expected (though not mandatory) and the level of #scalability required for it to be fully accessible to anyone. Nevertheless, the major difference lies in the fact that the Metaverse intends to provide the same level of affordances to users, if not more, than the real world does, especially in terms of economy. Put differently, while these online virtual worlds do have an economy, users’ ability to make money or even purchase real-world products is still limited. The Metaverse intends to be a virtual extension of our reality, merging the physical and digital dimensions, where #ai, #iot and other technologies, such as #robotics, will play an essential role.
General view aside, companies from all over the world, such as Microsoft, Meta, Epic Games, Unity, NVIDIA, Adobe, Sony, Hyundai Motor Group, Qualcomm, InOrbit.AI, Duality AI, Roboverse Reply DE, and many others, all have their own specific purpose and vision of the Metaverse. Meta intends to expand on its current efforts in the social networking business, embracing a more social Metaverse. This also implies the business side of social networking, having, as a selling point, the potential to bring companies into the Metaverse. Of course, not everyone agrees with such a vision, with many concerned about what the lack of real-world interaction with colleagues may bring. Microsoft, besides the social and even gaming perspectives of the Metaverse (which may go beyond Meta’s vision), additionally embraces a more industrial version of the Metaverse (so far rather dully named the #industrialmetaverse). This vision, interconnecting IoT, Digital Twins and the Metaverse, is aimed mostly at businesses, with a strong positive impact on the design and building of new products, as well as on optimizing operations within companies. Microsoft is not alone in such efforts, having built strong collaborations with other big players, such as Epic Games, and also with recent players, such as Roboverse Reply, with the latter, as the name suggests, having a stronger emphasis on robotics. A different perspective has been shared by Chang Song, President and Head of the Transportation-as-a-Service Division of Hyundai Motor Group. Hyundai Motor, besides being a well-known vehicle manufacturer, has been investing in self-driving delivery robots. More recently, they have also stepped into the metaverse domain or, as they call it, metamobility, which intends to go beyond physical presence in the virtual world, using robots as avatars to effect changes in the real world. In a nutshell, the difference between The Matrix and Surrogates, right?
So, there is, indeed, a relationship between robotics and the Metaverse after all. It might, however, not be (only) about using robots as real-world surrogates – at least not just yet. Hence, what can the Metaverse offer to roboticists in the short term?
Florian Pestoni, CEO of InOrbit, started 2023 by launching a new Medium publication called “Welcome to the Roboverse”. The very first article published there (a republication of his 2022 Forbes article) presents his vision of the portmanteau of “robot” and “universe”, the Roboverse – to be honest, I was unable to find who first introduced this term, but I recall it being used in an episode of The Simpsons years ago. Anyway, the fact remains that Florian’s vision is very pragmatic: while the Metaverse promises to help us escape the world we live in, the Roboverse intends to make our real lives, in the real world, better. Even though I would not go to such lengths, because I do believe that the Metaverse, utopically, might make our real lives better in the future, I completely agree that the Roboverse, as a particular case of the Industrial Metaverse, will have a completely different impact on our quality of life. The main reason is that we will have access to physical non-player characters (#npc) in the real world, the same way we have NPCs in the virtual world, with all the perks these can offer.
For those familiar with massive multiplayer online role-playing games (MMORPGs), such as World of Warcraft, Final Fantasy XIV, Tera and many others (some of which I have spent many hours on in the past), NPCs are a sort of “supporting cast”, helping the player every now and then by offering advice on how to progress further in the game and even collaborating on specific tasks (e.g., fighting side by side to defeat common enemies). We have had robot guides in museums for a while now, as is the case of SoftBank Robotics’ Pepper, which has been used in a few museums worldwide, including the Smithsonian National Museum of African Art in Washington and the Museum of Modern Art in Barcelona. Likewise, we already have collaborative robots, or #cobots, capable of some level of cooperation with humans in factories, as is the case of some Universal Robots manipulators, or even mobile robotic solutions that are now starting to materialize.
So, if these solutions are already out there and they are not enhanced by this whole Metaverse or Roboverse paradigm, then why do we need it in the first place? Because of #connectivity. More than a decade ago, a different term was used to express the advantages of (inter)connected robots, which would leverage robots’ abilities through task offloading and information sharing: cloud robotics. Cloud robotics was intended to support operations with distributed sensing, computation and memory. Furthermore, knowledge sharing and skill acquisition were key selling points of cloud robotics – a robot could acquire sensory data about a specific glass cup, learn how to hold it without breaking it or spilling its content, and share this knowledge with another robot deployed elsewhere in the world. The RoboEarth project, funded by the EU, was a good step in this direction, and it led to Rapyuta’s DevOps platform for robots. This is also the business model of the US company InOrbit, which even coined the term RobOps – watch this cool video inspired by the TV show 24 as an example.
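To picture what that knowledge sharing could look like in practice, here is a deliberately simplified sketch in which one robot uploads learned grasp parameters to a shared cloud store and another robot, anywhere in the world, retrieves them. The REST endpoint and payload are purely hypothetical and not tied to RoboEarth, Rapyuta or InOrbit.

```python
# Hypothetical cloud knowledge-sharing sketch (endpoint and schema are made up).
import requests

CLOUD_API = 'https://cloud.example.com/api/skills'  # placeholder endpoint


def share_grasp_skill(object_id: str, grip_force_n: float, approach_angle_deg: float):
    """Robot A publishes the grasp parameters it learned for a given object."""
    payload = {
        'object_id': object_id,
        'grip_force_n': grip_force_n,
        'approach_angle_deg': approach_angle_deg,
    }
    response = requests.post(CLOUD_API, json=payload, timeout=5)
    response.raise_for_status()


def fetch_grasp_skill(object_id: str) -> dict:
    """Robot B retrieves the shared grasp parameters before attempting the task."""
    response = requests.get(f'{CLOUD_API}/{object_id}', timeout=5)
    response.raise_for_status()
    return response.json()


if __name__ == '__main__':
    share_grasp_skill('glass_cup_250ml', grip_force_n=4.5, approach_angle_deg=30.0)
    print(fetch_grasp_skill('glass_cup_250ml'))
```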
What is the difference between the Roboverse and cloud robotics, you ask? The Roboverse inherently includes the concept of cloud robotics, but it is much more than that, with Digital Twins playing a key role. Marek Matuszewski, manager at Roboverse Reply, briefly described this general idea at Microsoft Inspire 2021 by demonstrating it with Boston Dynamics’ Spot dog-like robot. Likewise, the business model of Duality Robotics also revolves around the use of Digital Twins and the Roboverse to accelerate the deployment of AI models and robotic systems.
All in all, we could state that Digital Twins ensure the practical applicability of the Metaverse to the real world, with the particular case of the Roboverse being the most promising one, which could be split into 3 main perspectives: industrial, enterprise and consumer.
The industrial perspective is the one that would be most valuable for a tech company like Ingeniarius, while the enterprise perspective would make complete sense for our clients managing robots installed in their facilities or their clients’ facilities. The consumer perspective affects the general population, though I believe that, among the 3, it might take the longest to get there.
CONCLUSION
From where I stand, the future seems brighter than ever, with a seamless integration between the real and virtual worlds. Being an expert in robotics myself, though not in the Metaverse, I still feel that it will take time to reach an omnipresent Metaverse. We will probably see several Metaverses, driven by leading companies such as those described in this article and others, each focusing on different perspectives (industrial, enterprise and consumer). Digital Twins, however, are already here, be it for robotics or other domains.
Does this raise any concerns? Sure, it does. Which new technology, especially an internet-connected one, does not? #cybersecurity and #privacy are of major concern, and many efforts and advances are being made by the big players out there to make sure that these problems are mitigated – mark the word mitigated, because it is simply not possible to address all the hurdles related to the Metaverse, or even the internet for that matter. However, there are ways to cope with those to some extent, starting with #standardization and #regulation.
Standards are crucial in any industry, and robotics is no exception. ROS, for instance, started more as an academic standard than an industrial one. However, over the years, and thanks to the many efforts of the ROS community, the ROS-Industrial consortium and the many companies adopting it, Ingeniarius included, ROS has aligned itself more and more with industrial standards. Note that, by ROS, I mean both ROS 1 and ROS 2, with the latter even more aligned with industrial standards, as it offers a real-time data distribution service (DDS) protocol with quality-of-service profiles for better scalability and security. Back to the Metaverse, big players, including Microsoft and Meta, have joined efforts to foster #interoperability for the Metaverse, which led to the creation of the Metaverse Standards Forum (MSF). We already have the Universal Scene Description (USD) ecosystem (by Pixar Animation Studios) as an open standard for describing, composing, simulating and collaborating within the Metaverse. However, much more is required, and the MSF is definitely here to help.
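Going back to ROS 2 for a moment, here is a small, generic rclpy example of what such a quality-of-service profile looks like in code; the topic name is just a placeholder, and the specific policies shown are only one of many reasonable combinations.

```python
# Example of a ROS 2 (rclpy) publisher configured with an explicit DDS QoS profile.
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, ReliabilityPolicy, DurabilityPolicy, HistoryPolicy
from std_msgs.msg import String


class StatusPublisher(Node):
    def __init__(self):
        super().__init__('status_publisher')
        qos = QoSProfile(
            history=HistoryPolicy.KEEP_LAST,
            depth=10,
            reliability=ReliabilityPolicy.RELIABLE,       # lost samples are retransmitted
            durability=DurabilityPolicy.TRANSIENT_LOCAL,  # late joiners receive the last message
        )
        self.pub = self.create_publisher(String, '/robot/status', qos)  # placeholder topic
        self.timer = self.create_timer(1.0, self.tick)

    def tick(self):
        msg = String()
        msg.data = 'ok'
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(StatusPublisher())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```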
Regulation is also key, though it is hard to regulate something that is not completely established yet. Furthermore, regulation takes time and is difficult to apply globally. Nonetheless, the European Union (EU) intends to be the driving force behind the global regulation of the Metaverse. They have signed up for a huge undertaking, that is for sure, as regulation might have to cover a wide range of dimensions, such as cybersecurity, privacy, copyright, IP, the regulation of #nft and #cryptos, and so much more. Yet, it must be done, and better sooner rather than later, especially because regulation tends to lag behind the pace of technology.