The Evolution of Digital Collaboration - Games meet Metaverse
2 people collaborating in a realtime 3d environment in the style of the game Human Fall Flat

As a digital transformation veteran, I have my fair share of scars from helping people adopt new technology, whether shipbuilders in India moving from a paper-based process to a new 3D CAD tool, or an energy supermajor adopting gaming technology for serious HSE training applications. One thing is clear: the pattern is similar. The initial effort is high, and as humans we resist the change with passion, but eventually the benefits start to shine through and it becomes business as usual.

My current obsession is the digital transition to remote, spatial, realtime collaboration, and I think this will become business as usual one day too.

Now you might think that this is nothing new. After all, we have had the internet for decades, giving us asynchronous collaboration via email and SharePoint. And since the pandemic, we have learned how to collaborate in realtime: video meetings let us talk more effectively, and great 2D collaboration tools like Miro and Figma let us work together on designs or ideas on boards in realtime. But we don't yet have the same thing established for three dimensions.

Tools like Miro and Figma allow us to remotely collaborate in realtime on 2D content

3D design - from specialist CAD tools to immersive engineering

As humans, one of our unique skills is our ability to conceive of, and build, things far bigger than any of us could build alone. Just take the Icon of the Seas, the largest cruise ship ever built. One look at it and most of us can imagine the project management, planning and coordination effort needed to get many smaller teams working together to deliver this megaproject in an incredibly short time. And while project management tools, SharePoint folders and OneDrives are an essential part of the toolkit used to build ships like this, so are 3D tools like CAD and rendering engines.

3D is now such an important part of shipbuilding that one Singaporean shipbuilder I worked with quite accurately called 3D the 'central platform for collaboration between key stakeholders', justifying the huge investments made in software and trained users in industries like shipbuilding. Yet traditionally 3D has been locked behind huge barriers to access, namely its complexity to learn and its cost.

While I think it's right to say that today's 3D tools have not yet had their Figma moment, we are getting closer. Since the reemergence of VR in 2015 there have been VR modes, or VR viewers, for most of the well-known CAD tools, so users doing their design work on 2D screens can put on a headset and see their designs in stereoscopic 3D. More recently there has been rapid development of new 3D design tools, creating new categories of 3D tool.

Spatial design tools like Bezi and Spline have arrived on the market very quickly. My take is that they take the traditional 2D input approach to 3D creation and combine it with the accessibility of the web, enabling multiple users to edit and collaborate in realtime from any 2D computing device. I suspect that this combination is driving relatively rapid user growth.

Bezi is like Figma for 3D, taking a traditional approach to the 3D authoring UI but adding interaction and collaboration all in a browser based form.

Immersive design tools like Gravity Sketch, ShapesXR and Arkio are taking longer to bring to market because they choose to take advantage of the fully 3D nature of tracked VR and MR devices. They allow users to both create and collaborate in fully immersive 3D spaces, with very rich interactions possible between the users, but they require a complete rethinking of the designer's UX.

Gravity Sketch puts the designer in the 3D space with the design and gives them intuitive tools, directly in their hands for a unique connection to the design process

All of the above new-generation 3D tools use either new web 3D technologies or gaming engines behind the scenes. These technologies accelerate their ability to bring novel features to market: multiplayer, support for many device types, procedural 3D modelling, APIs to other web platforms, and cloud storage.

So it makes sense that the CAD tool providers, who have to build on proprietary CAD kernels, have stayed a few steps behind. However, the enterprise 3D space has watched and listened, and in the next year or two we will see more immersive engineering applications coming to market. Most notably, Siemens and Sony have partnered to bring a unique headset to market for the professional engineer (https://www.youtube.com/watch?v=jiikr42fQUE) for their NX CAD platform, and Shapr3D (https://www.shapr3d.com/) are working on an Apple Vision Pro app for their iPad-based CAD tool.

As far as I can tell, both Siemens and Shapr3D are taking a hybrid approach to UX: most authoring happens on a 2D canvas on a traditional 2D device, some authoring happens on a 2D canvas floating in 3D space on fully immersive devices, and collaboration happens mostly in the immersive mode. That makes sense to me.

Presence

There is a lot going on in the section above, but what I can see clearly emerging is the addition of presence to 3D applications.

It goes without saying that presence is needed to enable realtime digital collaboration. Just imagine a simple task like carrying a washing machine up a flight of stairs: without voice and visual presence it would be very hard to coordinate the cooperative nature of that task.

In realtime digital applications I see the following main paradigms emerging for presence:

Presence as a 3D cursor

In Bezi, which presents 3D on a 2D canvas, realtime collaboration is represented by a 3D cursor for each of the other users. The cursor takes into account the angle from which they are viewing the 3D model on their screen. This is simple and relatively easy to implement, and when combined with realtime voice communication the effect is very functional.
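The appeal of the 3D-cursor paradigm is how little state it needs: a user ID, a camera position and a view ray are enough to draw everyone else's cursor. The sketch below illustrates this with a hypothetical message shape (not Bezi's actual protocol) and a helper that intersects a remote user's pointer ray with a ground plane to decide where their cursor lands.

```python
from dataclasses import dataclass

# Hypothetical presence message -- names are illustrative, not from any
# real product's network protocol.
@dataclass
class CursorPresence:
    user_id: str
    origin: tuple[float, float, float]     # camera position in world space
    direction: tuple[float, float, float]  # view/pointer ray

def cursor_on_ground_plane(p: CursorPresence, plane_y: float = 0.0):
    """Intersect the user's pointer ray with the horizontal plane y = plane_y.

    Returns the 3D point where the remote cursor should be drawn, or None
    if the ray is parallel to the plane or points away from it.
    """
    ox, oy, oz = p.origin
    dx, dy, dz = p.direction
    if abs(dy) < 1e-9:
        return None           # ray parallel to the plane
    t = (plane_y - oy) / dy
    if t < 0:
        return None           # plane is behind the camera
    return (ox + t * dx, plane_y, oz + t * dz)
```

In a real app the same ray would be intersected against the model's geometry rather than a flat plane, but the principle is the same: transmit a tiny ray, reconstruct the cursor locally.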

Conceptual image from the Bezi website showing how their collaboration mode works.

In the enterprise engineering space, Microsoft and Hexagon AB announced a 3D viewer that takes advantage of Microsoft's Live Share capability. As far as the screenshots show, this uses the same 3D cursor approach, essentially showing where the other users are in space and the angle from which they are viewing the 3D model on their screens. The integration into MS Teams is very useful here for realtime voice and video communication and for process-related collaboration like taking notes and recording actions.

Hexagon's prototype app for Microsoft Teams with Live Share.

Presence as an Avatar

In immersive 3D apps like Gravity Sketch, ShapesXR and Arkio, the fully immersive approach puts the user digitally into the same space, thus requiring an avatar of some kind. This avatar can be a simple abstract representation, such as a floating headset and hands, or a fully embodied digital humanoid, each with its pros and cons.

In my experience, the combination of the full immersion of the headset and the realtime updates of the other users' heads and hands is very convincing: you genuinely feel co-present with the other person.

The image above shows two people in a Gravity Sketch collab room. If they are co-located and using mixed reality mode, this is what they would actually see; if they were remote, an avatar would be needed to represent the other user.

For example, in one of my most memorable moments of my time at Gravity Sketch, I (or my avatar) was inside the headlight module of a car that a student was designing, and next to me at the same small scale was a designer who specialised in light design. Meanwhile, at a far larger scale in the same session, other designers were working on the body panels and wheels. I remember that session as if I had physically been there.

So presence via avatars in the 3D space not only gives the user more feedback on where the other users are and what they are doing, it also makes you feel like you are there with them: you get a basic sense of their body language, and you can see whether they are paying attention to you or engaged in something with someone else. Moreover, when combined with spatial audio or proximity chat, multiple conversations can happen at the same time, turbocharging the richness and speed of collaboration on large and complex projects.
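The mechanic behind proximity chat is simple: each speaker's voice is attenuated by the distance between avatars, so nearby groups hear each other clearly while distant groups fade out, allowing parallel conversations in one session. Here is a minimal sketch of one such attenuation curve; the `near`/`far` parameters and the linear falloff are illustrative assumptions, as every app tunes its own curve.

```python
import math

def proximity_gain(listener, speaker, full_volume=1.0, near=1.5, far=12.0):
    """Return the playback volume of a speaker's voice for one listener.

    Within `near` metres the voice plays at full volume; beyond `far`
    metres it is silent; in between it falls off linearly. Positions are
    (x, y, z) tuples in world space.
    """
    d = math.dist(listener, speaker)
    if d <= near:
        return full_volume
    if d >= far:
        return 0.0
    return full_volume * (far - d) / (far - near)
```

Because the gain is computed per listener, two pairs of users standing at opposite ends of a large model can hold independent conversations without muting anyone explicitly.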

Arkio showing clearly how multi scale allows different users to work on a large model synchronously and have multiple collaborative conversations

Presence as an avatar, but on a 2D screen - 2.5D collaboration

Presence in fully immersive 3D via a tracked headset is, in my opinion, the pinnacle of realtime collaboration, but many people are still not ready to put their heads in a headset for hours at a time. Luckily there is a third way, and probably the one that much of the next generation of the workforce is already very familiar with.

It is of course possible to be represented in 3D space as an avatar, and to collaborate there, without wearing a tracked headset. Hundreds of millions of us do that every day, in games.

Being a gamer, I spend a decent amount of my time in live collaborative scenarios in digital 3D worlds, i.e. multiplayer games. In those games, I collaborate with my friends to different degrees. Sometimes we are simply shooting shit, where the extent of our collaboration is shouting 'he is over there behind the green hut!', or in my case 'I'm down, I need a medic!' And because we are in a perfectly synchronized, detailed 3D environment with minimal lag, our collaboration works.

Recently my kids and I have discovered the joy of multiplayer pirating in Sea of Thieves, collaboratively sailing the virtual seas and coordinating our navigation and operation of the ships in realtime. They are growing up knowing intuitively how to collaborate with people, regardless of their location on the planet, thanks to well-made experiences like SoT.

An example of good cooperative play in Sea of Thieves by RARE.

Multiplayer games are getting good at adding these tools to unlock a more engaging, collaborative style of gameplay. So much so that one could argue this is the dominant style of game emerging (think Fortnite, Roblox, Minecraft).

I think that 2.5D collaboration has a lot of potential in the enterprise space but a merging of the traditional approach and the gaming approach is needed.

Agency and Engagement

Charlie Fink made this excellent slide in his Forbes article about the future of work. It clearly shows how digital experiences that combine presence (discussed above) and agency generate the most engagement.

The Venn diagram of Presence and Agency by Charlie Fink

Improving agency and engagement in 2.5D

2.5D apps (games) are generally poorer at giving the user agency than fully immersive (XR) experiences, as the user interacts via an abstract controller rather than directly with their hands. For example, opening a door in a game is simply a matter of looking at it and pressing a button, whereas in VR it can be a full 1:1 interaction, just like the real world: reach out --> grab handle --> turn handle --> pull door. VR allows rich 1:1 interactions like this, which is why it's so effective in training scenarios.
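The contrast between the two interaction models can be sketched as the same door exposed through two interfaces: a single 2.5D "use" action versus a step-by-step VR sequence that mirrors the physical motions. Class and method names below are illustrative, not from any particular engine.

```python
# One door, two interaction models. In 2.5D a single button press opens it;
# in VR each physical step (grab, turn, pull) is a separate tracked action,
# enforced in order -- which is exactly what makes VR useful for training.
class Door:
    def __init__(self):
        self.handle_grabbed = False
        self.handle_turned = False
        self.open = False

    # --- 2.5D path: look at the door, press a button ---
    def use(self):
        self.open = True

    # --- VR path: each step only works after the previous one ---
    def grab_handle(self):
        self.handle_grabbed = True

    def turn_handle(self):
        if self.handle_grabbed:
            self.handle_turned = True

    def pull(self):
        if self.handle_turned:
            self.open = True
```

The VR path carries more agency precisely because skipping a step (pulling before turning the handle) does nothing, so the user must perform the real-world sequence.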

So how does one give a player agency in a game?

My gaming buddies and I always notice when there is a really good multiplayer co-op feature, and we have even started to call it 'metaversy'. So I asked them what kinds of features make for a great cooperative multiplayer experience. This is what they said:

  • Perfectly synced objects - both their position and their physics
  • The ability to point and gesture (emotes)
  • Simultaneous player-to-player interaction (helping, grabbing inventory)
  • Proximity-based audio chat
  • Realtime shared building capability - a preview of where the block I want to place is shown to others
  • Tasks that can only be completed cooperatively - you pull this rope while I slide a block in
  • Perfectly synchronized maps, or other documents needed to do the mission
  • Using character physics in cooperative mode (think of a seesaw)
  • The use of virtual cameras - one player watching a virtual security monitor, or a helmet cam, guiding the other from room to room
  • Tagging - marking a point or an area and tagging it with different meanings/colours
  • An integrated shared 2D sketchpad to communicate ideas quickly, perhaps augmented with AI to straighten lines and extrude sketches into the 3D environment
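To make one item on the wish list concrete, here is a sketch of the shared building preview: before a block is committed, its ghost position is broadcast so every other player sees where it will land. The `Session` class and its methods are hypothetical, invented for illustration rather than taken from any game.

```python
# Minimal sketch of a shared-building session: each player's proposed
# ("ghost") block is visible to everyone before it becomes part of the
# authoritative world state.
class Session:
    def __init__(self):
        self.ghosts = {}   # player name -> proposed block position
        self.blocks = []   # committed, authoritative world state

    def preview(self, player, position):
        """Player moves their ghost block; all clients render the update."""
        self.ghosts[player] = position

    def commit(self, player):
        """Turn the player's ghost into a real block for all players.

        Returns the committed position, or None if there was no ghost.
        """
        pos = self.ghosts.pop(player, None)
        if pos is not None:
            self.blocks.append(pos)
        return pos
```

In a networked implementation, `preview` would be a cheap, frequently sent message while `commit` would be validated by the host, but the split between tentative and authoritative state is the core of the feature.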

There are some notable examples of games that really nail the collaborative aspect of gaming.

Human Fall Flat by No Brakes Games gives the gamer avatar usable hands, which unlock all sorts of interactions for collaboration.

Sea of Thieves by Rare allows rich collaboration, where users must work together to sail the ship effectively, maintain it and fight enemies.

Star Citizen by Cloud Imperium Games is an ambitious space exploration and economy game with excellent use of shared screens and Human-Machine Interfaces (HMIs).

Most first-person shooter (FPS) games also qualify, as players must instinctively communicate with each other to react instantly to threats.

Rec Room by Rec Room probably features most of the wish list above in one way or another. I think this is because it was originally developed for VR, which led to a few key design decisions. Firstly, the user has hands, which translated over to 2.5D in a really nice way. Secondly, unlike in Roblox or Fortnite, nearly all the games and experiences in Rec Room were created in the Rec Room app itself rather than externally in a game engine. This means creators work entirely with the in-game tools and, moreover, can collaborate with other players to build bigger, more complex experiences together (Minecraft has this too).

Rec Room creation showing shared UI, and how the avatar clearly shows others what the user is doing

Here is my example of Rec Room's creator tools in action

Collaborating in the Industrial Metaverse: will it be 2.5D, Mixed Reality or fully immersive VR?

Most people associate the Metaverse with virtual reality, which today usually means a fully tracked 3D headset. However, as we have seen above, 2.5D experiences can deliver many of the benefits of a VR app in a more accessible way.

Just like the way we use a mixture of mobiles, tablets, laptops and desktops for our flat compute today, I think we will use a combination of computing devices to access and collaborate in the Industrial Metaverse.

If the kind of collaboration to be done is very focused, perhaps hands-on work requiring what would be a face-to-face interaction in real life (IRL) but done remotely, then a VR headset might be the best choice, as it gives the users the richest form of presence and agency.

If the collaboration is more of a 'come and take a look at this' task, where many users only need to look and review, then a 2.5D approach would be completely valid, increasing accessibility with only a slight reduction in focus. (A great example of this is Gravity Sketch's 2D mode for collab rooms, which is mostly used by design managers.)

If the collaboration required is on site, requiring hands-free compute but with digital content that does not yet exist IRL, then mixed reality mode on a headset would be appropriate.

And if the collaboration revolves around remote review of 2D documents, then a good old-fashioned video call would work fine too - perhaps in a spatial computing device like the Apple Vision Pro, once the size comes down.

What is clear is that it's not a black and white decision, and app developers need to consider making their apps multimodal.

In Summary

  • 3D is evolving to have multiplayer realtime collaboration, as shown by the new wave of spatial design apps and immersive design apps
  • There are several approaches to presence in collaborative 3D apps
  • Immersive design apps are leading the way in creating new types of remote digital collaboration, but remain inaccessible for some
  • Gaming is leading the way in creating engaging realtime multi-user apps, both in fully immersive 3D and in 2.5D on flat screens
  • 2.5D collaboration apps can give users some of the presence and some of the agency of immersive apps
  • Enterprise app developers have a lot to learn from the gaming world, and this is effectively a new market
  • Ultimately we will use a combination of computing modes to access the Industrial Metaverse









David Thomson

Product Director with Digital Marine & Platform Focus | AI, IoT, XR and PLM | AVEVA, IBM, Gravity Sketch Alumni