Portfolio - Project Avalon


This article presents my artistic and technological contributions to one of the greatest projects I have had the pleasure to work on. Before I start, I want to say up front that this wouldn't have been possible without the motivation, encouragement and energetic (daily) elevation by my friend Oliver Koch (Creative Director & Hero) and the whole team: Sebastian Lüdke (Senior 3D / Env & Texture Artist), Edward Andrukaniec (Senior 3D Artist), Christian Kratzert (Senior Animation Supervisor), Heiko Sülberg (Senior FX Artist), Clemens Endre? (Senior Sound Designer), Stefan Meyer (Unreal Developer & OptiTrack Superpro), Andre Bürger (Frontend Developer & 3D Generalist), Robert Chapman (Senior FX Supervisor & FX Lead) and Fabian Deiss (Unreal & Sound Technician). They are all outstanding artists, technicians and fine people. The guys did an amazing job pushing Unreal to its visual and acoustic limits without glitches in VR, which is still a technical and artistic challenge.

But I also want to thank Marie Jo Tucholski for ALL her support and for letting me transform our kitchen, living room and bedroom into a messy lab for months.

Here are some gameplay screenshots (for which I can take no credit at all).



With "Project Avalon" we've created an escape room like location based multiplayer VR experience for a cyber-security awareness training, where up to 4 players can operate on a 10x10meter tracking area. Wireless VR-headsets, DMX systems like air-scent, heaters, fans, motion control chairs, custom hardware and spatial sound are pushing this to another level of immersion.


Slim13 OptiTrack cameras we used for active real-time tracking (photo by Oliver Koch)


Tracking area in an early production stage


DMX fans for creating wind


Consumer heater with an Arduino-hacked IR remote for DMX control (photo by Oliver Koch)


4 motion control bases from Motion Systems with custom chairs from Cleeman Seats for VR flight sequences

For about 14 months I was the technical lead, responsible for avatar locomotion, rigging, hardware integration, prototyping custom tracking devices/robots, and developing/programming the multiplayer VR architecture (together with Andre and Stefan).

As you can see below, we came up with quite a complex system, with UE4 as its core. We used a listen server that handled the real-time tracking data, the connection to a custom ESP32 server (motion control for custom robotics and devices), DMX and subwoofer sound. A powerful client workstation per user took care of rendering, sent the visual content to an HMD via WiFi 6, controlled a single motion seat and processed spatial sound (using FMOD). In addition, we fused the headset's inside-out head and hand/finger tracking data with custom OptiTrack tracking mounts.

Overview of our system architecture and hardware flow
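To make the data flow a bit more concrete, here is a hypothetical sketch of the per-player tracking state the listen server could distribute every frame. The struct name, fields and layout are purely illustrative, not the project's actual replication structs.

```cpp
// Hypothetical per-player tracking state shared by the listen server each frame
// (illustration of the data flow only, not the production code).
#include "CoreMinimal.h"

struct FPlayerTrackingState
{
    FVector HeadPosition  = FVector::ZeroVector; // fused OptiTrack + inside-out data
    FQuat   HeadRotation  = FQuat::Identity;     // from the headset's inside-out tracking
    FVector LeftWristPos  = FVector::ZeroVector; // custom OptiTrack wrist trackers
    FVector RightWristPos = FVector::ZeroVector;
    FVector LeftFootPos   = FVector::ZeroVector; // custom shoe trackers
    FVector RightFootPos  = FVector::ZeroVector;
};
```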

Our system and the content were built around some special requirements for a couple of tasks where you simply can't buy off-the-shelf solutions. So the following sections show some of the custom IP I was mainly responsible for.


"Custom ESP32-Unreal server"


This ESP32 server was programmed to control 4 custom devices, which are explained later. It was connected via serial (USB) to the main Unreal server, processed data bidirectionally and communicated with its device clients via the WifiNow protocol.
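To give an idea of how such a bridge can look, here is a minimal, simplified sketch assuming the Arduino ESP32 core and Espressif's ESP-NOW API as the wireless transport. The line-based command format, baud rate and broadcast address are placeholders, not the production protocol.

```cpp
#include <WiFi.h>
#include <esp_now.h>

// Hypothetical broadcast address reaching the four device nodes (placeholder).
static uint8_t kBroadcastAddr[6] = {0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF};

void setup() {
  Serial.begin(115200);            // USB serial link to the Unreal listen server
  WiFi.mode(WIFI_STA);
  if (esp_now_init() != ESP_OK) {
    Serial.println("ERR:espnow_init");
    return;
  }
  esp_now_peer_info_t peer = {};
  memcpy(peer.peer_addr, kBroadcastAddr, 6);
  peer.channel = 0;
  peer.encrypt = false;
  esp_now_add_peer(&peer);
}

void loop() {
  // One command per line from Unreal, e.g. "DEV2:MOVE:120\n" (format is made up here).
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    esp_now_send(kBroadcastAddr, (const uint8_t*)cmd.c_str(), cmd.length());
    Serial.print("ACK:");          // answer back over serial -> bidirectional link
    Serial.println(cmd);
  }
}
```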


"The Artefact"

Concepts by Oliver Koch


Final 3D print and hardware implementation

The artefact was supposed to be THE item the users have to find during the experience. You can grab this item for real, so it has 2 representations: the virtual design you see in VR (see concepts) and the real, haptic prop you're holding in your hand (image above). To fuse both worlds, we used a special real-time tracking component from OptiTrack called an "active tag". It's essentially a PCB (with an IMU in this case) and up to 8 diodes you can place/install as needed. After calibration, you get a super precise pivot/transform you can attach a virtual object to.
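In Unreal, attaching the virtual artefact to that pivot essentially means driving the actor with the tracked transform. A minimal sketch, assuming the tag transform is already delivered in Unreal world space after calibration (the function name is illustrative):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"

// Drive the virtual artefact actor with the active tag's pivot so the
// virtual mesh follows the real prop 1:1.
void UpdateArtefact(AActor* ArtefactActor, const FTransform& TagTransform)
{
    if (ArtefactActor)
    {
        ArtefactActor->SetActorTransform(TagTransform);
    }
}
```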

In the middle part, 4 rumble motors (from a PS4 controller) and an ESP32 microcontroller were installed. Because the drive voltage of such motors can be varied, you can design various vibration patterns. I also used the hand-tracking information (hand distance relative to each motor) to create an interactive mode where the vibration "moves" with your palm.
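Here is a simplified sketch of that palm-following vibration, assuming the ESP32 Arduino core's LEDC PWM API; the GPIO pins, PWM settings and distance threshold are placeholder values, and in the real setup the distances would arrive over the wireless/serial link.

```cpp
#include <Arduino.h>

const int   kMotorPins[4] = {25, 26, 27, 14}; // hypothetical driver GPIOs
const float kMaxDistCm    = 15.0f;            // beyond this the motor stays off

void setup() {
  for (int i = 0; i < 4; ++i) {
    ledcSetup(i, 200 /*Hz*/, 8 /*bit*/);      // one PWM channel per rumble motor
    ledcAttachPin(kMotorPins[i], i);
  }
}

// Map the palm distance to a PWM duty cycle: the closer the palm is to a motor,
// the stronger that motor vibrates, so the rumble "follows" the hand.
void updateMotors(const float distCm[4]) {
  for (int i = 0; i < 4; ++i) {
    float t = 1.0f - constrain(distCm[i] / kMaxDistCm, 0.0f, 1.0f);
    ledcWrite(i, (uint32_t)(t * 255));
  }
}

void loop() {
  // In the full system the four distances are received from Unreal each frame
  // and passed to updateMotors(); nothing to do in this standalone sketch.
}
```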


"Moving Platform"

Because we didn't want any fixed installations in the play area that users could fall over or bump into, we needed a solution to carry the artefact to a specific position at the right moment. Therefore, I built an autonomous moving platform, tracked by OptiTrack and completely motion-controlled by Unreal. I used stepper motors (because of their precision), 3D-printed custom wheels and attached 3 magnets on top, together with a 3D-printed tilt gimbal. During the drive, all 3 magnets are active. When the platform reaches its final position, the lower magnets turn off. If a player then grabs the artefact, it starts to tilt, and once a certain tilt threshold is exceeded (measured via OptiTrack), the last magnet holding the artefact finally turns off too. (The solution you see above is the third iteration/version, by the way.)
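The release logic boils down to a small state machine. Here is a simplified sketch in plain C++; the state names, tilt threshold and magnet callback are illustrative, not the production code that ran on the Unreal server and ESP32 bridge.

```cpp
// Release-logic sketch for the moving platform. Magnet commands would go out
// through the ESP32 bridge; the tilt angle comes from the OptiTrack gimbal.
enum class PlatformState { Driving, Parked, Released };

struct PlatformLogic {
  PlatformState state = PlatformState::Driving;
  static constexpr float kTiltReleaseDeg = 12.0f;   // hypothetical threshold

  // atTarget: platform reached its goal position; tiltDeg: measured gimbal tilt.
  void Update(bool atTarget, float tiltDeg,
              void (*setMagnets)(bool lower, bool top)) {
    switch (state) {
      case PlatformState::Driving:
        setMagnets(true, true);                     // everything locked while moving
        if (atTarget) state = PlatformState::Parked;
        break;
      case PlatformState::Parked:
        setMagnets(false, true);                    // lower magnets off, artefact still held
        if (tiltDeg > kTiltReleaseDeg)              // a player is pulling the artefact
          state = PlatformState::Released;
        break;
      case PlatformState::Released:
        setMagnets(false, false);                   // the last magnet lets go
        break;
    }
  }
};
```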


"Sliding Platform"

Photos by Oliver Koch

I made this sliding platform mainly for safety reasons. This system also operates completely autonomously. At some point, the user holding the artefact has to put it into a virtual console (in VR). We matched the virtual console with the physical magnet holder. A sensor then checks whether the artefact has been placed properly, and if so, the platform starts to move the artefact towards a safe position.
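A simplified sketch of that latch logic in plain C++; the sensor and motor helpers are stubs standing in for the real ESP32 I/O.

```cpp
// Sliding-platform latch: once the placement sensor confirms the artefact is
// seated in the holder, keep driving the slide until the safe end stop is hit.
static bool gRetracting = false;

bool placementSensorTriggered() { return false; }  // e.g. a limit/hall sensor (stub)
bool safePositionReached()      { return false; }  // end-stop switch (stub)
void driveTowardsSafePosition() { /* step the slide a little (stub) */ }

void tickSlidingPlatform() {
  if (!gRetracting && placementSensorTriggered()) {
    gRetracting = true;                            // latch the retract once
  }
  if (gRetracting && !safePositionReached()) {
    driveTowardsSafePosition();
  }
}
```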


"Custom HMD OptiTrack mounts"

First 3D printed tracking mount prototype for Vive Focus 3 (photo by Oliver Koch)


3D printed tracking prototype with custom gimbals for Quest2


Final tracking mount (right photo by Oliver Koch)

To bring all users into the same VR space, we needed to fuse their individual positions/orientations. You can buy such mounts directly from OptiTrack, but not for Meta HMDs. Therefore I developed a custom tracking mount (first for the Vive Focus 3, which we later replaced with the Quest 2 and then again with the Meta Quest Pro).

Technically, we used the positional data provided by the mount's active tag and combined it with the rotational data coming from the headset's own inside-out tracking.
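In code, that fusion is essentially: take the position from the active tag, rotate a fixed mount offset into world space, and keep the headset's rotation. A minimal UE4-style sketch, assuming both sources are already calibrated into the same world space; the function and parameter names are illustrative.

```cpp
#include "CoreMinimal.h"

// OptiTrackPos: world position of the active tag on the headset mount.
// HmdRot:       orientation from the headset's inside-out tracking.
// MountOffset:  rigid offset from the tag pivot to the HMD eye center (measured once).
FTransform FuseHeadPose(const FVector& OptiTrackPos,
                        const FQuat& HmdRot,
                        const FVector& MountOffset)
{
    // Rotate the fixed mount offset into world space, add it to the tag position,
    // and keep the rotation from the inside-out tracking.
    const FVector EyePos = OptiTrackPos + HmdRot.RotateVector(MountOffset);
    return FTransform(HmdRot, EyePos);
}
```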


"Custom tracking solutions"

Custom OptiTrack wrist tracker


Bluetooth ESP32 "thumb stick" I made for combining finger tracking with precise joystick input


2 early prototypes for a shoe tracker

We aimed for the best possible immersion without controllers that the user has to use for interaction. Meta does an excellent job with their hand/finger-tracking features, but it didn't give us the precision we needed, and the inside-out tracking drift accumulates over time. You can mostly ignore that during a "normal" VR experience, but not in our scenario. Also, with hand/finger tracking, hand positions get lost as soon as the hands aren't visible to the headset. So I developed custom wrist and foot trackers that gave me both: precise hand/foot positioning and solid tracking data no matter where the hands/feet are. After fusing that position with the inside-out tracking of the fingers, I could achieve an error of less than 1 cm, which was also needed to precisely touch things, like grabbing the artefact in VR and in reality.
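The core of that fusion is a simple rigid correction: shift the whole inside-out hand skeleton by the offset between the OptiTrack wrist tracker and the headset's own wrist estimate. A minimal UE4-style sketch; the names are illustrative, and both wrist positions are assumed to be in the same world space.

```cpp
#include "CoreMinimal.h"

// Rigidly translate every finger joint by the wrist offset, so the finger poses
// stay relative to each other but the hand lands where it physically is.
void CorrectHandDrift(const FVector& OptiTrackWristPos,
                      const FVector& InsideOutWristPos,
                      TArray<FVector>& FingerJointPositions)
{
    const FVector Correction = OptiTrackWristPos - InsideOutWristPos;
    for (FVector& Joint : FingerJointPositions)
    {
        Joint += Correction;
    }
}
```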

The player was supposed to be shown in 2 different ways: each player sees themselves as a first-person character with hands and feet, and sees the other players in third person as full hologram-like avatars. The locomotion was designed as a combination of third-person gameplay locomotion, with mocap data as the main driver, and the real-time tracking data of head, hands and feet. Both could be blended together using Unreal's awesome Control Rig. Here I must say that promising prototypes were finished, but the full implementation wasn't ready to play.
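Conceptually, the blend is a per-bone interpolation between the mocap-driven pose and the live-tracked pose, with the tracked data weighted heavily for head, hands and feet and the mocap data everywhere else. A minimal sketch using plain UE4 transform math (not the actual Control Rig graph):

```cpp
#include "CoreMinimal.h"

// Blend one bone between the mocap-driven pose and the live-tracked pose.
// TrackedWeight is chosen per bone: close to 1 for head/hands/feet, lower elsewhere.
FTransform BlendBone(const FTransform& MocapPose,
                     const FTransform& TrackedPose,
                     float TrackedWeight /*0..1*/)
{
    FTransform Result;
    Result.Blend(MocapPose, TrackedPose, TrackedWeight); // lerps translation, slerps rotation
    return Result;
}
```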

...


It was one of the best work experiences I've ever had, until something that had never happened to me before hit me (and the whole team) right in the face.

This whole project came to a sudden and unfortunate end (so shortly before the finish line) when our client had to file for bankruptcy. That was 6 months ago. I'm still heartbroken, to be honest, and it was (and still is) a hard lesson in life.

...


But ... I've gained a lot you can't buy.

I've learned so much ...

Met new and old friends on this project ...

And I'm quite grateful for all the freedom I had to grow in so many fields at the same time ...

THANK YOU!





