Unreal Engine Build: London '17
We were recently in the privileged position of supporting Epic Games and 200 leading innovators from key businesses in the media & entertainment, design & construction, healthcare, manufacturing and automotive industries at Unreal Engine Build in London.
This event, held at BAFTA 195 Piccadilly, showcased the use of the current Unreal Engine [UE4] across a growing number of enterprise use cases, highlighting the increasing adoption of one of the world's most capable real-time 3D platforms. The enterprise team at Epic have an impressive roster of big-brand clients delivering high-end VR solutions at a professional level, and it was great to be there representing the advanced use of Unreal in developing believable environments for visualising architectural design and for realistic end-user training.
The various presentations were held in the famous Princess Anne theatre, and provided a compelling insight into the different ways each of the represented businesses was defining its own journey with real-time production processes and virtual reality.
Our short presentation followed the opening keynote by Epic, where Enterprise division Director Simon Jones shared some of the year's highlights and stressed the growing significance of the professional sector to their business, alongside a well-established dominance in the gaming arena. We heard about various adjacent technologies making their way into enterprise applications, and some indication of new tools being developed to improve the production workflow across them. One such solution was the recently announced Datasmith workflow toolkit, which - although currently in beta - aims to provide design professionals across all sectors with a viable one-step solution for getting existing model formats directly into Unreal [more on this later].
The short presentation we delivered was a quick guided tour of a busy past year delivering a range of high-profile projects for big-name clients, using Unreal not only to provide cutting-edge VR environments but also to enable a more efficient production process for more familiar forms of CG imagery and digital media.
We talked briefly about how this approach helped when we had the opportunity to act as the client ourselves, developing a series of increasingly detailed environments to help inform key decisions around the relocation to our new headquarters.
Our presentation then allowed me to share detailed insights into how we approached some of our most high-profile projects to date - where the quality of visual production had to be matched by the UX/UI design, software back-end and interactive functionality across all manner of different output devices.
We concluded our session with a look inside the various aspects of the work completed with Tottenham Hotspur in producing their SPVRS [Stadium Project Virtual Reality Suite]. We were joined by Jamie Cassidy, a Senior Sales Executive from the club, who shared their view on how this unique VR facility had enhanced their visitor experience and, more specifically, how the numerous premium spaces we developed using Unreal Engine have helped them exceed the ambitious targets for the new stadium.
“Building a stadium for the future, we have to be forward thinking in everything we do. It was a brave decision - as we were way ahead of the curve - but it has proven to be a very good one as we are months ahead of our forecasts already.
A huge amount of credit goes to Soluis and Unreal Engine, who have provided incredible content and user experience.” - Jamie Cassidy
Testament to this is the fact that 'The H Club', one of the most prestigious areas in the new stadium, has the capacity to house 90 founding members, yet only a small handful of places remain available - somewhat unprecedented at this stage.
As a contrast to our session on architectural design visualisation, McLaren then shared some of their latest innovations in accelerating their design process with bespoke VR design tools. These custom-built tools allow their design team to sketch and, from there, sculpt forms in a fully immersive 3D application. In a bold and highly effective demonstration, their Design Director, Robert Melville, did this live on stage and in around 10 minutes had produced the basis of a new model with all of the main curves and surfaces defined. As you can imagine, this was hugely impressive, and it points to a likely future for the design of products and furniture, as well as form-based architecture.
One of the most innovative industrial uses shown during the day was the ocean engineering and subsea exploration platform developed by Abyssal. This is featured in Epic's latest case study and provides an effective means for ROV operators to navigate safely in low-visibility conditions when operating these units for subsea evaluations and maintenance. The video below provides a more detailed overview of this solution:
The afternoon session focused more on the use of Unreal Engine in the media & entertainment sectors, which was great in validating some of our own production choices and clearly indicated the direction of travel being set at the highest end of the production spectrum. First up were The Mill, who talked in detail about their BlackBird solution - effectively an augmented reality car rig that allows them to direct the filming of any car in real-time by superimposing a fully photo-realistic model of the bodywork and interiors. This has plenty of applications elsewhere, but it was clear to see how this unique combination of Unreal and AR enables seamless filming of cars not yet in mass production to support essential marketing campaigns.
One of my personal highlights of the afternoon was the session hosted by ILM on their growing reliance on Unreal Engine as part of the production process for major Hollywood blockbusters. This included detailed examples of how Unreal has increasingly been used to allow directors to evaluate a series of different shots quickly, with the main examples drawn from last year's Rogue One from the Star Wars franchise. The revelatory moment was learning that key final scenes had been rendered directly from the engine - a major turning point, and one that confirms what we have believed for some time: the merits of moving from offline processing to real-time rendering for this level and quality of production.
The event concluded with a great evening of networking in the function suite, with a bar conveniently constructed in the centre and our stand forming part of a small group of hugely varied VR demonstrations around the four corners of the room. It was an enthusiastic crowd full of belief, and it was great for us to hear the different perspectives of those in other industries - where many of the challenges are similar to those we face in keeping up with expectations on quality of output and speed of production.
We also had the chance to discuss our early findings from our use of the Datasmith toolkit, where we evaluated how this method of data transfer might help in repurposing existing 3ds Max scenes previously prepared for producing more traditional forms of CG views. Although Datasmith is at an early stage of development, the signs are promising, and the following comparison views show the difference between what was produced natively in 3ds Max and the same scenes exported to UE4 via Datasmith.
L-R: Original 3ds Max render -> 3ds Max exported to UE4 via Datasmith
There are some distinct visual differences as you'll see in each of these examples, but we've avoided editing the outputs to ensure a fair assessment.
The biggest difference between this approach and what we normally do when developing functional high-end environments is that we are better able to control all of the various components that constitute a high-performance VR experience. Our talented team of real-time artists have specialised skills in producing correctly optimised model geometry, and they apply the necessary level of control when mapping textures and lighting the environments - something that is at best exceedingly difficult to automate, given the level of interpretation needed to produce a suitable basis for high performance. This is also essential to ensure that the experience is not only as realistic as it can be in terms of visual quality, but also that the scene runs smoothly at suitably high frame rates, which are as much a part of the user experience as creating a convincing space for users to inhabit. We'll be tracking this one very closely indeed and hope it will encourage more of our own clients to consider the merits of a real-time visualisation pipeline based on UE4.
Overall the event was a great blend of enjoyment, education and inspiration - and I'd like to reiterate our huge thanks to the Epic Enterprise team for both their hospitality and a seamlessly professional event. We've certainly come away with plenty of new ideas, along with a renewed belief in our existing strategy for blending real-time production processes across our range of digital media outputs. Most of the industries represented work at a similarly fast pace (arguably faster in some cases), so seeing the leaders in these professions share their latest innovations is invaluable for the rest of us servicing similar markets from a largely creative position.