Unity vs. Unreal for XR Development

The Unity vs. Unreal battle for XR development rages on. Where do you stand?

Get comfy, grab your popcorn, and welcome Alex Coulombe under the spotlight to share his opinion. And don't forget the “Product Spotlight,” where you can access some of his freely available APKs and experience his motion capture firsthand on a Meta Quest headset.

Subscribe to the newsletter so you don't miss the next edition.

Interview with Alex Coulombe

In this interview, Alex Coulombe offers his perspective on the ongoing Unity vs. Unreal debate in XR development. He explains what it takes to achieve high visual fidelity and discusses the transformative role of VR in education and design. Alex’s extensive experience makes this a valuable read for anyone who wants to understand how to build digital worlds and truly immersive experiences.

What's your take on Unity vs. Unreal for XR development?

Alex Coulombe: My journey began with Unity around 2010 due to its accessibility, especially as an architect with minimal coding experience. However, I gradually transitioned to Unreal Engine for projects demanding higher polish and photorealism. Unreal, with tools like Datasmith, simplifies importing large architectural files, delivering superior out-of-the-box quality, albeit requiring more powerful computers. Unity is still preferable for targeting multiple platforms, but for visually demanding projects, our extensive Unreal library offers an advantage. The choice between Unity and Unreal depends on the specific requirements of each project.
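
For readers curious what the Datasmith workflow Alex mentions can look like, here is a minimal sketch using Unreal's Python editor scripting. It assumes the Datasmith Importer and Python Editor Script plugins are enabled; the file path and destination folder are placeholders, not anything from the interview.

```python
import unreal

# Hypothetical .udatasmith file exported from a CAD/BIM tool (e.g. Revit or SketchUp).
udatasmith_file = "C:/Exports/TheaterModel.udatasmith"
destination_folder = "/Game/Architecture/TheaterModel"

# Parse the Datasmith file into an in-memory scene representation.
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(udatasmith_file)
if scene is None:
    raise RuntimeError("Failed to load the .udatasmith file")

# Import the full scene (geometry, materials, hierarchy) into the project's content folder.
scene.import_scene(destination_folder)
```

The same import can be done through the editor UI; scripting it mainly pays off when large architectural files need to be re-imported repeatedly as the source model changes.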

What is the “cost” of high visual fidelity in Unreal?

Alex Coulombe: Attaining high fidelity in Unreal Engine does involve trade-offs. Post-processing features in Unreal like tone mapping and bloom enhance cinematic quality but are resource-intensive. You must optimize by balancing detailed materials with lower polygon counts and managing draw calls effectively. With Unreal Engine 5, innovations like Nanite and Lumen have eased optimization and lighting challenges, yet they demand high-end hardware and don't currently support mobile devices. In essence, high fidelity in Unreal comes with a need for technical optimization and powerful computing resources.
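
As a rough illustration of the optimization levers mentioned above, the sketch below uses Unreal's Python editor scripting to lower screen percentage and post-processing cost and to turn on the engine's built-in stats for frame timings and draw calls. The specific values are illustrative only, and the snippet assumes a UE 4.26+/UE5 editor with the Python Editor Script Plugin enabled.

```python
import unreal

def dial_back_fidelity():
    """Trade some visual fidelity for performance and surface basic profiling stats."""
    world = unreal.EditorLevelLibrary.get_editor_world()
    for command in [
        "r.ScreenPercentage 75",    # render at 75% of the output resolution
        "sg.PostProcessQuality 2",  # post-processing scalability group (0 = low .. 3 = epic)
        "r.BloomQuality 3",         # lower bloom cost; 0 disables it entirely
        "stat unit",                # game / draw / GPU thread timings
        "stat scenerendering",      # scene rendering stats, including draw call counts
    ]:
        unreal.SystemLibrary.execute_console_command(world, command)

dial_back_fidelity()
```

The same console variables can be typed directly into the in-editor console; wrapping them in a script just makes it easy to flip between a "showcase" and a "profiling" preset.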

Are there templates in Unreal Engine similar to Unity's SDKs?

Alex Coulombe: Yes, Unreal Engine provides valuable templates and SDKs. The VR Template facilitates basic VR functionalities and can be directly deployed to various platforms. The lesser-known Collaborative Viewer Template is ideal for multi-user interactive experiences. It supports both VR and desktop, offers different navigation modalities, and uniquely features out-of-the-box multiplayer capability. These templates are excellent starting points for diverse interactive experiences in Unreal Engine.

How is MetaHuman evolving in Unreal Engine?

Alex Coulombe: MetaHuman is advancing but is still emerging in commercial projects. It enables the creation of highly realistic characters that can be easily imported into Unreal Engine projects. Innovations like Mesh to MetaHuman convert any 3D head model to a MetaHuman format, while MetaHuman Animator, using the iPhone's depth sensor, captures intricate facial details. Although not yet ubiquitous in mainstream projects, these advancements are steadily expanding MetaHuman’s potential in gaming, film production, and more.

As an 'XR Dad', how do you see VR influencing your kids' learning?

Alex Coulombe: Introducing my kids to VR has significantly enhanced their problem-solving and spatial abilities. They’ve been adept at complex VR puzzles from a young age, something I attribute to their early exposure to VR. However, it’s important to use VR actively and educationally, not as passive entertainment. Interactive, curated VR experiences can profoundly benefit children, fostering creativity, imagination, and advanced cognitive skills.

What are your suggestions for learning Unreal Engine?

Alex Coulombe: For learning Unreal Engine, a blend of online resources, structured courses, and hands-on experience works best. Online tutorials and YouTube content creators offer accessible lessons. Live instruction through authorized training centers like Agile Lens provides structured, certified courses, beneficial for those who prefer interactive learning. Community engagement through forums, webinars, and live events enhances learning and networking. Finding a method that resonates with your learning style is key, be it through self-led online resources or structured courses.

What common mistakes do you observe in immersive experience development?

Alex Coulombe: A common error in immersive experience development is not fully utilizing the medium's unique qualities. Often, creators try to adapt traditional media concepts to VR/AR, which doesn't always translate well. Each medium has distinct strengths, and understanding them is crucial for creating compelling experiences. Active design in VR – making real-time adjustments in a spatial context – is often overlooked. Also, VR should be viewed not just as a tool for visualizing real-world projects but as a platform for unique, standalone experiences that explore creative possibilities beyond physical world constraints.

What is the current state of VR/AR hardware development?

Alex Coulombe: The VR/AR hardware landscape is evolving. Meta has advanced in making VR accessible, but the lack of competition has led to slower technical progress. Apple's entry into the market, especially with its focus on high-quality tracking, will likely drive improvements and competition. The Apple Vision Pro, despite its current high price, indicates a shift towards sophisticated, user-friendly VR/AR technologies. This emerging competition is crucial for spurring innovation and enhancing user experience, making these technologies more widely accessible.

Do you recommend investing in mixed reality development?

Alex Coulombe: Investing in mixed reality requires careful consideration. My personal experiences with mixed reality, especially with pass-through technology, have been less than optimal due to issues like visual distortion and discomfort. However, as hardware evolves, these challenges may diminish. Developers should assess the current limitations against potential benefits. If mixed reality aligns with project goals and offers clear advantages or innovations, investment can be fruitful.

What’s the role of VR in the design process?

Alex Coulombe: VR's role in design transcends mere visualization. It should be an active tool throughout the design process, allowing designers to make spatially informed changes in real time. Design decisions should happen within VR, adapting to various stages of design development. VR is not just a pre-visualization tool for the real world; it's a medium where innovative, standalone virtual experiences can be created, experiences unattainable or impractical in the physical world. This perspective opens up vast creative possibilities, leveraging VR's unique spatial and interactive capabilities.

Any final thoughts you’d like to share?

Alex Coulombe: I encourage exploring our production of "A Christmas Carol" in VR, an annual event that evolves with new features and technological experiments each year. It's a unique VR experience that pushes creative boundaries, and we're eager for audiences to see the advancements we've made this year. Also, check out Ink and Paint for a fascinating project I'm currently involved in. These projects reflect the possibilities and creative applications of VR and Unreal Engine in storytelling and immersive experiences.

Watch the full interview on your platform of choice.

Product Spotlight: Alex’s Secret Quest 3 Apps

Alex’s passion for theater and XR is certainly no secret: he is a strong ambassador for how XR can support the architectural design of theaters and the shows staged within them, as well as for how performances can be “ported” to this new digital reality.

Check out these apps, which you can run directly on your Quest to experience his motion capture in 3D and the entire Brockman Hall reconstructed in VR.

To learn more about this project directly from him, check out this video.

That’s all for today.

See you next week!

Subscribe to the newsletter so you don't miss the next edition.


Klemen Verbovsek

Digital Transformation Expert in Manufacturing, Services, Logistics, Warehouse, Maintenance | Consultancy | Project Manager | Business Development | Marketing

6 months ago

When the content is valuable I am happy to write some good words, Gabriele Romagnoli. To everyone: I highly suggest reading it! My background is a bit different. I remember my time at Immersed VR in Toronto back in 2015, but I currently work in the ERP digital transformation industry and use UE for virtual production for our marketing needs. Alex Coulombe, I am curious what can be done in UE; below is what I'm eager to learn and invest in.
1. Industrial AR and VR: since the Windows Mixed Reality plugin is no longer available, is there something else for creating AR projects for headsets (projects using GPS and geolocation) and trainings like Varjo offers for the airplane industry?
2. I am trying to imagine a MetaHuman tour guide for a family heritage company's sales experience, with the final space being a showroom of products connected to e-commerce.
3. Imagining a showroom where a salesperson can see that a potential lead is currently in this immersive space, then get in and start the approach as a MetaHuman. Maybe making a digital twin, or using the VP green screen with a Blackmagic DeckLink card to communicate and be seen as keyed video in real time.
All the best!

The Viewalk game prototype was originally built in Unity. It builds on top of ARKit on iOS and ARCore on Android to track the position, rotation, and movement of the player to create a first-person, "handheld VR"-like gaming experience. We also had spatial VoIP, although that effectively required headphones to work well. Before moving to a closed beta we ported the platform to Unreal Engine (4.27.2). We believed this would be the better choice in terms of long-term engine development and from a UGC and content-creator point of view. Personally, as a game and level designer, I loved the change. This shift unintentionally made it harder to create an Android build (we still do not have one). It comes down to how deeply we have to integrate ARCore and ARKit into the engine and operating systems respectively, because we are not using either toolkit in the normal fashion (normal being overlaying 3D graphics onto a real space/video feed). ARBlueprint did not include all the functions we needed, so C++ programming was required. I am not a programmer, though, so I can't speak to the details. Looking back, we probably should have stuck with the rough, Unity-based prototype, but hindsight is always 20/20.

Richard Perrine, M.Ed Secondary Science

Creative STEAM & Arduino|Virtual Reality Content Creator/Educator/Facilitator

6 months ago

We decided on Unity for our students. It has much more community support, it uses C# (which is similar to Java, a common language taught in high schools), and it just works. I am dabbling in Unreal, but for instructional purposes for our kids, Unity makes more sense right now. Unreal can certainly create more amazing visuals, but that is not a strong enough reason to make a change.

Kate Balabanovich

Python Developer (switcher from Ruby) | 3+ years of experience in Software Engineering

6 months ago

>"VR should be viewed not just as a tool for visualizing real-world projects but as a platform for unique, standalone experiences that explore creative possibilities beyond physical world constraints". I completely agree with it. Somehow, VR developers have forgotten that this technology can bring a completely new experience.

Timothy Partee

Senior Full-Stack Software Engineer/Manager | C#, Unity, C++, JS/TS, PHP, Python, WPF, RDBMS | Ex MSGS SquareEnix EA Wizards Razer Logi zSpace | VR Developer, Enthusiast and Evangelist

6 months ago

The questions asked by the interviewer seem intentionally biased towards Unreal rather than attempting to remain unbiased between Unreal and Unity. Knowing both engines and how each handles VR (and Alexandre mentioned this in his reply), it really depends on what you're trying to do and the needs of your project. In a nutshell, Unity has the advantage with its XRI SDK paired with OpenXR, able to run across virtually any HMD/tracker hardware on the market right now, while Unreal has better templates and render pipelines but worse cross-platform XR support. The render pipeline gaps are also rapidly shrinking as Unity has poured vast resources into catching up with Unreal on render engine quality and performance. Undoubtedly Unreal is investing in building a better cross-platform XR library, because Epic has scads of money and is badly behind Unity on that front.
