Apple's New Headset: Reaction from an Old-Ass XR Developer
A reaction to Apple's new Vision Pro (#AVP) headset from someone who's been developing #XR for the #AEC industry since the #Oculus CV1 was shiny and new.
While so much of what Apple showed at today's announcement was exactly what we've all been predicting for some time, it was great to see things confirmed, clarified, and brought into better focus. I'm struggling to organize these thoughts, so I'm following some conceptual groupings that will hopefully translate.
What the device isn't: It isn't a Quest killer. It isn't a Magic Leap killer.
AVP is to the Meta #quest3 as a MacBook is to a PS5. Both are essentially computers that can hypothetically do the same kinds of things, but they're designed for totally different use cases and can fully coexist in the ecosystem. If you want a gaming device, the games are on Quest, and the price point makes sense for a game console that lives on your face. Apple didn't show a single VR game as far as I saw. No Beat Saber or anything!
Magic Leap 2, the most advanced AR headset to date, is also designed to solve totally different problems than AVP. It's ready for frontline workers with its true transparent display. It's spatial first, with all the focus on use cases that exist fully in the physical world. They're not interested in putting a spreadsheet in your eyeballs, because you have a computer for that. Spatial-first transparent AR is a much harder problem to solve than VR, and even harder than pass-through AR like AVP's. But in the long term, I see it as far more important, even revolutionary, to how humans interact with the digital world.
What AVP is: The MAYA that head-mounted XR has been waiting for.
AVP, on the other hand, seems to be designed around the question:
What if your existing Apple workflows could be pushed out of the device, into your surroundings?
And then the follow-up question:
And what if it was actually good at that?
And that's the problem this solves. It leverages Apple's unique advantages: the mountains of existing content, the ecosystem buy-in, the connectivity. And then it follows Apple's standard playbook, applying relentless attention to the small subtleties that make it feel better to use. My Twitter feed is absolutely overloaded with people pointing out that basically every use case or feature here already exists on other platforms. They mock Apple for claiming to invent existing tech. But that's how Apple does it. Every one of those features has struggled against some friction or barrier that has held it back from enthusiastic adoption.
Virtual screens are old hat. I used virtual screens on my Vive Pre. But I didn't keep using them, and there are lots of reasons: headset comfort, display resolution, experience isolation. For all these reasons, I tiptoe back to my familiar monitors. Each of these issues has been solved in one way or another by amazing and brilliant engineers. There are incredible new headsets that are much more lightweight and comfortable, but might not have the same fidelity. Or they do! But the experience is still very isolated by default. But there are great multiuser tools like Pluto! But they require buy-in to an unfamiliar platform.
Seeing a pattern? While industry pioneers churn through fantastic, innovative, but rough new inventions, Apple works quietly in the background, finessing the most refined version of the most basic concepts. And they introduce it slowly, in the most accessible way. And like the first Apple Watch and the first iPhone before it, this is still just the starter.
The ground is littered with the bones of VR meeting apps that required everyone to buy into one app disconnected from the rest of their workflow. On this one, you join a normal-ass video call with normal-ass people on their normal-ass computers, but you have some superpowers.
This is MAYA: Most Advanced Yet Acceptable. The concept comes from industrial designer Raymond Loewy, who used it to describe how far ahead of public taste a new design can get before people reject it. It's been picked up fairly widely in tech to help inform how new technology can be successfully adopted, and it's especially useful for new interaction paradigms like XR.
Leveraging 2D-native content.
Let's return to the apparent driving design thesis of the new platform (being as un-new as it can manage). Since Apple already has SO MUCH 2D content, they asked how they could make that matter. That's why there's such an emphasis on virtual displays and feeds of 2D content surrounding you. And it's a good use case if you do it well. This year at AWE, the "AR Laptop" won best in show. People want it, but so much has to be done right for it to work, and that's what Apple has focused on.
The display quality is huge huge huge. Critical. They really focused on multiple high-res virtual displays that stay clear from a distance or at any angle, because a large virtual display means nothing if you can't see it as well as a physical display. Then there's the variable presence of the room, the ability to leave the headset on your face while talking to someone outside it, and that weird holographic eye display on the outside. It's all there to let that unique advantage shine. Like how transparency mode lets my AirPods stay in my ears, this is something previous VR never had, and it probably makes a larger difference to willingness to use the device than we want to admit.
They also include options out of the gate to play existing games on this device with NO developer action. That single choice will give this device more content out of the box than other platforms could even dream of. If SteamVR had done more with this concept, I think it would have been an enormously beneficial move. The thing I'm waiting to see is how easy they make it for developers to adapt their existing games to ADD value on AVP compared with desktop or handheld versions. A good example of this is how Tilt Five handles adaptation to its platform: drop in the mat prefab to declare how the AR-ness is handled, and hit go, basically. Apple could make it at least that easy, and I hope they do.
Imagine any game with a 3D camera being rendered exactly as you expect in a virtual window, but with the camera made stereo so the window renders with depth. It could be magical.
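To make that stereo-window idea concrete, here's a minimal sketch of the core math (plain Python with hypothetical names, not any real engine or visionOS API): take the game's existing single camera and derive two eye positions by offsetting along the camera's right vector by half a typical interpupillary distance (~63 mm), then render the scene once per eye to get the stereo pair.

```python
import math

def stereo_eye_positions(cam_pos, cam_right, ipd=0.063):
    """Derive left/right eye positions for a stereo render of a game's
    single 3D camera: normalize the camera's right vector, then shift
    each eye half the interpupillary distance (IPD) along it. Rendering
    once per eye produces the pair a depth-capable virtual window needs.
    All names here are illustrative, not an actual engine API."""
    norm = math.sqrt(sum(c * c for c in cam_right))
    right = [c / norm for c in cam_right]
    half = ipd / 2.0
    left_eye = [p - r * half for p, r in zip(cam_pos, right)]
    right_eye = [p + r * half for p, r in zip(cam_pos, right)]
    return left_eye, right_eye

# A camera at standing eye height, with +x as its right vector:
left, right = stereo_eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
# left is shifted -0.0315 m along x, right +0.0315 m
```

A real adaptation would also need per-eye projection matrices and a convergence choice for the window plane, but the developer-facing part could plausibly stay this simple, which is the Tilt Five-style "drop in and go" bar I'd hope Apple clears.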
Spatial computing. Location-based. Meatspace-native. Whatever you want to call it. Where is it?
Conspicuously absent (to me, at least) is any claim of spatial understanding. They talked about remote sharing and interaction with humans in your environment (the eye display, etc.), but there was absolutely no mention of local multiuser. And no mention of environmental understanding or augmentation.
They didn't even show room-scale VR games!
It seems clear to me that Apple has not adequately solved the very hard problems of spatial alignment, persistence, and environmental understanding. At least not to their satisfaction. And I get it: these are very hard problems that I've spent years working on. The good news is that, with a good Unity collaboration, it seems likely the tools and interactions we're used to for developing on VR platforms should generally work here. But we don't know much about 6DoF controls, or room meshing, or semantic understanding, or ANYTHING.
And more interesting to me, there was no mention of AR location persistence. We have to assume they're working on something, and that it's more advanced than the POS currently running in ARKit. But for my purposes, I can only assume it's not up to the standard of what we need for Argyle, and what the industry needs at large. We'll keep working on pushing those boundaries, then, and I'll be glad they seem to be giving developers enough control that we can leverage the tech we've built on this new platform.
So, what?
I see this as a good thing for the industry at large, and for basically everyone in it. The key is legitimacy. Most laypeople won't be running out to buy this thing immediately, and most devs and startups like mine won't immediately turn all their engineering resources to Apple Vision. But this is pushing XR back into the mainstream conversation in a way it maybe never has been before. VC funding will start to flow back into XR, and we'll get a new influx of developer talent as young devs more broadly believe this to be a meaningful part of our tech future.
I don't hate it.