24 Augmented Reality Trends to Keep an Eye on for 2024


For over a decade now, I have started my year by making a list of trends I will watch in the augmented reality and wearable technology space. The following are the 24 AR trends I will be keeping an eye on for 2024. These are not predictions but major areas of focus that I believe will dominate the AR space this year.

Grab a cup of coffee and get comfy; this is a long read!

This post is best enjoyed on LinkedIn, so if you just received this in your inbox, hit the "Open in LinkedIn" button above for a more optimal reading experience.


Industry

#1 Spatial computing knocks the Hype Cycle crown off GenAI as it takes its place at the Peak of Inflated Expectations

Last year, GenAI knocked the Metaverse down the Gartner Hype Cycle and took its place at the “Peak of Inflated Expectations.” But GenAI’s time is up! Plagued by copyright and regulatory issues, growing mistrust and social concerns, and prompt exhaustion, GenAI is set to move down through the cycle into the “Trough of Disillusionment,” where it won’t go away but will instead get to work, just like the Metaverse did. While we will continue to see innovation around generative media and LLM apps, most of the activity in this space will be around enterprise use of GenAI, including LLM lakes, private co-pilot solutions, and a focus on enhanced security.

Taking the Hype Cycle crown this year will be Spatial Computing. This will be driven mainly by the debut of Apple’s spatial computer, Vision Pro, and the emerging mixed reality headset category, which now includes all the major tech players. But it will be reinforced by significant advancements in the technologies enabling computers to perceive, interact with, and navigate 3D space, all underpinned by AI and ML.

This will be a big year for Spatial Computing as it grows in awareness beyond the industry that has been busy working within it, becomes the new media darling, and enters the mainstream conversation. But, just like GenAI and Metaverse before it, 2024 will mark just the start of this transformative set of technologies, and it too will slip down the trough to get to work shortly after its peak.

As an aside, I would have guessed that Mixed Reality would be the technology next in line for the Hype Cycle crown, but when Apple debuted Vision Pro as a spatial computer and avoided the terms “virtual reality” and “mixed reality,” it became clear that this was not going to be the case. Apple has a way of leading the language in a category, so my money this year is on the rise of Spatial Computing. And if Tim Cook himself says, “The era of spatial computing has arrived,” I believe it.

#2 Developer and investor activity in XR surges as mixed reality headsets present the next blue ocean opportunity

As the mixed reality headset category comes together, it presents a new opportunity for developers to create applications that demonstrate the value of these devices. The app store is a proven place for developers to win big, as we saw in the early iPhone days. However, the blue ocean opportunity for smartphone applications has subsided as the catalog has become quite crowded, and users have hit app download fatigue. The new blue ocean opportunity for developers is with mixed reality headsets, especially now that the major tech giants are all in on this device category. While the user base may be smaller than mobile's, these users will be craving quality content, especially content with sticky use cases that justify the device's price and encourage them to put it on repeatedly. As the catalog of VR and MR apps in these brand-new headset app stores is still relatively small, the time is now to launch an app and capitalize on an eager audience with very little competition. MR headsets also present a new opportunity for VR apps, which may have grown stale or been too early, as they can choose to reemerge with the use of the new mixed reality capabilities.

The return of developers to AR and VR will, in turn, attract investors back to XR. I expect we will see a surge in funding and M&A activity in startups and content studios very reminiscent of 2016–2017, when the first consumer-ready Oculus Rift shipped and Apple and Google launched their AR platforms.


Hardware

#3 Virtual Reality HMDs are dead, long live Mixed Reality headsets

2024 will be remembered as the year we said goodbye to virtual reality head-mounted displays. Taking their place are mixed reality headsets capable of both virtual reality and augmented reality. This shift began in 2023 with the launch of Meta Quest 3 and the debut of Apple’s Vision Pro. But the category of mixed reality headsets will really start to come together in 2024 when Apple finally ships its device in February, and we start to see other tech giants make their headsets available, including those expected by Samsung/Google, Bytedance, and Oppo as well as devices from up-and-coming players like Xreal.

Mixed reality may prove to be the missing piece needed to accelerate the slow but steady adoption of VR devices. Giving users the ability to see their space and the people around them while wearing the headset not only opens new types of applications but also addresses one of the biggest concerns about VR—isolation. Something as small as starting your headset experience in MR makes the overall experience feel less claustrophobic, and being able to see people in the room with you makes the headset experience more social and safe.

While this will be a big year for mixed reality, it is still early days for this category. Today’s mixed reality category reminds me very much of the early days of the PC: clunky and expensive, but capable enough to enter the homes of early adopters with enough use cases to enrich the lives of many family members. Like the PC, we will continue to see this device category innovate, especially with improvements in comfort and design, display and visual fidelity, and interaction and tracking. We will also see changes in cost and a growing list of applications, which will aid in wider adoption.

We may also get a sneak peek of what’s to come this year as we learn details and see a ton of rumors surface about the successors to Meta Quest 3 and Vision Pro. The focus for Quest 4 will most likely be on eye-tracking and resolution to compete with Apple, while the focus for Vision Pro 2 will be all about price to better compete with Meta.

#4 Natural interactions using voice, gaze, and gestures become the new pinch-and-zoom, tap, and swipe

A brand-new device form factor demands a new way for users to interact with it. With the PC, it was the mouse; with the laptop, it was the trackpad; and with the smartphone, it was on-screen gestures such as pinch-and-zoom, tap, and swipe. Headsets are also ushering in a brand new way to interact with our computers, making use of the device's sensors to digitize our eyes, hands, and voice so we can interact with virtual content. While eye-tracking, hand-tracking, and voice control are not new to the headset, this year we will see a greater shift away from controllers and virtual keyboards toward these input methods.

Setting the tone is Apple’s Vision Pro, which debuted without a controller and has powerful sensors designed to pick up on hand and eye movements, enabling an interaction experience that many early reviewers have described as feeling like “magic.” Apple’s emphasis on a controller-less user experience will challenge app developers and designers in ways similar to the early iPhone days, when they were asked to design software for a device that was literally just a pane of touch-sensitive glass. While Meta Quest 3 did not ship with eye-tracking, Meta did make great strides to update its hand-tracking capabilities last year, and I expect that we will see future Quest devices bring a user's eyes into the mix, especially as Meta is hinting that brain-sensing with EMG is on the horizon.
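For a concrete sense of what controller-free input looks like to a developer, here is a minimal sketch of pinch detection using the browser-based WebXR Hand Input API. It illustrates the general technique only; it is not the native hand-tracking SDKs Apple or Meta ship, and the 2 cm threshold is an arbitrary assumption.

```js
// Rough pinch detection with the WebXR Hand Input API: measure the distance
// between the thumb-tip and index-finger-tip joints each frame.
function isPinching(frame, inputSource, referenceSpace, threshold = 0.02) {
  const hand = inputSource.hand;
  if (!hand) return false; // a controller, or hand tracking not granted
  const thumb = frame.getJointPose(hand.get('thumb-tip'), referenceSpace);
  const index = frame.getJointPose(hand.get('index-finger-tip'), referenceSpace);
  if (!thumb || !index) return false; // joints not tracked this frame
  const dx = thumb.transform.position.x - index.transform.position.x;
  const dy = thumb.transform.position.y - index.transform.position.y;
  const dz = thumb.transform.position.z - index.transform.position.z;
  return Math.hypot(dx, dy, dz) < threshold; // fingertips ~2 cm apart counts as a pinch
}

// Inside an XR render loop for a session requested with the 'hand-tracking' feature:
// for (const source of session.inputSources) {
//   if (isPinching(frame, source, refSpace)) { /* treat it like a tap or click */ }
// }
```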

I expect this year to surface other devices that make use of our eyes and hands, along with software updates that improve this tracking. Be on the lookout also for a new interaction paradigm forming around eyes, hands, and voice as wearables move the way we interact with the digital world away from physical input methods.

#5 Multi-modal AI triggers a wearable tech renaissance fueled by virtual assistants that want to make sense of the world

The wacky and weird world of wearable technology, which started putting sensors on various parts of our bodies over a decade ago, will see a return this year. While iterative improvements in hardware components can now be used to create smaller, lighter, and more wearable form factors, the main driver for their return is AI. The GenAI boom of 2023 has ushered in powerful multi-modal LLMs, an advanced type of AI that can understand and generate not just text but also other types of data, such as images, audio, and possibly even video. "Multi-modal" refers to the model's ability to process and relate information across these different modes or formats. The development of multi-modal LLMs is a significant step forward in AI, opening up new possibilities for more intuitive and comprehensive AI systems that can better understand and interact with the world in ways similar to humans. Wearables play a major role in this leap as they equip the AI system with eyes and ears through sensors such as cameras and microphones, which make our experience of interacting with these systems much smarter and more contextual. In addition, the output from these wearables, such as spatial audio systems, projectors, and other displays, gives the AI system a means to communicate back to us without the use of typical screens, which is less intrusive and distracting.
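To make the camera-plus-assistant loop concrete, here is a hedged sketch of a wearable companion app sending a captured frame and a spoken question to a multi-modal model. It assumes an OpenAI-style vision-capable chat endpoint as it existed in late 2023; the model name, endpoint, and payload shape are illustrative, not any wearable vendor's actual SDK.

```js
// Illustrative only: ask a multi-modal model about what the wearable's camera sees.
// Assumes an OpenAI-style chat completions endpoint with image support.
async function askAssistantAboutFrame(jpegBase64, question, apiKey) {
  const response = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: 'gpt-4-vision-preview', // placeholder; any vision-capable model works
      messages: [
        {
          role: 'user',
          content: [
            { type: 'text', text: question }, // e.g. "What outfit goes with this shirt?"
            { type: 'image_url', image_url: { url: `data:image/jpeg;base64,${jpegBase64}` } },
          ],
        },
      ],
      max_tokens: 200,
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content; // answer to be spoken back through the earpiece
}
```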

We started to see these AI-enabled wearables bubble up at the tail end of 2023, with the debut of Pendant, Humane’s Ai Pin, and the next-generation Meta Ray-Ban glasses. All of these devices center their value around AI. We caught a glimpse of just how powerful the combination of smarter virtual assistants and wearables can be with Meta Ray-Ban’s roll-out of its multi-modal AI feature to select users, in which Zuckerberg used the eyewear to ask Meta AI to help put together an outfit based on the shirt he was looking at.

I expect that this year, we will see AI-enabled wearable devices launch, leveraging models from the likes of OpenAI and Anthropic or smarter versions of Alexa, Google Assistant, and Siri. We may also see new applications and updates to existing wearables, like the Apple Watch, which turns ten this year; these may make use of smaller, more efficient LLMs running on the smartphone, offering enhanced privacy and offline capabilities.

#6 The smartphone gets a new role as a wearable tech sidekick

Every year, I include a trend that highlights the smartphone becoming an even more powerful augmented reality machine because of iterative upgrade cycles. Annually, flagship phones get equipped with better chips, displays, cameras, batteries, and more. This will be true again this year. But in addition to the latest phones enabling a more optimized mobile AR experience, the device that has been the star of our digital lives for over a decade is officially getting a new supporting role as a wearable tech companion. This year will accelerate the smartphone’s eventual transition into a device we keep in our pockets, where it will enable the wearable devices that have already started to infiltrate our lives. Smartwatches and earbuds have already begun to change our relationship with our phones, but the limits of these devices make them more of a companion to our phones rather than the other way around. As AI-enabled wearables grow in adoption, including the growing offering of smart glasses that depend on the phone, this dynamic will change. With smarter virtual assistants in our ears and eventually displays in our eyes, we will find less and less reason to pull out our phones, which will instead become more useful as a computing hub.

But we are also seeing the smartphone take on another supporting role for wearable technology, one that makes use of its mobile nature. Indoor-centric mixed reality devices will rely on smartphones to perform tasks outside of the environments they are anchored in. This will include capturing spatial media, like the relationship between the iPhone 15 and Vision Pro today. It will also include extending headset applications beyond the headset, enabling use by non-headset users in the room or unlocking related use cases outside.

#7 Smart glasses that mirror our smartphone screen are the new “Google Cardboard”

2023 saw some positive signals for smart glasses that mirror your smartphone screen. In fact, wearable displays that offer a larger screen and a more comfortable heads-up position saw six-digit adoption numbers last year, according to Xreal. We may see even further uptake this year as consumers who are curious about the Vision Pro and Meta Quest 3 look for cheaper alternatives to get a taste of this next wave of computing. While this will be a win for getting consumers to wear tech on their faces, it could become a “Google Cardboard”-like moment for mixed reality if these consumers are truly expecting to get a Vision Pro-like experience. Cardboard was a cheap alternative for VR back in 2016, and while it provided a VR-like experience, it paled in comparison to what was possible with higher-end machines. It arguably both raised awareness and democratized VR while at the same time setting it back by under-delivering, given the mainstream misunderstanding that not all headsets are created equal. That isn’t to say that video glasses don’t have value on their own, and certainly, users who are interested in the monitor extension use case may be satisfied with the device. These devices will also benefit from a resurgence of 360-degree and 3D content that is sure to be readied for Apple and Meta devices.

#8 Spatial computing displays show the future is not just headsets

While most of the spatial computing attention this year will be on headsets, we will also see activity from spatial computing displays that don’t require eyewear. From interactive billboards and magic mirrors to holographic displays and in-vehicle heads-up systems, we will start to see how this shift in computing from 2D to 3D is much more than wearable technology. I expect to see devices like the Looking Glass Go embody AI, newer vehicles announced this year that boast AR HUDs as luxury features, and new products and updates from telepresence systems similar to Google’s Project Starline. We may also see more street corners and retail flagship stores outfitted with 3D displays that understand and interact with the environment in three dimensions.

#9 MicroLED breakthroughs brighten the future of mixed reality

Currently, mixed reality headsets predominantly use LCD and OLED display technologies, prized for their color accuracy and contrast. However, MicroLEDs are gaining attention as a superior alternative due to their higher brightness, improved energy efficiency, and potential for more compact, durable designs. High brightness and contrast are essential for overlaying virtual images onto real-world backgrounds in various lighting conditions, and energy efficiency extends usage time on typically battery-powered MR headsets. Moreover, MicroLEDs' fast response time and compact, scalable nature allow for clearer, more stable images and versatile design possibilities. The technology's durability and longevity, coupled with improved color and brightness uniformity, provide a more immersive and comfortable MR experience, making MicroLEDs a key component in the advancement of MR headset technology.

In 2024, MicroLED technology is poised for significant innovation, primarily through improved manufacturing techniques that promise to make these displays more affordable and scalable, enhancing their adoption across various devices. Advancements in color and brightness uniformity are expected to elevate image quality, which is crucial for applications demanding high visual fidelity. Additionally, the integration of MicroLEDs with flexible and transparent substrates is anticipated to revolutionize product design, enabling a new era of bendable and wearable devices with seamless, high-quality displays. These developments collectively signify a transformative phase for MicroLED technology, expanding its potential in consumer electronics, including MR headsets and other wearable devices.


Infrastructure

#10 WiFi-7 arrives to supercharge wireless speeds and unlock next-level spatial computing experiences

In 2024, WiFi-7, based on IEEE 802.11be technology, is set to revolutionize the wireless landscape as it becomes widely available, following the Wi-Fi Alliance's certification expected by the end of Q1. This new standard is entering the market with promises of unprecedented speeds of up to 40 Gbit/s, which is nearly five times faster than its predecessor, WiFi-6. With the introduction of advanced features like Multi-Link Operation (MLO), WiFi-7 is designed to enhance spectrum efficiency and tackle interference issues prevalent in congested areas. As devices with WiFi-7 support hit the market and routers receive updates to leverage the new standard, we can anticipate a substantial boost in wireless performance across various applications.

For mixed reality devices, these enhancements are set to unlock new levels of performance and user experience. The high data rates and reduced latency will allow for more detailed and complex virtual environments, making interactions smoother and more realistic. The increased capacity and efficiency mean that more devices can operate simultaneously without compromising the quality of the connection, perfect for multi-user MR scenarios. The improved reliability and coverage ensure a stable and consistent experience, which is crucial for maintaining immersion. As WiFi-7 becomes the backbone of wireless connectivity, MR headsets will become more responsive, immersive, and enjoyable, pushing the boundaries of what's possible in virtual and augmented realities.


Platforms and Tools

#11 AI continues to accelerate augmented reality development

As we look towards 2024, the convergence of AI and augmented reality will continue to accelerate immersive content creation. The advent of generative media and the rapid evolution of prompt-generated 3D assets and animation, in particular, is supercharging the prototyping and development of VR and AR. This is helped by AI-enabled co-pilots, which are fast becoming a necessary companion for new and advanced developers alike and are enabling a wider spectrum of no-code and low-code solutions. These advancements democratize content creation, moving it more into the hands of prosumers, akin to the revolution seen with Adobe Photoshop years ago. I expect that we will see further innovation in the way of 3D-generated media as well as further roll-out of co-pilot systems for XR developers.

AI will also help filter creation go mainstream. I expect we will see social media platforms follow TikTok's lead in releasing user tools to create and remix filters and lenses without the need to open a studio tool. This trend is set to bring a new level of personalization and immersion to user-generated content, further blurring the lines between virtual and physical realities.

Simultaneously, the adoption of new scanning technologies is set to revolutionize how the real world is virtualized for developer use. Techniques like MERF (Memory-Efficient Radiance Fields), SMERF (Streamable Memory-Efficient Radiance Fields), NeRF (Neural Radiance Fields), and Gaussian Splatting are making headlines, promising to simplify the creation of detailed 3D models for use in XR. This advancement is not just about creating spaces; it's about capturing moments and the essence of reality, enabling developers and creators to weave these elements seamlessly into AR experiences. Expect to hear a lot about these technologies and more solutions adopting and innovating with them.

#12 Mixed reality comes to mobile as smartphone AR levels up with advanced spatial awareness

This year will continue to see mobile AR grow up as it shifts into its more advanced form of mixed reality, made possible by advanced spatial awareness, precise positional tracking, and systems that use artificial intelligence and machine learning to allow digital content to behave in more sophisticated and realistic ways. Keep an eye on development platforms and creator tools as they roll out further mixed reality features around GAN-enabled filters, geospatial systems, semantic spatial understanding, and new tracking features. These new tools will enable developers and creators to build AR experiences where the real and digital worlds interact in much more complex, realistic, and immersive ways.
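As one small, concrete example of this kind of spatial awareness, a browser-based mobile AR session can anchor virtual content to detected real-world surfaces using the WebXR Hit Test module. This is a minimal sketch of the standard API, not any particular platform's tooling, and it assumes the session is started from a user gesture on a browser that supports immersive AR.

```js
// Minimal surface-anchoring sketch with the WebXR Hit Test module.
// Must be called from a user gesture (e.g. a button click).
async function startSurfaceAnchoredAR() {
  const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
  });
  const viewerSpace = await session.requestReferenceSpace('viewer');
  const localSpace = await session.requestReferenceSpace('local');
  // Cast a ray from the center of the viewer's gaze into the real world.
  const hitTestSource = await session.requestHitTestSource({ space: viewerSpace });

  session.requestAnimationFrame(function onFrame(time, frame) {
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      const pose = hits[0].getPose(localSpace); // where the ray meets a detected surface
      // Place or move a reticle / 3D object at pose.transform.position here.
    }
    session.requestAnimationFrame(onFrame);
  });
}
```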

#13 Widgets and ports offer developers an easy entry into spatial computing

Creating applications for MR headsets, like the Vision Pro and Meta Quest 3, often requires significant investment. To encourage more developers to create content for these platforms, both Apple and Meta are concentrating on strategies to simplify this development process.

Apple is simplifying the transition for developers to bring their existing iPad and iPhone apps to the new Vision Pro, as most apps will automatically be published on the Vision Pro App Store and run unmodified on visionOS, thanks to the shared frameworks between iOS, iPadOS, and visionOS. For the most part, Apple has taken a “baby steps” approach to bringing developers into spatial computing, encouraging them to treat 2D apps that float in space and adopt the standard visionOS system appearance as a milestone of success, with the added bonus of taking it one step further to add elements such as 3D content tuned for eye and hand input. To get more VR apps into the Vision Pro App Store, Apple has also teamed up with Unity to help existing VR applications port over to visionOS using Unity’s new PolySpatial platform.

While not yet open to developers, Meta announced its Augments offering in 2023, which could potentially be a lightweight way to get branded and existing app content into the headset. Augments are interactive, spatially aware digital objects that are persistently anchored in your physical space. Essentially, you can consider them spatial widgets. Meta has announced plans for branded augments such as an iHeartRadio music player, Beat Saber trophies, a portal for the Supernatural fitness app, and more. Augments are expected to launch in 2024.

#14 Cross-platform tools become necessary to succeed in the fragmented spatial computing category

As we move into 2024, the spatial computing landscape is becoming increasingly fragmented, with more players like Meta, Apple, Samsung, and Google entering the mixed reality headset category. This proliferation of devices, each with its own operating system and programming languages, presents a complex challenge for developers. They must now navigate a maze of different requirements and specifications to ensure their applications can function across a range of devices. This fragmentation extends beyond headsets to include the platforms powering users' other devices, notably iOS and Android smartphones, further complicating the development process. In this rapidly evolving ecosystem, the necessity for cross-platform development tools is becoming more pronounced. Platforms such as Unity, Snapdragon Spaces, and 8th Wall are becoming indispensable for developers looking to create applications that work across devices as they offer the ability to build once and deploy everywhere.

The significance of standards like OpenXR and open-source tools such as the Mixed Reality Toolkit (MRTK) in this fragmented landscape cannot be overstated. OpenXR serves as a unifying force, offering a common standard that can bridge the gap between different devices and platforms and is currently adopted by Magic Leap, Microsoft, Bytedance and Meta. This standard is critical for developers who aim to reach a broader audience without the need to rewrite code for each unique platform. Meanwhile, open-source tools like MRTK, which is being guided by Magic Leap, Microsoft, and Qualcomm, are driving innovation and collaboration, providing a shared resource that developers can contribute to and benefit from. I would expect to see further contributions and adoption of both OpenXR and MRTK this year.

#15 The immersive web continues to take shape, but the headset browser lags behind mobile as it waits on standards

The immersive web is steadily taking form, offering glimpses of a future where the browser becomes the destination for mixed reality experiences. Much of the progress in mobile WebAR is due to platforms like Niantic's 8th Wall, which offers robust AR capabilities built using standards-compliant JavaScript and WebGL. This year, we may continue to see further advancement in mobile WebAR with a specific focus on geospatial systems and improved spatial awareness, making WebAR feel more real and relevant to the environment. We may also see WebGPU, the more powerful successor to WebGL, begin to make its way to mobile, in addition to further updates to browser-based rendering frameworks like three.js and Babylon.js. I also expect to see activity from more players in the space as competition in WebAR heats up.
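For readers curious what that WebGPU shift looks like in code, here is a minimal, hedged sketch that feature-detects WebGPU and falls back to WebGL; the API calls follow the WebGPU and canvas specifications, and the fallback strategy is just one reasonable choice.

```js
// Prefer WebGPU when the browser ships it; fall back to WebGL otherwise.
async function pickGraphicsBackend(canvas) {
  if ('gpu' in navigator) {
    const adapter = await navigator.gpu.requestAdapter();
    if (adapter) {
      const device = await adapter.requestDevice();
      const context = canvas.getContext('webgpu');
      context.configure({ device, format: navigator.gpu.getPreferredCanvasFormat() });
      return { backend: 'webgpu', device, context };
    }
  }
  // WebGL2 (or WebGL1) remains the broadly supported baseline on mobile today.
  const gl = canvas.getContext('webgl2') || canvas.getContext('webgl');
  return { backend: gl ? 'webgl' : 'none', gl };
}
```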

While the mobile browser continues to become a more powerful place for augmented reality, the headset browser has yet to reach its full potential, primarily hindered by the ongoing development and adoption of the universal standards on which it is reliant. Meta is ahead with its Chromium-based Meta browser, which supports both immersive VR and immersive AR sessions along with robust features for creating spatially aware content using the WebXR browser standard. Developers can create immersive VR content for Safari on Vision Pro using WebXR, but it currently sits behind an experimental flag, and Safari has yet to enable WebXR’s immersive AR sessions at all. As Meta’s Presence Platform and Apple’s visionOS were built with privacy in mind, the passthrough APIs currently cannot access, view, or store images or videos of your physical environment from the device sensors; raw images from the device sensors are processed on-device. As such, third-party SDKs are not equipped to help advance the browser at a faster rate than standards implementation and adoption, as they have done on mobile.
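A quick way to see where a given headset or mobile browser stands is to feature-detect which WebXR session modes it exposes. The sketch below uses only the standard WebXR Device API; the optional features requested are examples and degrade gracefully when unsupported.

```js
// Report which immersive WebXR session modes this browser supports.
async function detectXRSupport() {
  if (!('xr' in navigator)) return { vr: false, ar: false };
  const [vr, ar] = await Promise.all([
    navigator.xr.isSessionSupported('immersive-vr'),
    navigator.xr.isSessionSupported('immersive-ar'),
  ]);
  return { vr, ar };
}

// Start a VR session (must be triggered by a user gesture, e.g. a button click).
async function enterVR() {
  const session = await navigator.xr.requestSession('immersive-vr', {
    optionalFeatures: ['local-floor', 'hand-tracking'], // ignored where unsupported
  });
  session.addEventListener('end', () => console.log('XR session ended'));
  return session;
}
```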

There is a tremendous amount of work being done in the web standards space. This year, expect advancements in the WebXR Device API for enhanced VR and AR experiences, WebGL and WebGPU for improved graphics, and emerging 3D format standards like USD and glTF, which provide more efficient and interoperable ways to transmit and render 3D content in web applications. All of these efforts are steps in the right direction to create a common language and set of protocols that will enable an open web for the spatial computing era.
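To ground a couple of those standards, here is a minimal sketch that loads a glTF asset with three.js and opts the renderer into WebXR presentation; the asset path is a placeholder, and the import paths assume a standard three.js addons setup.

```js
import * as THREE from 'three';
import { GLTFLoader } from 'three/addons/loaders/GLTFLoader.js';
import { VRButton } from 'three/addons/webxr/VRButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.01, 50);
const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true;                                 // opt in to WebXR presentation
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // adds an "Enter VR" button

scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1));

// Load a glTF asset (placeholder path) and place it in front of the viewer.
new GLTFLoader().load('model.glb', (gltf) => {
  gltf.scene.position.set(0, 1, -1.5);
  scene.add(gltf.scene);
});

// XR-aware render loop: three.js swaps in the headset's cameras during a session.
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```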


Consumer Solutions

#16 Reliving memories becomes a killer app for mixed reality headsets as it allows consumers to be present in their past

With smartphones now capable of capturing spatial media and mixed reality headsets able to replay them in an immersive manner, this year consumers will start to see the future of photos and videos. The shift from 2D to 3D media not only gives users a different perspective when viewing captured memories, but especially in the case of spatial video, it also unlocks the ability to travel back in time to be present in captured moments. Apple has started to tease the power of reliving memories on the Vision Pro through spatial video recorded on the iPhone 15, with many reviewers describing it as an extremely emotional experience that felt like being transported back in time. Media capture and consumption is one of the major use cases for the smartphone, and it will undoubtedly be the same for the next category of devices. I expect reliving memories to be a big draw for MR headsets, and I am eager to see what new social networks may be unlocked through a new relationship with user-generated media.

#17 3D movies and 360-degree video return as sports and entertainment become a big draw for MR headset use

3D movies and 360-degree (and 180-degree) videos are not new; in fact, many would argue they tried to be a thing and failed. But expect a resurgence of 3D and 360 content, due in large part to Apple’s Vision Pro. Apple’s launch event highlighted how Vision Pro is, among many things, Apple’s first “TV,” with a focus on 3D movies, unique viewpoints at sports games, and large floating screens and environments that offer a movie theater-like experience. Immersive media consumption is old hat for VR devices. Meta Quest has a number of apps that offer this, including Netflix and Bigscreen, but it may hit differently on the Vision Pro thanks to its 4K display per eye, low-latency eye tracking, and wider field of view. Sports and entertainment is already a key vertical for Apple, and it has its vast media catalog via Apple TV+ and Apple Music, partnerships with the likes of Disney+ and the NBA, as well as the technology it acquired from the likes of NextVR to bring to the table.

Over the holidays, we saw first-hand how powerful a draw top-tier entertainment content can be in getting users into the headset, as Swifties found out that they could watch The Eras Tour on Prime Video in the Meta Quest headset. Arguably, Apple is in the best position to bring Taylor Swift-level content to the Vision Pro, which could be one of its killer apps for adoption. In addition, mixed reality brings a new dynamic to the sports and entertainment headset offerings. Whereas in VR, experiences could only connect viewers as avatars, mixed reality viewing allows you to get all the benefits of immersive viewing while still being able to see and easily communicate with the people in the physical room you may be watching with. I expect we will see a number of big moves in MR this year from movie studios, record labels, sports franchises, and streaming apps as they explore this new channel to engage audiences.

#18 Hyper-realistic avatars bring presence and emotion to spatial computing experiences

Hyper-realistic avatars will become more prevalent in 2024. We will see avatar systems leverage the latest advancements in graphics technology, 3D modeling, and AI-driven animation to get us closer to avatars that are nearly indistinguishable from real humans. Keep an eye on Meta’s Codec Avatars, Apple’s Personas, and Google’s Project Starline, just to name a few. In addition to more realistic features, avatars will also be enhanced with expressive AI, which makes them more capable of conveying a wider range of emotions and reactions by interpreting a user's real-time expressions, such as sticking out your tongue. These avatars will play an increasing role in our professional and personal lives, attending virtual meetings and socializing in digital spaces, with evolving norms and etiquette around their use. They will also present new opportunities for personalization and commerce, as we have seen within the gaming space with Roblox or in the social media space with Bitmoji.

#19 Immersive technologies further embed themselves into marketing and advertising strategies

The use of AR in marketing and retail strategies has grown consistently over the past couple of years, especially due to the prevalence of WebAR and filters and effects in social networks. This year, we will see a greater focus on immersive technologies as part of marketing and advertising strategies, driven by a few key factors, including more advanced mobile AR made possible by AI, a growing collection of proof points and case studies that show ROI, and the hype around augmented reality and mixed reality driven by Apple’s entry into the space.

In addition to smartphone AR, agencies and brands will also be looking at other spatial computing opportunities that make use of some of the latest technology available, including MR headsets such as the Vision Pro, interactive billboards, magic mirrors, and even holographic displays like those from Looking Glass, to cut through the noise and demonstrate their forward thinking on this next wave of computing. Much of this activity will be aimed at experiential marketing or built solely to capture video for sharing on social channels.

#20 Mixed reality energizes the fitness industry as a new opportunity to get people moving at home and at the gym

Fitness is not a new category for immersive headsets, but this year we will see a surge of activity in the fitness vertical, driven by mixed reality and AI-enhanced capabilities to track and estimate body movements. We know that fitness is a key focus for both Meta and Apple. Meta acquired Within’s wildly successful Supernatural fitness app, and Apple has its own Fitness+ subscription service. It is yet to be determined whether the Vision Pro is suited for vigorous movements, based on reports that the device is quite heavy on the head, not to mention made up of metal and glass. But we do know that mixed reality on Quest 3, a lighter and more fitness-friendly product, has attracted the attention of fitness industry players, including Zumba, Xponential Fitness (Club Pilates, PureBarre, and StretchLab studios), and Les Mills’ BODYCOMBAT, to name a few. Keep an eye out for more fitness offerings in headset app stores, along with accessories aimed at enriching your VR/MR fitness experience, such as treadmills and sensor-infused sports equipment like heavy bags, paddles, bats, and more.

While VR and MR fitness apps bring a new type of workout into your home, I expect that we will also start to see MR headsets make their way into the gym to enhance classes and gym-goers' time on equipment like stationary bikes and treadmills. This includes the rise of immersive gyms and the marketing of fitness as a key benefit at LBVR (location-based VR) venues.


Enterprise Solutions

#21 Spatial computers give us a glimpse into the future of work

Despite Meta and Apple having a plethora of consumer-centric use cases for their mixed reality headsets, it is clear that one key focus is actually replacing the PC and monitor for professionals at work. Instead of the one small monitor you have on your laptop or for your PC, MR headsets can offer you an infinite number of screens you can shrink and grow as needed to do your work. You can choose to do this in a completely simulated environment, giving you more focus, or in mixed reality, which will enable you to be more social with your coworkers. Beyond being a better monitor, Apple and Meta have also highlighted remote collaboration and telepresence as further ways their spatial computers will disrupt how we work. Apple’s Vision Pro unveiling showcased demos with Freeform and FaceTime, complete with its avatar system, Personas, while Meta’s Horizon Workrooms continues to evolve, now supporting non-headset video calls from Zoom. I would keep an eye out for enterprise programs, fleet management tools, and enterprise-focused apps this year, both directly from the manufacturers and also from third-party organizations and startups.

#22 Higher-end MR headsets with improved passthrough are employed for enterprise training simulations

While today’s consumer MR headsets boast good color passthrough, to truly enable training simulations within an organization these devices need to pack an even more powerful punch, with a key focus on resolution and security. Varjo is leading the way with its XR-4 Focal Edition and Secure Edition headsets. The Focal Edition headset, at over $10,000 USD, uses a combination of eye-tracking, LiDAR depth sensors, and auto-focus cameras to bring passthrough quality close to “human-eye” resolution. This quality of passthrough is important for scenarios where clarity of real-world objects is vital, such as pilots training in mixed reality in real-world cockpits. This year, I expect we will see organizations get more serious about adopting higher-end MR machines as part of their training strategies. I also expect we will see further players focus on this segment of the market, competing on resolution, security, and comfort while sitting at a price the enterprise can entertain.

#23 AR glasses built for the enterprise augment healthcare, manufacturing, and the military

While we have a while to wait for optical see-through AR glasses to truly be ready for consumer use, devices from Magic Leap and Microsoft, which are in their second generation, have already begun to disrupt many industries. These devices are purpose-built for the enterprise and are currently deployed in manufacturing, healthcare, public sector/government, architecture, engineering, and construction (AEC), and the military, to name a few. Both Magic Leap and Microsoft are committed to advancing their AR glasses offerings. This year, I would expect to see further software updates for these devices that improve tracking and spatial awareness, enrich their enterprise capabilities through improved security and device management, and provide developers with further tools to build solutions. We may even see (or at least learn more about) the next generation of these devices. In addition, I expect we will hear more success stories that measure and prove the ROI of using ML 2 and HoloLens across industries, as well as see solution providers gain the necessary funding to grow further within their target markets.


Ethics & Human Impact

#24 Society begins to reevaluate privacy and social contracts in the age of ubiquitous cameras

One of the biggest challenges for new technology adoption is not fashion and style, price point, or even content—it is cultural fit. As spatial computing technology integrates more deeply into our daily lives, it won’t just change the way we interact with the world; it will also challenge our long-standing cultural norms and values, particularly around privacy and consent. Societies worldwide have different thresholds and expectations when it comes to privacy, and the advent of technology, especially technology heavily rooted in the ubiquity of sensors, can potentially infringe upon these expectations. This necessitates a cultural recalibration. People, communities, and legal systems will need to negotiate and redefine what is considered acceptable and respectful in this new era of spatial computing. This isn't just a technological or legal challenge; it's a cultural one. It will require a collective reassessment of how we balance innovation with the right to privacy and the norms that govern social interactions.

Discourse and debate around technology and cameras are not new. We saw major backlash when the feature phone got a camera, as it was banned in restaurants and gyms back in 2008. We saw this again in 2013 when Google Glass Explorers were labeled “glassholes,” a term that aimed to reflect society’s feeling that camera-equipped glasses were inconsiderate or invasive. And just at the tail end of 2023, the NY Times published a story on “The New Age of Surveillance” with its focus on the new Meta Ray-Ban. Meta said in a statement that privacy was top of mind when designing the glasses. “We know if we’re going to normalize smart glasses in everyday life, privacy has to come first and be integrated into everything we do,” the company said.

2024 will push this topic even further as more of us begin to let spatial computers into our homes and put AI-enabled assistants in our ears. In doing so, we will inch closer to the tipping point where the value of ubiquitous sensors far outweighs today’s societal norms. This will trigger a cultural readjustment, which may mark “the death of privacy as we know it.”


Thank you for reading!

Have feedback or questions, or want to collaborate? Contact me at [email protected].

Subscribe to my newsletter and follow me on LinkedIn to get regular posts on augmented reality, spatial computing, and the metaverse, including my “24 Augmented Reality Trends for 2024,” which will be released in early January 2024.

This article was written as an independent piece. The ideas and opinions expressed in this article are mine and do not represent any past, present, or future organization with which I may be affiliated. All images were generated using Midjourney.
