Using data to improve XR apps' performance, retention, and presence

Today we are joined by Tony Bevilacqua, CEO at Cognitive 3D, to understand how spatial analytics data can be used to improve the user experience and app performance. For the weekly product spotlight, I will share an app that finally lets you visualize your WiFi signal in 3D.

Subscribe to the newsletter and get a list of 200+ MR apps

Interview with Tony Bevilacqua

In this interview, Tony Bevilacqua, founder of Cognitive 3D, explores the role of spatial data in understanding and enhancing user experience in XR apps and games. Tony discusses the challenges in utilizing spatial data, some key performance metrics, and the strategic implementation of privacy frameworks in immersive technologies. He also delves into the impact of eye-tracking and biometric data, offering a glimpse into the future of XR data analytics and its potential to reshape user experience design across various industries.

How does capturing 3D data enhance your understanding of user interactions compared to traditional 2D data?

Tony Bevilacqua: Capturing 3D data is vital because it provides insights that traditional 2D data cannot offer. For instance, in VR and AR, understanding how users interact with the environment around them requires a spatial perspective that 2D data lacks. This spatial data helps us analyze how users move through and interact with virtual spaces, which is critical for applications like training simulations, consumer research, or product design. It allows us to measure engagement and how elements within those spaces draw and retain user attention, which is incredibly valuable for making informed decisions on design and functionality.

How do you visualize spatial data effectively to aid developers in understanding user interactions?

Tony Bevilacqua: Visualization of spatial data in 3D is key. We use a Matterport-style dollhouse view in our Scene Explorer tool, where data is overlaid within the virtual environment that developers are already familiar with. This approach allows them to see user interactions and behaviors directly within the context of their environments. It makes identifying problem areas and understanding user behavior intuitive because they can observe how interactions unfold in the virtual space they've created.

Example of a dollhouse view for a VR training scenario
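As a rough illustration of the kind of data behind such an overlay, here is a minimal sketch (not Cognitive 3D's Scene Explorer itself) that bins recorded head positions onto a floor-plane grid; the sample positions and cell size are assumptions made up for the example.

```python
# Illustrative sketch: bin head positions onto a floor-plane (x, z) grid.
# A dollhouse-style overlay would colour each cell by its visit count.
# Cell size and sample positions are assumed values, not a real data format.
from collections import Counter

CELL_SIZE = 0.5  # metres per floor cell (assumed resolution)

def floor_heatmap(positions):
    """positions: iterable of (x, y, z) head positions sampled during a session."""
    counts = Counter()
    for x, _y, z in positions:
        counts[(int(x // CELL_SIZE), int(z // CELL_SIZE))] += 1
    return counts

session = [(1.2, 1.6, 0.4), (1.3, 1.6, 0.5), (3.0, 1.7, 2.1), (1.25, 1.6, 0.45)]
for cell, visits in floor_heatmap(session).most_common():
    print(cell, visits)  # most-visited floor cells first
```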

Can you explain some key performance metrics for XR apps and how you optimize them?

Tony Bevilacqua: Performance metrics are crucial because they directly impact user comfort and the overall immersive experience. At Cognitive 3D, we focus on metrics like frame rates, GPU and CPU usage, and where frame drops occur within a virtual environment. This information allows developers to pinpoint and optimize the areas of their applications that are underperforming. By understanding spatial relevance—where these issues occur within the virtual environment—we can provide actionable insights that help developers improve performance, ensuring a smoother and more engaging user experience.
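To make the idea of spatially relevant performance data concrete, here is a minimal sketch that tags over-budget frame times with the headset position where they occurred; the 72 Hz frame budget, sample format, and cell size are assumptions for illustration, not the Cognitive 3D SDK.

```python
# Illustrative sketch: attribute dropped frames to regions of the environment.
# The frame budget, cell size, and sample data are assumed, not a real SDK output.
FRAME_BUDGET_MS = 1000 / 72  # ~13.9 ms per frame on a 72 Hz headset (assumption)
CELL_SIZE = 1.0              # metres per region

def frame_drop_regions(samples):
    """samples: list of (frame_time_ms, (x, y, z)) pairs captured every frame.
    Returns regions sorted by how many over-budget frames occurred in them."""
    drops = {}
    for frame_time_ms, (x, y, z) in samples:
        if frame_time_ms > FRAME_BUDGET_MS:
            region = (int(x // CELL_SIZE), int(y // CELL_SIZE), int(z // CELL_SIZE))
            drops[region] = drops.get(region, 0) + 1
    return sorted(drops.items(), key=lambda item: item[1], reverse=True)

samples = [(12.0, (0.5, 1.6, 0.2)), (19.5, (4.2, 1.6, 3.1)), (21.0, (4.4, 1.6, 3.0))]
print(frame_drop_regions(samples))  # the region around (4, 1, 3) stands out
```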

What insights have you gained from using eye-tracking and biometric data in immersive technologies?

Tony Bevilacqua: Eye-tracking and biometric data provide profound insights, especially in the enterprise sector. For example, in training and simulation, eye-tracking can show us how users “scan” the environment, enhancing our understanding of their situational awareness. This data is instrumental in sectors like law enforcement training, where understanding how a trainee observes an environment can be crucial. However, consumer applications have yet to see widespread adoption of these technologies due to privacy concerns and the invasive nature of the data collection.
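As an illustration of how scan behaviour can be quantified, the sketch below turns a sequence of gaze samples (each labelled with the object being looked at) into per-object dwell times; the object names and the sampling interval are hypothetical, not a real eye-tracking pipeline.

```python
# Illustrative sketch: turn labelled gaze samples into per-object dwell times.
# Object names and the 10 ms sampling interval are hypothetical assumptions.
SAMPLE_INTERVAL_S = 0.01  # gaze sampled every 10 ms (assumption)

def dwell_times(gaze_samples):
    """gaze_samples: list of object names, one per gaze sample (None = no hit).
    Returns seconds of dwell per object, longest first."""
    seconds = {}
    for target in gaze_samples:
        if target is not None:
            seconds[target] = seconds.get(target, 0.0) + SAMPLE_INTERVAL_S
    return sorted(seconds.items(), key=lambda item: item[1], reverse=True)

trace = ["doorway", "doorway", "suspect", "suspect", "suspect", None, "window"]
print(dwell_times(trace))
# e.g. a trainee who never dwells on the doorway may have missed a potential threat
```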

How do you address privacy concerns while collecting and using data in XR environments?

Tony Bevilacqua: Addressing privacy concerns is fundamental. At Cognitive 3D, we've developed the XR Privacy Framework to give end-users control over their data. This framework includes a consent window that informs users about the data being collected and the purposes for which it is used. Users can choose to opt-in or opt-out, and our system respects these preferences by adjusting data collection accordingly. It's crucial for developers to be transparent and to adhere to OEM policies and privacy laws, ensuring that data use is ethical and compliant.
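As a minimal sketch of how an opt-in/opt-out choice can shape what gets recorded (an illustration only, not the actual XR Privacy Framework implementation), the example below filters telemetry events against the categories a user has consented to; the category names and event shapes are assumptions.

```python
# Illustrative sketch: only record telemetry categories the user has opted into.
# Category names and event shapes are assumptions, not the XR Privacy Framework API.
from dataclasses import dataclass, field

@dataclass
class ConsentSettings:
    """Which data categories the user agreed to in the consent window."""
    allowed: set = field(default_factory=set)  # e.g. {"performance", "spatial"}

def record_event(event, consent, sink):
    """Append the event to the sink only if its category was opted into."""
    if event["category"] in consent.allowed:
        sink.append(event)

consent = ConsentSettings(allowed={"performance"})  # user opted out of gaze data
collected = []
record_event({"category": "performance", "frame_time_ms": 14.2}, consent, collected)
record_event({"category": "gaze", "target": "menu_button"}, consent, collected)
print(collected)  # only the performance event was kept
```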

How do you ensure the XR Privacy Framework aligns with global privacy standards and user expectations?

Tony Bevilacqua: We designed the XR Privacy Framework to be robust and adaptable to various global privacy standards. The framework functions similarly to a 'Do Not Track' feature but is built specifically for XR applications. It provides clear, actionable choices for users regarding their data, ensuring transparency and control. We regularly update the framework to stay compliant with international privacy laws and best practices, maintaining trust with users and developers alike.

Can you share a use case where 3D data has significantly improved a training simulation?

Tony Bevilacqua: One standout use case involved a law enforcement training program where 3D data allowed for detailed analysis of how trainees interact with their environment under stress. By using eye-tracking and spatial analysis, we could assess how effectively participants observed potential threats and navigated complex scenarios. This depth of data enabled trainers to tailor feedback and instructional content to improve decision-making skills significantly in real-life situations.

What challenges do you face when implementing mixed reality applications, considering the unpredictable nature of real-world environments?

Tony Bevilacqua: Mixed reality introduces unique challenges because it combines virtual elements with the real world, which is inherently unpredictable. We address this by providing developers with tools that can capture essential data about the room's layout and identify obstacles like furniture. This helps in creating experiences that are adaptive to each user's physical space, enhancing safety and user engagement.
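As a rough sketch of the kind of adaptation this enables (the obstacle data and placement logic are illustrative assumptions, not a specific SDK), the example below rejects candidate spawn points for virtual content that fall inside detected furniture bounds.

```python
# Illustrative sketch: pick a spawn point for virtual content that avoids
# obstacles detected in the user's room. Bounds and candidates are made-up data.
def inside(point, box):
    """box: ((min_x, min_z), (max_x, max_z)) footprint of a detected obstacle."""
    (min_x, min_z), (max_x, max_z) = box
    x, z = point
    return min_x <= x <= max_x and min_z <= z <= max_z

def pick_spawn_point(candidates, obstacles):
    """Return the first candidate that is not inside any obstacle footprint."""
    for point in candidates:
        if not any(inside(point, box) for box in obstacles):
            return point
    return None  # no safe placement found; the app should fall back gracefully

obstacles = [((0.0, 0.0), (1.2, 0.8))]          # e.g. a table detected by the headset
candidates = [(0.5, 0.4), (2.0, 1.5)]           # candidate (x, z) placements
print(pick_spawn_point(candidates, obstacles))  # -> (2.0, 1.5), clear of the table
```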

What future developments do you see in the field of XR data analytics, and how might they impact user experience design?

Tony Bevilacqua: Future developments in XR data analytics are likely to involve more advanced machine learning models that can predict user behavior and preferences more accurately. This will enable even more personalized and engaging user experiences. As these analytics become more sophisticated, we'll see XR applications that are not only more intuitive but also more responsive to individual user needs, potentially revolutionizing fields like education, healthcare, and entertainment.

Check out the full interview on your favorite platform.


Product Spotlight: SeeSignal

SeeSignal is an app created by the team at BadVR that focuses on spatial data visualization. It is freely available on Quest and allows you to visualize your WiFi signal with the help of mixed reality.

It supports hand tracking and offers a variety of settings to change the look of the signal. I have tried it personally, and I have to say that the visualization is quite accurate; it helped me find some dead spots in my house.

https://www.meta.com/en-gb/experiences/4793438284066128/

Subscribe to the newsletter to get a list of 200+ MR Apps


That’s it for today

See you next week
