Ethical Considerations in Spatial Computing: Privacy, Security and Apple’s Vision Pro Headset
Spatial computing has been part of the XR development pipeline for a long time, but Apple’s announcement of its Vision Pro headset presented it in a slightly different context than developers are used to. Apple is positioning the entire Vision Pro brand around the term “spatial computer,” focusing on user utility and integration with human-centered design. As companies innovate daily, so do their marketing and branding. There are other considerations, however.
Emerging technology companies like Apple, Snapdragon, and Magic Leap develop at a rapid rate, but best practices, legislation, and policies usually trail behind, creating vulnerabilities and marketing challenges. The influx of new 3D headsets and 3D development tools is making 3D asset management even more important. Companies like echo3D are planning to build their 3D asset management features with companies like Apple in mind.
Register for your free echo3D account.
Click on this link or scan this QR code to see a dinosaur in AR!
Data Usage
These new headsets require much more than button inputs to work, which brings us to our ethical dilemma. Apple has been leveraging biometric data for years as part of its human-centered approach to product development. It has access to fingerprint and facial recognition data, both highly sensitive forms of personally identifiable information. Both are stored natively on the device in the Secure Enclave, a dedicated, secure subsystem of Apple’s system-on-chip, rather than in the cloud, to prevent them from being accessed by bad actors.
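For context, here is a minimal sketch, using Apple’s public LocalAuthentication framework, of how an app requests a Face ID or Touch ID check today. The point is that the biometric match happens inside the Secure Enclave and the app only receives a pass/fail result, never the underlying fingerprint or face data. The function name and prompt string here are illustrative.

```swift
import Foundation
import LocalAuthentication

// Sketch: ask the system to verify the device owner with biometrics.
// The app never sees the biometric data itself, only a Bool outcome.
func authenticateUser(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        // Biometrics unavailable or not enrolled; fall back to a passcode flow or deny.
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```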
Given Apple’s reputation for protecting user data and privacy, it is likely that the company will come up with a reasonable way to address ethical concerns about spatial computing. Some argue that users are not entitled to know what happens with their data; others argue they should outright own it.
TYPES OF DATA
Location Data
Spatial computing relies on physical location data points, typically from GPS, to map its digital experiences onto the real world. Many augmented reality, virtual reality, and location-based experiences depend on these data points to run as intended. For example, Pokemon Go requires an accurate user location to deploy its game; without this information in real time, the game would essentially be unplayable.
Spatial computing can provide very useful features like real-time traffic, weather conditions, and the distribution of people in a given space. According to Marcin Frackiewicz at TS2, these data points can be used to inform policies on public safety, resource allocation, and infrastructure investment. These decisions are valuable, but without the informed consent of users, they fall into an ethical gray area. Examples of location data include longitude and latitude, IP addresses, mobile ad IDs, timestamps, altitude, and elevation.
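On the developer side, collecting less is often possible. Below is a minimal sketch, assuming Apple’s standard Core Location APIs, of a location-based experience that asks only for “when in use” permission and reduced (approximate) accuracy; the class name is invented for illustration.

```swift
import CoreLocation

// Sketch: request only the location precision the experience actually needs.
final class ExperienceLocationProvider: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        // Approximate location is often enough to anchor a city-scale AR scene
        // and avoids building a precise movement log of the user.
        manager.desiredAccuracy = kCLLocationAccuracyReduced
    }

    func start() {
        manager.requestWhenInUseAuthorization()  // avoid "always" unless truly required
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        print("Approximate position: \(latest.coordinate.latitude), \(latest.coordinate.longitude)")
    }
}
```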
In a whitepaper from USC on the ethical implications of location-based experiences, the authors examine the unknown and the future: “While this may appear harmless at first, these data points, when summed across days of travel, generate unique geographical footprints for each individual. The root of privacy concerns lies with this user location data and more specifically with who has access to it.” These data logs can reveal a user’s home, the places they frequent, and their general routine. Much of the tension over data and privacy comes down to who has access to it and who controls it. In the wrong hands, this information can be fatal in worst-case scenarios and merely annoying in the hands of advertisers.
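To make the footprint concern concrete, here is a hypothetical sketch (invented types and thresholds, not any vendor’s code) of how even a naive script could guess someone’s home from raw location logs by bucketing overnight GPS fixes into a coarse grid and picking the most common cell.

```swift
import Foundation

// Sketch only: illustrates why raw location logs are sensitive.
struct LocationFix {
    let latitude: Double
    let longitude: Double
    let timestamp: Date
}

func likelyHomeCell(from fixes: [LocationFix]) -> (lat: Double, lon: Double)? {
    let calendar = Calendar.current
    var counts: [String: (Int, Double, Double)] = [:]

    for fix in fixes {
        let hour = calendar.component(.hour, from: fix.timestamp)
        guard hour >= 22 || hour < 6 else { continue }  // overnight fixes only
        // Round to roughly 1 km cells (two decimal places of latitude/longitude).
        let lat = (fix.latitude * 100).rounded() / 100
        let lon = (fix.longitude * 100).rounded() / 100
        let key = "\(lat),\(lon)"
        let entry = counts[key] ?? (0, lat, lon)
        counts[key] = (entry.0 + 1, lat, lon)
    }

    // The most frequently visited overnight cell is a strong guess at "home".
    return counts.values.max { $0.0 < $1.0 }.map { (lat: $0.1, lon: $0.2) }
}
```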
Biometric Data
The 3D immersive medium provides access to more data points on our physical bodies: eye tracking, gait detection, facial tracking, emotional sentiment analysis, galvanic skin response, EEG, EMG, and ECG. From these metrics, a user’s emotional and physical reactions to content can be summarized and matched to the millisecond. Users involuntarily provide useful feedback to providers through their biometrics. This is a problem because biometric data cannot be retrieved once exposed. Oculus’ privacy policy even states that their headsets collect “information about your environment, physical movements, and dimensions when you use an XR device.”
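As a sense of scale, Apple’s existing ARKit face-tracking API already reports dozens of facial blend shapes and a gaze estimate every frame on supported devices. The sketch below is illustrative only and makes no claim about what the Vision Pro exposes to third-party apps; the class name is invented.

```swift
import ARKit

// Sketch: reading per-frame facial signals from ARKit face tracking.
final class FaceSignalReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // One of 50+ blend-shape coefficients updated every frame.
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            // Estimated gaze target in face coordinate space.
            let gaze = face.lookAtPoint
            print("smile: \(smile), gaze: \(gaze)")
        }
    }
}
```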
In a whitepaper called “The Ethical and Privacy Implications of Mixed Reality,” written by members of Voices of VR, Mozilla, and Magic Leap, the authors share, “There may be some amazing insights that could be gained by capturing and analyzing patterns in our biometric data, but there are also many risks associated with how our biometric data could be used against us. Part of the ethical dilemma around biometric data is deciding how much of it should be considered ephemeral and not recorded versus how much we want to allow technology to imperfectly attempt to quantify our emotional sentiment to a variety of input stimuli as part of a permanent record about us.”
CONCERNING OUTCOMES
Surveillance and Control
Another major concern in spatial computing is the use of this technology for surveillance and control. The Apple Vision Pro headset is intended to be worn for long periods while carrying out various tasks, including app usage, telepresence, and gameplay. Because XR headsets have so far been used mostly for gaming, the data points they collect have been largely limited to gameplay.
However, with Apple offering the Vision Pro as a utility headset, there is far more data available to collect, such as social media behavior, shopping habits, and overall usage patterns. The more data points that are available, the easier it is to predict a user’s future behavior or influence their current behavior. This is a slippery slope that is ethically borderline. User experience design focuses on understanding how humans interact with products, and many of the apps used today, especially social media apps, are optimized based on these data points.
Click this link or scan this QR code to see this lounge chair in AR!
Discrimination and Bias
Another concern is the opportunity for bias and discrimination. According to Statice, facial recognition algorithms have produced biased results based on gender and race, which could lead to unfair outcomes in a variety of contexts. There is always the possibility that bad actors will look to do harm once they obtain user information, for example by targeting certain groups with tailored messaging or excluding them from it entirely. These tactics are particularly potent in marketing, politics, and campaigns of social injustice.
This data can also be used to build discriminatory or biased algorithms that reinforce existing power imbalances or sideline critical issues, continuing to exploit and hinder some groups more than others. Large corporations are desperate to keep capturing our attention, and it is nerve-wracking to consider what they are doing, and will continue to do, with this information.
To create a protected experience for users, there must be transparency and options for users to determine what data can be collected and who can access it. Removing this fear of the unknown can make the spatial computing experience more comfortable. Once information is out there, it cannot be undone, so it is critical that personal information be managed from the beginning.
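One way to think about that transparency is a per-category consent model that an experience checks before collecting anything. The sketch below is hypothetical; the type and category names are invented for illustration and are not an existing API.

```swift
import Foundation

// Hypothetical data categories a spatial app might want to collect.
enum DataCategory: String, CaseIterable {
    case location, eyeTracking, handTracking, roomGeometry, usageAnalytics
}

// Hypothetical user-controlled consent settings, persisted as Codable.
struct ConsentSettings: Codable {
    private var granted: Set<String> = []

    mutating func setConsent(_ category: DataCategory, allowed: Bool) {
        if allowed { granted.insert(category.rawValue) } else { granted.remove(category.rawValue) }
    }

    func isAllowed(_ category: DataCategory) -> Bool {
        granted.contains(category.rawValue)
    }
}

// Usage: check consent before collecting anything in that category.
var settings = ConsentSettings()
settings.setConsent(.location, allowed: true)
if !settings.isAllowed(.eyeTracking) {
    // Skip recording gaze data entirely.
}
```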
Register for your free echo3D account.
#3Dassetmanagement #3Dcontentmanagement #3DCMS #3Dcompression #3Dwarehouse #contentdeliverynetwork #3dcontentdeliverynetwork #ar #augmentedreality #vr #virtualreality #3d #metaverse #softwaredevelopers #cloudcomputing #softwareengineers #mixedreality #mr #extendedreality #xr #3dassetmanagement #3dassetmanagementsoftware #3dassetmanagementopensource #3dassetlibrary #3ddigitalassetmanagement #3dasset #3dassets #spatialcomputing #meta #apple #visionpro #echo3D
Headline photo credit: Freepik
echo3D?(www.echo3D.com) is a 3D asset management platform that enables developers to manage, update, and stream 3D, AR/VR, Metaverse, & Spatial Computing content to real-time apps and games.