[Image: Star Trek: Into Darkness viewscreen interface — Jorgeonline.me]

A continuation of my posts on AR/VR interface.

As I am a visual person, using visual descriptions of ideas makes sense to me, especially when discussing VR or AR interfaces. In a previous post I defined what the Stage is within my terminology. I'd like to discuss how I see the elements on the Stage and how they interact with each other.

I'm envisioning the Stage as an invisible sphere that surrounds the user at infinite distance. It has a horizon that keeps the stage level with the world horizon, but it also has a horizon in the sense of a controlled distance within which AR or VR elements can be seen and interacted with. At this high level, the user can dictate the extent of their VR/AR experience. There is also a differentiation between elements the user trusts and those they do not, both in their visibility horizon and in their ability to affect the user's stage.
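One way to picture the trusted/untrusted distinction is as two separate visibility horizons on the Stage. The following is a minimal sketch of that idea; the class and field names (`Stage`, `Element`, `trusted_horizon`) and the distances are my own illustrative choices, not anything defined in the post.

```python
from dataclasses import dataclass

@dataclass
class Element:
    """A single AR/VR element placed on the user's stage."""
    name: str
    distance: float  # metres from the user
    trusted: bool    # has the user granted this element trust?

@dataclass
class Stage:
    """The invisible sphere around the user, with a separate
    visibility horizon for trusted and untrusted elements."""
    trusted_horizon: float = 100.0   # trusted elements visible farther out
    untrusted_horizon: float = 10.0  # untrusted elements kept much closer

    def visible(self, e: Element) -> bool:
        # The user dictates the extent of their experience by
        # setting these horizons; the Stage enforces them.
        limit = self.trusted_horizon if e.trusted else self.untrusted_horizon
        return e.distance <= limit
```

With the defaults above, a friend's pin 50 m away would be shown, while an untrusted element at the same distance would be held back until it came much closer.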

The AR system, as I see it, operates as a combination of spheres within spheres, each with its own horizon control for distance and its own control over elements. At the highest level, each sphere must follow the rules laid out by the Stage.

The world sphere is the real world, never a digital representation of it. It is the outermost sphere and contains the information used by the AR system to position the theatre within the world. It relies on GPS, image recognition, attitude and altitude, yaw, pitch, roll, time and date, landmarks, Wi-Fi, and so on. The point of the world sphere is to create, at the highest level, the basis for interaction between the real world and the AR system, and to accurately position the user's theatre.
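The signals the world sphere draws on could be bundled into a single context object that the inner spheres consume. This is purely an illustrative sketch: `WorldContext` and every field on it are hypothetical names, and the speed estimate is a deliberately crude flat-earth approximation.

```python
from dataclasses import dataclass

@dataclass
class WorldContext:
    """Illustrative bundle of the inputs the world sphere uses to
    position the user's theatre. All field names are hypothetical."""
    gps: tuple                  # (latitude, longitude) in degrees
    altitude_m: float
    yaw: float                  # degrees
    pitch: float
    roll: float
    timestamp: float            # seconds since epoch
    recognized_landmarks: list  # from image recognition
    wifi_ssids: list            # nearby networks, for coarse positioning

    def speed_estimate(self, previous: "WorldContext") -> float:
        """Crude speed (m/s) from two consecutive GPS fixes."""
        dt = self.timestamp - previous.timestamp
        if dt <= 0:
            return 0.0
        dlat = self.gps[0] - previous.gps[0]
        dlon = self.gps[1] - previous.gps[1]
        # ~111 km per degree of latitude; good enough for a sketch
        metres = ((dlat * 111_000) ** 2 + (dlon * 111_000) ** 2) ** 0.5
        return metres / dt
```

A speed estimate like this is the sort of thing the world sphere could pass down when deciding, later in the post, that the user is in a moving car.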

The Player layer is the next layer of the onionskin; it handles the user's interaction with other users. It dictates how far away other users can see them from. It doesn't actually display any elements.

The Actor layer comes next. Although it resides closer to the user, elements presented and controlled by this layer can fall behind world-layer items as well as player-layer items. Elements displayed on this layer can traverse the user's whole visual frame, but their opacity and size are controlled by the Stage.

The in-app command layer is static relative to the user's movement. It is much like a heads-up display: always persistent, and restricted from ever blocking a certain area of the user's vision. It moves with the user's head, but not with the user's gaze.

The app command layer can have permission to block any area of the user's vision. The Stage still controls the opacity of its elements and its ability to display them.

The system command layer is the only layer with the ability to obliterate the user's view of the world; it can replace the stage completely. Access to this layer and its functionality requires logging out of the AR program.

The higher the likelihood of causing injury to the user, the more each layer is restricted by the other layers' permissions.
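That risk-based restriction could be expressed as the Stage scaling down how much of the visual frame each layer may cover as assessed risk rises. The sketch below is one possible reading of the rule; the layer names echo the post, but the coverage numbers, the linear scaling, and the `injury_risk` score are all my own assumptions.

```python
# Fraction of the visual frame each layer may cover at zero risk.
# Values are illustrative, not from the post.
LAYER_BASE_COVERAGE = {
    "actor": 1.0,           # may span the whole visual frame
    "in_app_command": 0.3,  # HUD-like; never blocks the reserved area
    "app_command": 1.0,     # may block any area, with permission
    "system_command": 1.0,  # only layer allowed to replace the stage
}

def allowed_coverage(layer: str, injury_risk: float) -> float:
    """Scale a layer's permitted coverage down as injury risk
    (0.0 = safe, 1.0 = maximal risk) goes up. The system command
    layer is exempt, since reaching it already requires logging
    out of the AR program."""
    base = LAYER_BASE_COVERAGE[layer]
    if layer == "system_command":
        return base
    return base * (1.0 - injury_risk)
```

So at a risk of 0.5, the actor layer would be limited to half the frame, while the system command layer keeps its full coverage because the user is no longer inside the AR program when using it.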

If you have gotten this far, you are probably wondering how all of this fits together in use.

I’ll give a small example.

When I turn on my AR headset, I am taken into my AR desktop application, from which I can launch any number of applications. Let's say I launch my friend tracker. As it launches, I can see the pins that represent my friends floating around me. Each has a unique size and opacity based on the parameters of my stage and my preferences within the application itself, as well as those of my friends. I have a basic heads-up display, say at the top of my view, that shows how many friends are online and gives access to my in-app dropdown controls. If I receive a notification from one of my friends and click the icon to see it, the presented text box is restricted from a certain part of my visual space based on information passed to the application by the world sphere. So if the world sphere sees that I am moving, such as in a car, it won't allow certain things to happen; it may force the application to provide only audio of the text message.
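The audio-fallback decision at the end of that example could be sketched as a single check against context handed down from the world sphere. Again, `present_notification` and the `in_motion` flag are illustrative names of my own, not a real API.

```python
def present_notification(world_state: dict, text: str) -> str:
    """Decide how the friend-tracker app may present a notification,
    given context passed to it by the world sphere. The world sphere
    decides what is permitted; the app only obeys."""
    if world_state.get("in_motion"):
        # The user appears to be moving (e.g. in a car), so the
        # system forces an audio-only rendering of the message.
        return f"AUDIO: {text}"
    # Otherwise a text box may be drawn, still subject to the
    # Stage's opacity and placement rules.
    return f"TEXT_BOX: {text}"
```

The key design point the example illustrates is that the application never sees raw sensor data; it only receives the world sphere's verdict on what it is allowed to do.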

This takes us back to one of the first posts I made, where I said Creators/Streamers must be accountable for the consequences of AR/VR.
