Virtual Reality or Virtual Insanity
Nirav Desai
CEO @ Moonbeam | Product Development, Product Leadership, Enterprise Solution Design
Originally published in Booz Allen Hamilton's futures newsletter Short Circuit in August/September 2018 -- https://bah-the-solutions-business.us.newsweaver.com/ShortCircuit
Driving home amid a smoky forest fire haze in Seattle this summer, I heard a classic song from 1996 on the radio. Acid-jazz, one-hit wonder Jamiroquai articulated a dystopian vision of technology and humanity with his famous song, “Virtual Insanity.”
“Future's made of virtual insanity
Now always seem to, be governed by this love we have
For useless, twisting, our new technology
Oh, now there is no sound
For we all live underground”
--Jamiroquai, 1996
As a technologist, I found that this artist's vision sparked reflection about the future of an important technology: virtual reality (VR). It has taken over 25 years for VR to progress from early, glowing, line-based, motion sickness-inducing systems into the stable, expressive, and engaging commercial platform that is popular today. Why did it take so long for the industry to mature? I see tremendous potential for VR and associated technologies, but will these technologies be good for humanity and lift us up, or will they pull us down to new lows? And are these advancements reality (true innovation) or insanity (hype)?
How My Career as a Consultant Helped Me Dive into Emerging Technology
Several years ago, I refocused my technology attention from business intelligence and big data to VR. Maps always inspired me. As a child, I spent hours looking through the atlas or tracing the interstate system to find the most interesting route from Houston to Chicago. When Google Earth was released in the early 2000s, I started seeing how I could present data on a globe. When two of my team members (a rockstar developer and a former submariner) raised the idea of visualizing undersea terrain data in VR, I knew that the results would be compelling.
Subsequently, my teammates (at Booz Allen Hamilton) and I worked together to create OceanLens, a VR tool that fuses terrestrial and underwater topography data into an interactive three-dimensional (3D) environment. With this tool, I was able to immerse myself in geospatial data below the ocean’s surface. The first active duty sailor to whom we demonstrated this product was amazed; he had traveled in and out of the San Diego Harbor many times but had not once seen the ocean’s topography. Through OceanLens, the sailor was able to gain an immersive experience similar to driving through the ocean’s underwater landscape. OceanLens demonstrates the ability to interact with 3D data in three dimensions -- an early application of a nascent wave of computing: immersive and spatial technology. Through immersive technology, the U.S. Navy can now create more precise, data-driven plans that increase sailor safety and national security, as well as save time and money on training.
Immersive and Spatial Technology Today and How It Got Here
The early commercial VR experiences were insane trips that brought us to surprising places. I could stand on the bow of a wrecked ship and be inches away from the tail of a whale swimming by (e.g., theBlu by WEVR). I could pretend to be a robot rediscovering ‘90s office culture (e.g., Job Simulator by Owlchemy Labs). I could see 3D art exhibits or visit a haunted house. Fun, frightening, and weird.
As the industry matured, applications grounded in reality and human learning began to emerge. Game studios like Free Range Games achieved success working with forklift manufacturer Raymond on a simulator to improve worker safety. Ford invested in 3D design of car interiors using VR. Microsoft and their partners pushed forward with enterprise and industrial augmented/mixed reality (AR/MR) to bring new products into the industry, including 3D maintenance procedure cards (Taqtile) and building design visualization (Studio 216). Booz Allen worked with the U.S. Department of Defense to improve the effectiveness of soldier training, so that paratroopers could evaluate a proposed jump without ever leaving the ground.
Advancements in VR and AR have progressed in parallel with a proliferation of data and immersive devices, which in turn accumulated even more data to mine and analyze. Hence, VR/AR can now benefit from artificial intelligence (AI), making it possible for these tools to learn and infer from their users’ activities. As an illustration of this confluence, imagine that we are strolling through Pike Place Market in Seattle, and while looking at a fishmonger, I say “that pitcher has got an arm.” Based on location-relevant knowledge, the direction of my gaze, language context, and hand gestures, you would likely infer that I am talking about a fishmonger throwing salmon and not a baseball game. This is an intelligent inference. For a computer to piece this context together, it must process my voice into language and parse the language. The computer must then use computer vision (an AI technique to extract information from images) and object identification to contextualize the fish and the fishmonger. The system must also know my location and build a spatial map to see that we are in Pike Place Market, where fish throwing is a tourist attraction. To emulate the human mind, the computer must understand the difference between literal and figurative language. With all this data, the computer must reason probabilistically to infer and evaluate the meaning of my statement. Excitingly, we are not too far from this becoming a commonplace reality! This is immersive computing in a spatial context.
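The fusion of cues described above -- location, gaze, language -- can be sketched as a simple probabilistic model. The following toy Python example is purely illustrative: the cue names, hypotheses, and probability values are invented for this article, not drawn from any real system, but the structure (multiply a prior by the likelihood of each observed cue, then normalize) is the basic naive-Bayes pattern such a system might use.

```python
# Toy sketch: disambiguating "that pitcher has got an arm" by fusing
# multimodal cues. All probability values here are invented for illustration.

def infer_topic(cues, likelihoods, prior):
    """Score each hypothesis as prior * product of cue likelihoods,
    then normalize so the scores form a posterior distribution."""
    scores = {}
    for hypothesis, p in prior.items():
        score = p
        for cue in cues:
            # Small default likelihood for cues a hypothesis does not list.
            score *= likelihoods[hypothesis].get(cue, 0.01)
        scores[hypothesis] = score
    total = sum(scores.values())
    return {h: s / total for h, s in scores.items()}

# Observed cues: spatial map (location), computer vision (gaze target),
# parsed language, and gesture tracking.
cues = ["location:pike_place_market", "gaze:fishmonger",
        "word:pitcher", "gesture:point"]

likelihoods = {
    "fish_throwing": {
        "location:pike_place_market": 0.9,
        "gaze:fishmonger": 0.9,
        "word:pitcher": 0.3,   # figurative use of "pitcher"
        "gesture:point": 0.5,
    },
    "baseball_game": {
        "location:pike_place_market": 0.05,
        "gaze:fishmonger": 0.02,
        "word:pitcher": 0.9,   # literal use of "pitcher"
        "gesture:point": 0.5,
    },
}
prior = {"fish_throwing": 0.5, "baseball_game": 0.5}

posterior = infer_topic(cues, likelihoods, prior)
print(max(posterior, key=posterior.get))  # prints "fish_throwing"
```

Even though the word "pitcher" alone favors baseball, the location and gaze cues dominate, so the fishmonger interpretation wins -- the same weighing of context that a human does effortlessly.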
Use cases of immersive technology that help people get work done are particularly intriguing. Immersive technology brings forth insight that helps users make decisions in ways that are natural, intuitive, and relevant to the context in which they are working. Various sensors work together to bring in information about the world, process data, and present users with context-relevant possibilities. We will not be tied to a desk or phone to compute; instead, we will speak, gesture, or mentally indicate intention using brain-computer interfaces. These interfaces include devices that read brain activity and leverage this information to allow us to interact with machines. Real and insane.
Are These Advancements Good or Bad?
Immersive technology can enable human interaction, creativity, and data-driven decision making in innumerable ways. Let me map this out with a few examples:
Training: Immersive technology allows several senses of perception to be trained at once. When the U.S. Army was fielding new counter-drone equipment to forward-deployed soldiers, Booz Allen used this opportunity to train these soldiers on the new equipment in an immersive way. Immersive training simulates scenarios that may not be efficient or practical to train for in real life due to considerations of worker safety, environmental impact, or simply resource availability. (For more, see: Booz Allen’s Counter Drone Trainer)
Informed Interactions: A doctor reviews a patient’s chart before meeting with the patient. Imagine the improvement in medical care if a 3D hologram of the patient’s medical images is available while the doctor describes treatment options or surgery details. Booz Allen has a Holographic Medical Imaging service offering that previews this future technology.
Telepresence: Have you ever had to fly from Seattle to Hawaii for a 1-hour meeting and then turn around even before processing the time change? I have, and it’s not fun! Immersive telepresence helps people collaborate across geographies in a natural, interactive, and meaningful way by allowing participants to interact with each other as if they were in the same room. Telepresence and collaboration solutions like Doghead Simulations’ Rumii and Valorem’s Holobeam take different approaches to productivity across geographies; however, they create a shared virtual workspace that users and their tools can access. Doghead accomplishes this through a VR meeting room with productivity tools; and Valorem leverages connected devices such as Azure Kinect in Holobeam, holographically projecting a person and anything that they are holding to any participant wearing a HoloLens headset. As a result, a user can speak to and interact with a remote collaborator in a way similar to a scene from Star Wars.
Sensory Extension: Brain-computer interfaces (BCI) such as Emotiv’s EPOC can identify brain activity and diagnose intention, emotional state, and discrete direction. The Freie Universität in Berlin even leveraged BCI to help a muscularly impaired individual drive a car using brain waves. Integrating brain activity to train AI systems for critical activities offers opportunities to augment natural senses or to help people with physical impairments regain function. Working with the University of Washington, Booz Allen initiated development of an analytics platform using BCI to determine whether VR travel can have the same benefits as physical travel and offer the potential to delay the onset of Alzheimer’s Disease.
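At its simplest, "diagnosing intention" from brain activity means extracting a signal feature and mapping it to a discrete command. The sketch below is a deliberately minimal illustration of that idea -- it is not any vendor's API, and the sample rate, frequency band, and threshold are invented for the example. It measures power in the 8-12 Hz (alpha) band of a single EEG-like channel and maps it to one of two commands.

```python
# Minimal illustration of a BCI-style pipeline: signal -> band power -> command.
# Not a real device API; all parameters here are invented for the example.
import math

def band_power(samples, sample_rate, low_hz, high_hz):
    """Naive DFT: sum the spectral power of bins inside [low_hz, high_hz)."""
    n = len(samples)
    power = 0.0
    for k in range(n // 2):
        freq = k * sample_rate / n
        if low_hz <= freq < high_hz:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def classify_intent(samples, sample_rate, threshold):
    """Strong 8-12 Hz (alpha) activity -> 'relax'; otherwise -> 'focus'."""
    return "relax" if band_power(samples, sample_rate, 8, 12) > threshold else "focus"

# Synthetic one-second recording dominated by a 10 Hz component.
rate = 128
signal = [math.sin(2 * math.pi * 10 * t / rate) for t in range(rate)]
print(classify_intent(signal, rate, threshold=1.0))  # prints "relax"
```

Real systems use many channels, careful filtering, and trained classifiers rather than a hand-set threshold, but the shape of the pipeline -- sense, extract a feature, map to intent -- is the same.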
How “Virtual” and “Artificial” Can Make a Better Future Reality
As I arrived at my home to meet my wife and young daughter, Jamiroquai’s voice came to an end. His song was a warning of what may happen if we develop technology without soul or service. I believe that the relationship between humans and machines can be sung to a different tune. What we build is only as good as the intentions and ethics of the humans who build and use it. Can technology be used to enable that which makes us human, and not subvert it? Will it make the world better for my daughter? I thought of this as the song faded away.
Looking at these applications -- existing and proposed -- I see that AI is what makes spatial and immersive computing work, analogous to how wireless technology and miniaturization made mobile computing work. If used appropriately, this technology could help us make more informed decisions, spur creativity, and promote the attributes that make us human, such as empathy and understanding. These characteristics are what excite me about the next wave of computing -- the ability to use technology to intuitively, naturally, and pervasively enhance our lives.