Qantas Engineers Trial HoloLens 2 To Transform The Way They Learn
Only a few times in your career do you get to work on a project where, even as you're envisioning it, you know that if you can pull it off you won't just help a person or a customer, you'll probably change a whole industry.
In November 2018, I sat in a conference room over in Mascot at Qantas headquarters. We were going through learnings from some early PoCs on the Gen 1 HoloLens. As we met with various groups, from Engineering to Training to Group Capabilities, we realized that while the first HoloLens (announced in 2015 and launched in 2016) was good, it wasn't going to cut it for some of the really bold use cases.
Of course everyone wanted more immersion, though in fact all of the users felt they could get used to the limited field of view on the first device. But a few things were show-stopping compromises. The biggest was weight. Although the first device weighed just over 500 grams, the balance was off. For short engagements or trainings it would have been fine, but to really work without compromise, users needed to spend at least an hour and a half in the device to get done what they needed to do. Then there was gesture interaction. Rolling the HoloLens 1 out across an organisation as large and as diverse as Qantas would require significant effort just to train users how to interact with the environment. And while this could be designed around by making the user a spectator, that took away from the learning opportunity. The idea was to give Qantas employees the chance to learn by actually "doing it", and, as would become a sort of mantra as we built out the project, to give the Qantas capabilities team a way to "train unsafe things, safely".
Safety is Qantas' number 1 priority. The ball was in our court. Luckily for us, just after we determined there was enough value to move ahead even on a Generation 1 device, Microsoft announced HoloLens 2.
The device itself is a revolution. For those of you who haven't tried it yet: a much improved field of view (2.4x bigger, and oriented for greater usability rather than for watching media), and hand tracking that actually extends beyond your field of view, meaning you can interact with things without needing to position your head so they're in the middle of your view. And speaking of interaction: with eye tracking and hand tracking, we could now add direct manipulation into the learning modules. And as a person who has used the Gen 1 device every day for four years, the balance on the HoloLens 2 is revolutionary. No. Really.
We did a lot of work resulting in a device that, though it's only 13 grams lighter, is perfectly balanced on your head, allowing you to wear it easily and comfortably all day long. (Ah, the magic of engineering.)
The result: a world first. A fully digital, mixed-reality Ground Engine Run flight deck. Multiple users can work together in a 737-800 cockpit and go through the 45-step procedure to start up or shut down the engines. And when I say "go through", I mean you reach out and toggle a switch to cut an engine, you physically turn a virtual dial to "FLT", and you push a virtual throttle into position and get a really satisfying 360-degree audible engine rev that actually gives you goosebumps.
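To make the idea of a step-by-step interactive procedure concrete, here is a minimal sketch of how a trainer like this might track a learner's progress through an ordered sequence of cockpit actions. The step names and structure are purely hypothetical; the real 45-step Ground Engine Run procedure is Qantas' own content and almost certainly far richer than this.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Step:
    control: str  # cockpit control the learner must operate
    target: str   # state the control must be left in

# Hypothetical excerpt of a start-up sequence (illustrative names only).
PROCEDURE = [
    Step("battery_switch", "ON"),
    Step("ignition_select", "IGN L"),
    Step("engine_start_switch", "GRD"),
    Step("start_lever", "IDLE"),
]

class ProcedureTracker:
    """Advances through an ordered procedure as the learner operates
    virtual controls; out-of-order actions are recorded as errors
    and do not advance the sequence."""

    def __init__(self, steps):
        self.steps = steps
        self.index = 0
        self.errors = []

    def on_interaction(self, control, state):
        expected = self.steps[self.index]
        if (control, state) == (expected.control, expected.target):
            self.index += 1
        else:
            self.errors.append((control, state))

    @property
    def complete(self):
        return self.index == len(self.steps)
```

The appeal of this shape is that every toggle, dial turn, and throttle push becomes a single `on_interaction` event, so the same stream can drive both the simulation and the assessment of the learner.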
I got to witness the first time we put a flight engineer into the alpha build earlier this year, on a Preproduction Engineering Evaluation HoloLens 2. It's one thing to put the device on someone who appreciates the aesthetics. But to put someone who spends time in actual 737s into it, and watch how, without hesitation and without any instruction, they go straight into the procedure, and to see the grin spread across their face as the mixed-reality simulator responds almost identically to a real plane: that was a career highlight for me.
Below is a video with the team at Qantas and our partner Altoura talking about their work.
For those of you who watch the video and think: "Those are special effects." Nope. We shot this video with the team in the live product using Spectator View.
We still have so much to do, but even in these early days the solution holds so much promise. We are capturing data for each individual learner: not just what and who they interacted with, but what they looked at while carrying out the tasks. We've built a platform that lets engineers and pilots practise anytime, anywhere, even at home, with others, in real time across distance. And, I think more importantly, we're cutting down the 6-8 hours it normally takes strapped into a physical simulator, so not only do you spend less time on-site (allowing more people to get time), but when you do get that precious simulator time you're more productive with it.
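The per-learner gaze data mentioned above can be illustrated with a small sketch. Everything here is an assumption about what such telemetry might look like, not Qantas' or Altoura's actual schema: each record notes what a learner looked at during a procedure step, and an aggregate shows dwell time per cockpit element.

```python
import time

def gaze_event(learner_id, step_index, gaze_target, dwell_ms):
    """One hypothetical telemetry record: which cockpit element a
    learner looked at, and for how long, during a procedure step."""
    return {
        "learner": learner_id,
        "step": step_index,
        "gaze_target": gaze_target,
        "dwell_ms": dwell_ms,
        "ts": time.time(),
    }

def dwell_by_target(events):
    """Aggregate total gaze dwell time per cockpit element, e.g. to
    check whether learners look at the N1 gauges before advancing
    the throttle."""
    totals = {}
    for e in events:
        totals[e["gaze_target"]] = totals.get(e["gaze_target"], 0) + e["dwell_ms"]
    return totals
```

Even a simple aggregation like this hints at why gaze data matters for training: it reveals not just whether a step was completed, but whether the learner was scanning the right instruments while doing it.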
And for those of you who were wondering, "Is it possible to revolutionize how my learners engage with their complex training using holograms?"

The answer is: "Yes. Yes it is."