Taking 400 School Students to the Moon!
In 2016, Peter Koch and I embarked on a VR project that was not only exciting but incredibly rewarding. Over the course of just a few weeks, we dedicated our evenings and weekends to bringing an ambitious idea to life: creating an immersive VR experience that would take 400 students from his kid's Primary School on a virtual journey to the moon, recreating the historic Apollo 11 mission.
I led the creative direction, while Peter took charge of the software engineering. Our lunchtime brainstorming sessions led to productive evenings where we rapidly iterated on the concept. With the help of Martin Brown from Samsung Australia, I was able to secure 15 Samsung Gear VR headsets, which allowed us to run the experience in batches and ensure all 400 students could experience the moon landing.
One of the ideas that Peter and I were both passionate about was including John F. Kennedy’s famous “We choose to go to the moon” speech. We went back and forth on how to incorporate it since JFK never lived to see the mission's success. In the end, we placed it within a drive-in theatre setting as part of a historic news broadcast, which gave the students a feel of what life was like at the time, and set the scene for the virtual moon journey.
The first sequence begins with the Saturn V rocket on the launchpad. I designed the perspective from the base of the rocket, giving students a powerful sense of its sheer scale. Real-world dimensions were crucial here, complete with custom particle effects and sound synced to the countdown, to recreate the excitement of space exploration. The attention to detail was something I was particularly proud of—ensuring that students felt the awe of standing at the base of the rocket before it launched.
I created a series of episodic moments, one of which was designed to 'fly' the students above Cape Canaveral. The stereoscopic VR sequence allowed them to explore and view the scene from all angles. Our goal was to avoid motion sickness, which meant focusing on slow, smooth movements. I built the 3D environment from Google Maps satellite imagery of Cape Canaveral, which I photo-textured for accuracy and realism.
I was initially working on an accurate model of the Lunar Rover, which only seated two astronauts. After discussing it with Peter, I modified it to fit 15 participants, including their teacher, seated in front. Peter networked all the headsets so students would see each other as fellow astronauts, enhancing the collaborative and social aspect of the experience. This setup created a cohesive group journey, with students interacting and pointing out things to each other during the moon landing simulation. Peter and I even experimented with a method where we could take each student's photo and map it onto a 3D plane inside the astronaut helmet, creating the effect that their face was inside. While we managed to get it working, the lack of animated facial expressions made it look a bit unnatural and fell into the 'uncanny valley.' Ultimately, we decided to abandon that idea and opted for a mirrored astronaut visor instead, which felt more seamless and added a cool, reflective look.
We recreated the moment Neil Armstrong stepped onto the moon by making every student feel like they were the chosen astronaut descending the ladder. As they looked back, they would see their friends still seated in the rover, but they experienced the iconic moment as though it were just for them. Many students later commented on how incredible it was to feel like they were stepping onto the moon themselves.
This project reinforced my belief in the potential of collaborative VR for education and other fields. You can read more about the project in the Manly Daily press release.
The Manly Daily press release captured the essence of the project perfectly, with Peter stating, “It feels like you are really there,” which was exactly the effect we wanted to achieve. Although there were technical challenges, like Peter managing the battery life of each device, the project was a resounding success. We created something that was more than just a classroom activity; it was a shared, collaborative VR experience that engaged and inspired students in a new and meaningful way.
I truly believe that this is the future of mixed reality. If two or more people can see and interact with the same thing at exactly the same time, whether they are seated next to one another or in different parts of the world, that connection is profound. It can lead to better social interactions, education, and industry applications. This level of interaction should be the foundation of any mixed reality experience.
3. We placed chairs for the students in a specific arrangement and entered that arrangement into the server. At the start of the day we simply mapped a GearVR to each seat; re-assigning was just a press on the server console. When students looked around they could "see" the other kids sitting around them, and as they put on their headsets they would appear in the experience like magic. This led to a spectator camera for bystanders who wanted to watch: we built a desktop version of the app that could render from a series of virtual cameras placed in the scene or, amazingly simply, show the point of view of any person in the experience, so you could feel their excitement as they looked all around. The spectator app eventually gained an algorithm to automatically switch between cameras and to display information such as which chapter was being played. Attached is what the server looked like; the seat assignments are in the middle as BLUE (active), RED (ping time exceeded) or GREY (unassigned). I could write a whole post on the features that kept the students safe, such as monitoring the FPS: if the frame rate dropped low enough to risk motion sickness, we would blank the screen!
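The seat-status logic described above can be sketched roughly as follows. This is an illustrative model only, not the actual server code; the class name, the three-second ping timeout, and the method names are all assumptions made for the example.

```python
# Hypothetical sketch of the seat-assignment status logic: a seat is GREY
# until a GearVR is mapped to it, BLUE while telemetry keeps arriving, and
# RED once the ping time is exceeded. The timeout value is an assumption.

PING_TIMEOUT = 3.0  # seconds without telemetry before a seat turns RED

class Seat:
    def __init__(self, seat_id):
        self.seat_id = seat_id
        self.device_id = None   # GearVR currently mapped to this chair
        self.last_ping = None   # timestamp of the last telemetry packet

    def assign(self, device_id, now):
        """Map a headset to this seat (one press on the server console)."""
        self.device_id = device_id
        self.last_ping = now

    def heartbeat(self, now):
        """Record that telemetry arrived from the assigned headset."""
        self.last_ping = now

    def status(self, now):
        if self.device_id is None:
            return "GREY"   # unassigned
        if now - self.last_ping > PING_TIMEOUT:
            return "RED"    # ping time exceeded
        return "BLUE"       # active
```

A console UI like the one described would simply poll `status()` for every seat each refresh and colour the grid accordingly.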
2. Initially I wrote a WiFi listener into the app so that I could send a broadcast on the network to start the experience on all devices. But then I wanted to know that each device had actually started, so I added a back-channel that sent back the app and device status (including battery level), which I displayed on a master control macOS app. From there it was a logical next step to add finer-grained controls: Unity scenes as chapters, with the ability to choose which chapters to play (the idea being shorter sequences if we were running out of time). The protocol was super efficient: the client apps sent telemetry to the server over UDP (with a TCP back-channel), and the server coordinated everything and replied over broadcast UDP. Pretty soon it became obvious that the same protocol could make this multiplayer: sensor data (3-DOF head pose plus input buttons) could animate the avatars, with each app sending its data to the server, which relayed it to all apps in a single broadcast. In theory we could update 100+ clients on one WiFi node in real time. That led to a co-located multiplayer experience with a very simple solution...
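To see why a single broadcast can carry 100+ avatars per frame, it helps to look at how small a per-client telemetry packet can be. The byte layout below is an assumption for illustration (the post does not document the real format): a device id, battery level, button bitmask, and a quaternion for the 3-DOF head pose.

```python
import struct

# Illustrative telemetry packet: the real protocol's byte layout is not
# described in the post, so every field choice here is an assumption.
# device_id (uint16), battery % (uint8), button bitmask (uint8),
# head orientation as a quaternion (4 x float32) = 20 bytes per client.
PACKET_FMT = "<HBB4f"

def pack_telemetry(device_id, battery, buttons, quat):
    """Serialise one client's state into a compact binary packet."""
    return struct.pack(PACKET_FMT, device_id, battery, buttons, *quat)

def unpack_telemetry(data):
    """Decode a packet back into (device_id, battery, buttons, quaternion)."""
    device_id, battery, buttons, qx, qy, qz, qw = struct.unpack(PACKET_FMT, data)
    return device_id, battery, buttons, (qx, qy, qz, qw)
```

At 20 bytes per client, even 100 clients fit in a single 2 KB datagram, well under a typical WiFi MTU, which is what makes the one-broadcast-per-frame relay plausible.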
1. It's a rare project where all the stars align at the right time to make such a memorable day for everyone involved. Even the backstory is kind of crazy... As a superfan of spatial computing as an educational tool, I was blown away by the Apollo 11 VR experience by ENGAGE XR, so much so that I went to visit Meredith Tomkins, the principal of my kids' primary school, and asked her if I could run that VR experience for my son's class. I expected a positive response, but what she said was along the lines of: "I can't let you do that, because it would be so unfair to the rest of the students." Being an enterprising software engineer, I said "leave it with me and I'll come up with something"... The problem with Apollo 11 VR was that it required a beefy PC and hard-to-get Oculus headsets, but the second-generation Samsung GearVR was much more accessible. I realized that if I could get enough of those, we could run multiple groups of students and do the whole school. We'd have to build our own experience in Unity and target the lower-spec GearVR. My initial plan was not multiplayer at all, but simply to synchronize the experience starting on each headset at the same time, so all students would be seeing the same thing simultaneously...
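That initial synchronized-start idea, one broadcast datagram that every listening headset treats as its cue, can be sketched in a few lines. The port number and message text below are invented for the example; the sketch sends over loopback so it is runnable anywhere, whereas the classroom version would target the subnet broadcast address.

```python
import socket

# Minimal sketch of the "start broadcast" concept: the control app sends one
# UDP datagram, and every headset app listening on the port starts at once.
# CONTROL_PORT and the message text are assumptions for illustration.
CONTROL_PORT = 50007

def wait_for_start(sock):
    """Headset side: block until a start datagram arrives, return its payload."""
    data, _addr = sock.recvfrom(1024)
    return data.decode()

def send_start(message="START chapter=1"):
    """Control-app side: fire one datagram to trigger all listeners."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    # A real deployment would send to the subnet broadcast address
    # (e.g. 192.168.1.255); loopback keeps this sketch self-contained.
    s.sendto(message.encode(), ("127.0.0.1", CONTROL_PORT))
    s.close()
```

Because UDP broadcast is one-to-many, the sender cost is constant no matter how many headsets join, which is exactly what made the later telemetry relay scale so well.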
What a fantastic memory Paul Kouppas and Peter Koch. I still have the framed photo of this event on my desk. I fully agree that VR/mixed reality should be an integral part of any education system.