Augmented Reality happens right before your eyes.
Seong-Jik Kim
UX designer (Immersive Experiences; MFA, New Media), Director of Post Production at Korean Broadcasting System
Recently, I posted a short VR/AR video series on my LinkedIn channel. The first clip maps two 3D animated scenes and a video onto my own business card using simple AR technology. The second shows an interesting touchable interaction between a #VirtualWorld and a #physicalObject, and the third and fourth are live-broadcast AR demos I directed four to six years ago at Korean Broadcasting System. Above all, the AR business card demo drew attention from over 11K LinkedIn users and has recorded 255 likes so far, a number that keeps climbing little by little. I made it just for fun while doing ideation for a company project; I didn't expect such an interesting response when I posted it on LinkedIn. Here is my compilation of the series.
As you know, #augmented #reality transcends time and space. Onto the still imagery I mapped two animated #3Dobjects, an earth and a circus scene from a library, and a #2Dvideo, footage of my drive from downtown Oakland to San Mateo recorded about a year ago. Both 3D objects were anchored to my current employer's logo at slightly different angles, and the video footage was mapped onto the whole rectangular outline of the business card. What do you think about this for potential use cases? It is a very simple, elementary-level #AR #prototype demoing the transformation between 2D and 3D, as well as fairly rough #computerVision technology. #TargetImage-based AR is not that hard, but it looks pretty cool, right? In my case, a 3D perspective driving video seems to open up out of a 2D business card, a magical business card that falls into wonderland. This kind of #Media #transformation is not that hard and has already arrived; however, making it more refined and far more usable is still quite challenging.
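To give a rough sense of the #computerVision behind target-image-based AR, here is a minimal sketch, assuming OpenCV and a reference photo of the business card (the file names and match counts are my own illustrative choices, not the actual tool I used): it finds the card in a camera frame by matching local features and estimating a homography, which is the anchor onto which the 3D objects or the driving video would be projected.

```python
# Minimal sketch of target-image detection for AR (illustrative only).
# Assumes OpenCV (cv2) and NumPy; "businesscard.jpg" / "camera_frame.jpg"
# are hypothetical file names standing in for the target and a live frame.
import cv2
import numpy as np

target = cv2.imread("businesscard.jpg", cv2.IMREAD_GRAYSCALE)   # reference image target
frame = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)    # one camera frame

# 1. Detect and describe local features in both images.
orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(target, None)
kp_f, des_f = orb.detectAndCompute(frame, None)

# 2. Match features between the target and the frame.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_t, des_f), key=lambda m: m.distance)

# 3. Estimate the homography that maps the flat card into the frame.
src = np.float32([kp_t[m.queryIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
dst = np.float32([kp_f[m.trainIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# 4. Project the card's outline into the frame; the virtual content (the 3D
#    scenes or the driving video) would be rendered inside this quad.
h, w = target.shape
corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
print(cv2.perspectiveTransform(corners, H))
```

Production AR SDKs do the same job per frame, at full speed and with proper pose estimation, but the core idea is just this: find the target, then lock the virtual content to it.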
Here is a somewhat magical #mixedReality real-time production (not a post-production) that I directed at Korean Broadcasting System as a VR/AR director. The video quality and design look crappy since it was made six years ago, but you can watch some pretty interesting transitions from #VirtualReality into genuine space by #interacting with physical objects inside virtual spaces.
In the YouTube video above, how could the journalists
- pick up a piece of pizza from a virtual fridge? (see 1:18~)
- pick up an umbrella that was absent from the scene? (see 3:24~)
- get on the treadmill and walk? (see 4:14~)
There were some tips and tricks involved, but those are kinda confidential :)
Do you want to see a much nicer-looking #AR broadcast scene? Here is an augmented reality News Briefing scene.
Is this much more compelling than my previous VR trick? This was also produced five years ago. My team used a #realtime #tracking #crane and #VizArtist. The 3D modeling was done in #maya, then exported to the #VRML format after baking materials and textures, and Viz Artist imported the scene files. I collaborated with a highly skilled 3D modeler; it's kind of a #LevelDesign workflow for a high-quality look. After that, I built the animation sequences in Viz Artist and edited the cues for live broadcasting. The scene plots and crane camera movements were directed by me :) Those were my parts of this video clip: I was responsible for directing the whole workflow, the real-time production and animation, and the technical system management. Though it was presented in Korean, I think you can understand what the main anchor is explaining because it is a sort of #visualstorytelling. Enjoy this five-year-old AR video made for live broadcasting :) In this extra clip (https://vimeo.com/101600133), you can see a mustached me using an iPad to play the pre-cued AR animated scenes.
What was happening there four years ago?
The left frame is the real news studio view at Korean Broadcasting System, and the right is the real-time AR composition recorded simultaneously. I was using an #iPad to change scenes and trigger the pre-cued animation sequences. A Full HD camera was mounted at the end of an arm almost 20 feet long, and tracking data was exported from the #Shotoku crane to #VizEngine, a real-time (no offline rendering) 3D visualization system (though not a game engine like #Unity3D or #Unreal).
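I can't share the actual Shotoku-to-Viz Engine link, which is proprietary, but the underlying idea is simple: every frame, the crane head reports where the camera is and where it is pointing, and the renderer applies the same pose to a software camera. Here is a minimal sketch under that assumption; the pan/tilt values, crane position, and function names are hypothetical stand-ins for the tracking stream, not the broadcast system's API.

```python
# Minimal sketch of camera tracking for real-time AR (illustrative only).
# Pan/tilt/position per frame are hypothetical values standing in for the
# data a tracked crane head would deliver to the real-time renderer.
import numpy as np

def rotation(pan_deg: float, tilt_deg: float) -> np.ndarray:
    """Rotation of the camera head: pan about the vertical (Y) axis,
    then tilt about the horizontal (X) axis."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    pan = np.array([[np.cos(p), 0, np.sin(p)],
                    [0, 1, 0],
                    [-np.sin(p), 0, np.cos(p)]])
    tilt = np.array([[1, 0, 0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t), np.cos(t)]])
    return pan @ tilt

def view_matrix(position: np.ndarray, pan_deg: float, tilt_deg: float) -> np.ndarray:
    """4x4 world-to-camera matrix: the virtual camera copies the real head."""
    R = rotation(pan_deg, tilt_deg)
    V = np.eye(4)
    V[:3, :3] = R.T                 # inverse rotation (for rotations, R^-1 = R^T)
    V[:3, 3] = -R.T @ position      # inverse translation
    return V

# One hypothetical tracking sample: crane position in metres, pan/tilt in degrees.
crane_position = np.array([2.0, 3.5, -4.0])
V = view_matrix(crane_position, pan_deg=15.0, tilt_deg=-5.0)

# A virtual object placed at a fixed studio position lands in camera space here;
# zoom data would additionally set the matching field of view for the projection.
studio_point = np.array([0.0, 1.0, 0.0, 1.0])
print(V @ studio_point)
```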
Is this augmented reality?
My answer is both YES and NO. The reason for YES is that the augmented visualization was made on the production side; that is, the people in the studio could physically experience the magical AR feeling shown in the left frame of the video below. The reason for NO is that the real audience of the live news was just watching the right frame, as if they were watching an ordinary TV screen; in other words, there was no transformation between the production side and the viewer side. Do you see what I mean? If you recall my business card AR demo, the difference becomes much clearer.
“The real magic of AR happens to people who identify both the physical and the simulated world at the same time.”
Thanks for reading my writing and watching my work. I am going to wrap up this article with my own passion and interests. As you can tell from the content above, I have produced and directed VR/MR/AR video and real-time composition. Technically, my earlier experience is a little different from current game-engine-based VR/AR/MR work, because the VizVirtualSet, VizEngine, and VizArtist systems are not built on that technical base; they are super-high-cost systems, roughly 500 to 700K dollars including the crane or robotic pedestals. However, the principle of immersive visualization is almost the same: a software camera sits inside the simulated graphic scene and is synchronized with the movement of the real camera, which is called tracking in both kinds of system. Additionally, for much more precise tracking, calibration between the virtual world and the real world is necessary. As GPU and CPU performance has sped up, this calibration has become faster and much easier; Unity3D and Unreal Engine are now almost plug-and-play, even when used for mixed reality in a green-screen studio. Furthermore, the cost of production has decreased exponentially. My point is that, at the level of principle, current immersive technology is not such an emerging thing. I have made such videos since 2009, and Korean Broadcasting System had used VR/AR systems even before the first smartphone (Steve Jobs' iPhone). So most of today's mobile AR/MR/VR demos or prototypes are concepts that were already attempted almost a decade ago.
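To make the shared principle a little more concrete, here is a minimal sketch of the calibration step mentioned above, assuming it can be modeled as one fixed rigid offset between the tracking sensor's reference point and the camera's optical center (the offset numbers and function names are my own illustrative choices, not Viz or Unity API): the tracked pose is composed with that calibrated transform before it drives the virtual camera, so the virtual and real spaces line up.

```python
# Minimal sketch of virtual/real calibration (illustrative only).
# Assumption: calibration is a single fixed rigid transform between the
# tracking sensor's frame and the camera's optical center, found once
# (e.g. by lining up reference markers in the studio) and then applied
# to every tracked pose before it drives the virtual camera.
import numpy as np

def rigid_transform(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Pose reported by the tracking system for the current frame (hypothetical).
tracked_pose = rigid_transform(np.eye(3), np.array([2.0, 3.5, -4.0]))

# Calibrated sensor-to-camera offset (hypothetical numbers): the optical
# center sits a few centimetres away from the point the sensor reports.
sensor_to_camera = rigid_transform(np.eye(3), np.array([0.03, -0.12, 0.05]))

# The virtual camera uses the composed pose, so the graphics stay locked to
# the physical studio; without this step the augmentation slides or drifts.
virtual_camera_pose = tracked_pose @ sensor_to_camera
print(virtual_camera_pose)
```

Whether the renderer is Viz Engine on a tracked crane or a game engine on a phone, this compose-then-render step is the same idea; only the speed and the price tag have changed.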
As for my personal goal and vision, I am seeking to change track from slightly older workflows and use cases to newer techniques and tools. My AR business card is one of those trials, and I am working on a personal project for an interactive VR experience with Unity3D. As I mentioned above, real augmented reality has to happen right before me, and the same magical moment should happen right before you as well. The authentic visual transformation happens just before our eyes, because it truly transcends time and space. How about joining my exploration?