NAB 2017: A Quest for Cameras & QC in VR
I didn't go to NAB. Which, depending on whether you're a hardware manufacturer, a VR guru, or an end client ready to plunk down a load of cash, might be a good or a bad thing.
I've been following developments, though, via YouTube Live, Facebook, and LinkedIn. VR is evolving at breakneck speed. From the NAB floor itself came many new products and announcements; of particular interest are the rivals Google and Facebook, coming out with cameras and workflows to create compelling cinematic VR.
Optical Flow Stitching: The Elephant in the VR Room
Take a look at the image above (click here for larger). It's every VR creator's dream come true: no stitch lines! On closer look, though...
But let's get back to that in a while. First:
Two exciting new stereoscopic 360 cameras were announced that stand out from the crowd. The first: the Zcam V1 Pro.
If you've been following my writings, you'll know that to me, video-based VR is not the QTVR of the 90s (vanilla 360 video). To qualify as video VR, the scene needs, at minimum, to be captured with the "depth channel" intact. The Zcam V1 Pro is just such a camera, and affordable too, at least compared to the Jaunt and the Nokia OZO (which, in my opinion, is not a VR camera, since it tapers off to a disconcerting monoscopic view as you start looking over your shoulder).
The reason I was excited is that the Zcam V1 Pro promises really good video VR in a compact form factor. Not being at NAB, I couldn't see it in person, but I've seen videos posted online, and that's where the video grabs in this article come from. However, I'm saddened, for these reasons:
- I refuse to believe that the cameras themselves are un-synced. I think they are synced, but the evidence in the image above shows that the output is un-synced.
- I prefer to believe that the optical-flow stitching process is to blame for the anomalies and the un-synced imagery.
Let's take a closer look.
- Look at the man's sandals (scroll up to the side-by-side image above). The difference between the left- and right-eye images is plain to see. Same goes for the woman behind him.
- We won't take the large billboard across the street into consideration, because that could just be the viewing angle or a polarizer (if the billboard has one in front of its LCD/LED display).
- However, in the scene, the cars in the background at times run at the wrong depth. That's temporal mis-sync.
- Look at the man's arm: that's optical-flow artifacting. What you gain by not having seams or stitch lines, you lose in a stereoscopic VR video with these artifacts. In a monoscopic video, you'd plainly use the left-eye view, where his arm is clean, as the master output.
In the image above, if you look at the man near the garbage can, you'll see another artifact of optical-flow stitching (click for larger): rubber-stamping. Those who remember the infamous Clash of the Titans 2D-to-3D movie conversion will get it.
The video from which these stills are taken exhibits mis-sync and artifacting throughout. Such video, when viewed in an HMD, is what could do physical harm to an audience that wouldn't know any better.
Find it hard to believe? See below...
The Need for Stereoscopic QC in VR Movies:
Put on anaglyph red-cyan 3D glasses and look at the two women behind the man near the garbage can. (larger image)
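If you want to run this kind of check on your own footage, a red-cyan anaglyph is trivial to composite from the left/right frames: red channel from the left eye, green and blue from the right. Here's a minimal sketch, using NumPy arrays as stand-ins for decoded frames (the toy frame data is mine, not from the footage):

```python
# Minimal red-cyan anaglyph composite for stereo QC.
# Toy data only; real use would load decoded video frames instead.
import numpy as np

def anaglyph(left_rgb, right_rgb):
    """Red channel from the left eye; green + blue from the right eye."""
    out = right_rgb.copy()
    out[..., 0] = left_rgb[..., 0]
    return out

# Toy 2x2 frames: the left eye is pure red, the right eye pure cyan.
left = np.zeros((2, 2, 3), dtype=np.uint8)
left[..., 0] = 255
right = np.zeros((2, 2, 3), dtype=np.uint8)
right[..., 1:] = 255

combined = anaglyph(left, right)
print(combined[0, 0])  # -> [255 255 255]
```

Viewed through red-cyan glasses, horizontal offsets between the two channels read as depth, so mis-sync and wrong-depth objects jump out immediately.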
Here's the conversation between your brain and eyes, many times per second:
Brain to Eyes: Could you refocus on those two women, please?
Eyes: Done.
Brain: Hmmm, that's not quite right, is it? Everything I know about the world tells me the woman with the backpack should be nearer to us than the woman walking behind her. Do us a favor... refocus.
Eyes: Yeah... Done.
Brain: Dammit... moving on...
Now imagine this conversation happening for different parts of a 360 VR scene, multiple times per scene, subconsciously. That's the beginnings of a migraine right there.
This happens when you have temporal mis-sync: cameras capturing any moderate action without genlock-level sync. Simply trying to get frame sync in post by slating or audio is not going to cut it. It was bad in 3D movies; it can certainly cause physical harm in a VR HMD.
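To put a number on why genlock-level sync matters, here's a toy calculation. It's my own illustration with assumed figures (not measured from any footage), using the pinhole stereo relation Z = f·B/d: a half-frame offset between the eyes turns ordinary horizontal motion into spurious disparity, and therefore a depth error.

```python
# Toy illustration (assumed numbers): a sub-frame sync offset between
# eyes converts horizontal image motion into a stereo depth error.

def perceived_depth(focal_px, baseline_m, disparity_px):
    """Pinhole stereo depth: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

focal_px = 1000.0       # focal length in pixels (assumed)
baseline_m = 0.065      # roughly interocular baseline, metres (assumed)
true_disparity = 20.0   # px, for an object a few metres away (assumed)

# An object crossing the frame at 300 px/s, with the eyes offset by
# half a frame at 30 fps -- i.e. no genlock-level sync.
pixel_velocity = 300.0        # px/s of horizontal image motion
temporal_offset = 0.5 / 30.0  # seconds of mis-sync between eyes

# The lagging eye samples the object at a shifted position, so the
# measured disparity gains a motion-dependent error term.
disparity_error = pixel_velocity * temporal_offset

z_true = perceived_depth(focal_px, baseline_m, true_disparity)
z_seen = perceived_depth(focal_px, baseline_m,
                         true_disparity + disparity_error)

print(f"true depth: {z_true:.2f} m, seen depth: {z_seen:.2f} m")
```

With these numbers, an object that should sit at 3.25 m reads as about 2.6 m, and because the error scales with motion, moving cars and walking legs betray the mis-sync first while static backgrounds look fine.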
However, I'm going to give the benefit of the doubt that this is not a camera mis-sync issue, but a shortcoming of the implementation of the open-source (Facebook's?) optical-flow algorithm.
I'm not basing the next suggestion on researched fact; I'm hypothesizing that the optical-flow algorithm takes deltas from up to one frame forward and/or backward to compute the current frame, which would work just fine in monoscopic 360 but shows these anomalies in stereo.
Why do I believe that? Because in other parts of another video, I saw background imagery (in the far distance) exhibit good sync: fountain sprinklers, and lights running down the sides of a building (a scene shot from a hotel terrace poolside).
** Update, Apr 29th, 2017: I spoke with Jason Zhang of ZCAM, and he confirms there is hardware, line-level sync, but optical flow in stereo can mess up, as the algorithms will look at one frame behind to smooth out stitch lines, etc. This confirms my hypothesis (in a way). The good news: the Zcam V1 Pro is a perfectly synced 360 camera.
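The one-frame-behind behavior can be sketched numerically. This is my own toy model with assumed numbers, not ZCAM's actual algorithm: if seam smoothing blends frames t and t−1 in one eye's stitch but not the other's, a moving object picks up spurious disparity even though the cameras themselves are perfectly synced.

```python
# Toy model (assumed numbers) of the temporal-blending hypothesis:
# averaging frames t and t-1 in one eye shifts a moving object's
# position in that eye, changing its disparity and thus its depth.

def position(t, velocity=300.0):
    """Horizontal image position (px) of an object moving at `velocity` px/s."""
    return velocity * t

fps = 30.0
t = 1.0  # current frame time, seconds

# Right eye: a clean sample at frame t.
x_right = position(t)

# Left eye: seam smoothing blends frames t and t-1 (the hypothesis).
x_left = 0.5 * (position(t) + position(t - 1.0 / fps))

disparity_shift = x_right - x_left  # px of spurious disparity
print(f"spurious disparity from blending: {disparity_shift:.1f} px")
```

For stationary backgrounds the blend is harmless (frames t and t−1 are identical there), which matches what I saw: distant fountains and building lights in good sync while moving subjects ran at the wrong depth.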
The Insta360 Pro: A Camera I'd Love to Have... But
The other big announcement this past week was the launch of a USD 3,500 stereoscopic 360 VR camera. In my opinion, it beats the Nokia OZO in price-for-quality.
The positives:
- It really does have full 360 stereoscopic 3D, unlike the OZO (which is the reason there's terrible, eye-hurting 3D 360 in the Oculus Gear VR videos section right now, but that's for another article).
- It's some of the sharpest 3D 360 video I've seen in a while.
- There's a JPEG still version of the scene posted in 8K, and it's the best, most lifelike stereo (lending real presence) I've seen.
Caveat:
Unlike with the Zcam V1 Pro, I can't say for certain that it's the optical-flow algorithms messing things up. I mean... I hope it is! Because if it is, there's still hope of using regular stitching and getting one of the best 3D 360 cameras for the price.
Take a look at the car in the image above; the wheel spokes say it all. The mis-sync shows even more on people's legs as they walk. I'm hoping it's the optical-flow implementation.
(When viewing the video) the train in the far background does exhibit temporal mis-sync and seems to run at the wrong depth, and people "clash" into each other as they walk past.
Again, I'm still hoping the culprit is the implementation of the open-source optical-flow algorithm, or an early version of the camera.
Food for Thought for All Stereo 360 Cameras: Matched Lenses
The more cameras on a rig, the more optically matched glass is needed:
Back when stereoscopic 3D filmmaking was all the rage (to be fair, there are still people shooting native stereo for films), there was the famous Angenieux Optimo series: matched pairs of lenses made from the same "run". This guaranteed no lens-distortion differences from one lens to the other.
Why is that Important?
Take a look, with anaglyph glasses, at the FIFA image above, from a review I did during the 2010 FIFA World Cup. Do you see the skew of the turf dipping in "depth" toward the left and right in the respective images?
That can be due either to stereo windows being employed (which, in this case, they shouldn't have been) or to mismatched lenses on the rig.
However, it might not be so bad on a multicam 360 rig, because there's less "glass" (fewer elements) to deal with in the lenses used on such a rig. Still, it would benefit rig designers and lens manufacturers to pay heed to this, to help optical-flow algorithms do their job and deliver stereo that's even across the scene.
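To illustrate why matched glass matters, here's a toy calculation (my own, with assumed numbers): even a 1% focal-length mismatch between the two eyes magnifies one image slightly, so the disparity of a perfectly flat plane drifts across the frame, which is exactly the kind of turf skew described above.

```python
# Toy model (assumed numbers): a 1% focal-length mismatch between eyes
# makes disparity on a flat, fronto-parallel plane vary with horizontal
# position, bowing the plane's apparent depth toward the frame edges.

FOCAL_LEFT = 1000.0      # px (assumed)
FOCAL_RIGHT = 1010.0     # px, a 1% mismatch (assumed)
CENTER_DISPARITY = 20.0  # px of true disparity at the image centre

def plane_disparity(x_px):
    """Measured disparity of a flat plane vs. horizontal position."""
    scale = FOCAL_RIGHT / FOCAL_LEFT   # the right eye's image is magnified
    x_left = x_px
    x_right = scale * x_px - CENTER_DISPARITY
    return x_left - x_right

for x in (-800.0, 0.0, 800.0):
    print(f"x = {x:6.0f} px  ->  disparity = {plane_disparity(x):.1f} px")
```

Disparity that should be a constant 20 px drifts from 28 px at the left edge to 12 px at the right, so a flat pitch reads as tilted in depth, and optical-flow stitchers inherit that skew as input error.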
A Call for Stereoscopic Quality Control in General:
Besides hardware/software issues, there really is a need for a professional stereographer, whether you're a production company, a manufacturer, or a VR software developer.
One simple (and by no means, exhaustive) reason: Take a look at the two images below and tell me which you prefer:
Chances are you'll prefer the depth-sweetened one. It's like spice you add for extra "presence". Ask in the comments below why you preferred it, and I'll tell you.
A qualified stereographer will bring this to your production, on a scene-by-scene basis...
...that, besides the above-mentioned QC, and more.
The names below might come at a premium; some might agree if you sweet-talk them. All are worth approaching for your next VR project:
- Demetri Portelli
- John A Rupkalvis
- Marcus Alexander
I'm sure I've forgotten a few other good professionals.
If you find your stereoscopic project needs fixing in post, I'd recommend getting in touch with Patrick AlManza.
Why do I write critique?
Why do I write public critique instead of sending an email privately, behind the scenes? There are a few reasons:
1) Integrity: Over the past years, I've found there are few who have integrity and give credit where it's due. From experience, they've taken (free) advice and not even acknowledged it. I've given advice to hardware and software manufacturers in return for nothing.
2) Public awareness: I certainly wouldn't want to put down my money only to find out I have to wait for fixes. The "move fast and break things" rule is fine, but not when it could cause physical harm to audiences, and to the medium. We've seen it happen with 3D...
3) I get noticed and get paid consultancy work: some of my clients include the governments of India, Abu Dhabi, and Singapore, Panasonic R&D Labs, the Philips High Tech Campus (Netherlands), McKinsey, Google MENA, and others.
4) VR publications need to do more in-depth investigative journalism.