It is time for HDR! Can’t wait anymore!
By Kennet Eriksson, Björn Isakson and Kojo Mihic
The industry has promoted it for years, but the market has not been as enthusiastic, and consumers have so far not really understood its benefits – until now. Will the third episode of season eight of Game of Thrones be the actual awakening of HDR?
The entire episode takes place at night, in a very dim and foggy setting, and people all over the globe complained about the picture quality. In summary, the audience found it hard to grasp what actually happened and to make sense of the visually muddy action. Some viewers even compared the stream to a bad GIF animation. This is one of the most significant examples so far of where HDR would have done a fantastic production the justice it deserves.
Why a high-end production may look so bad on your TV set
Game of Thrones is a high-end, high-dynamic-range production, and probably HBO's single most important series in many markets.
We are fascinated by the behind-the-scenes clips showing the production team and all the challenges of creating every scene. We gladly read about an episode that took 55 nights on location to shoot, realizing that every actor, and there are quite a few, must get dressed, get makeup, get food and everything else actors need during their working days. All the special effects. All the logistics it takes. It is a stunning production!
Technically, the dynamic range of today's cameras makes it possible to do color grades that are extremely dark but still full of detail. Not just the story, but the entire production is very well done. A wonderful job by super-talented people!
Game of Thrones is a true HDR production, using the latest techniques to bring the fantasy to life. With this in mind, we all sit there in front of our brand-new TV sets to take advantage of all these cool new features, super excited to be blown away again.
And then finally when the show begins…
Blocks, banding, faces blurred out, fires bleeding into the whites, and so on. What happened? Could it have been saved in any way?
Streaming quality
In the Nordics, where we're located, it's not possible to watch Game of Thrones in 4K HDR, and that seems to be the case for the rest of the world as well. It's not possible to find it in 4K standard dynamic range either. On the other hand, in the streaming world a larger resolution is no guarantee of higher quality – rather the opposite. Even in HD resolution, the world seems to have had a poor viewing experience of the third episode of the eighth season of Game of Thrones.
What we see clearly is that the streams are full of banding and coding artifacts – sometimes so pronounced that people thought they were watching a GIF.
Why is that?
Let us explain the mumbo-jumbo in a few key points that make this stream a technical failure.
Bit depth
Ever since we started with modern video streaming we have used a bit depth of 8-bit, and that has worked out quite well – far from perfect, but at least good enough. What has happened lately is that productions have taken the next step, with Game of Thrones as a very good example. The production is not made to be viewed in an 8-bit environment: there are not enough color shades to present details in dark scenes, for example. And this gets worse at larger resolutions, since a larger resolution means more pixels while 8-bit still has the same limited number of color shades. The artifact 8-bit produces is called banding, and it is basic knowledge when working with video.
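To see how quickly 8-bit runs out of shades in dark footage, here is a minimal sketch (with hypothetical values) that quantizes a subtle shadow gradient, covering only the bottom 2% of the signal range, at both bit depths:

```python
# A subtle dark gradient covering only the bottom 2% of the signal range,
# a stand-in for a dim, foggy night scene (hypothetical values).
width = 1920
gradient = [i * 0.02 / (width - 1) for i in range(width)]

# Quantize to 8-bit (256 levels) and 10-bit (1024 levels) per channel
# and count how many distinct shades survive in this dark region.
shades_8bit = {round(v * 255) for v in gradient}
shades_10bit = {round(v * 1023) for v in gradient}

print(len(shades_8bit))   # 6 distinct steps -> clearly visible bands
print(len(shades_10bit))  # 21 distinct steps -> a far smoother ramp
```

Six coarse steps across a whole dark sky is exactly the staircase pattern viewers reported.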
So, bit depth is very important nowadays, when productions are mastered in high dynamic range at large resolutions. But even if the end-user stream is not HDR, it will still look much better if the bit depth is higher than 8-bit. With true HDR you should not get any banding, thanks to the perceptual quantizer (PQ) transfer function, which optimizes how the color shades are distributed. The HDR specification BT.2100 also requires a bit depth of 10 or 12 bits.
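As a rough illustration of why PQ helps in the shadows, here is a sketch of the SMPTE ST 2084 PQ inverse EOTF referenced by BT.2100. The point to notice is how large a share of the signal range it dedicates to low luminance:

```python
# SMPTE ST 2084 (PQ) constants, expressed as the spec's rational fractions.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(luminance_fraction: float) -> float:
    """Linear light (0..1, where 1.0 = 10,000 nits) -> PQ signal (0..1)."""
    y = luminance_fraction ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# Roughly half of all PQ code values are spent on 0-100 nits,
# which is why shadow detail survives far better than in 8-bit SDR.
print(round(pq_encode(100 / 10000), 2))  # ~0.51
```

In other words, a 10-bit PQ signal devotes around 500 code values to the darkest 1% of peak luminance, exactly where this episode lives.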
One problem is that productions have already taken the step to higher bit depths and higher dynamic range, but the distribution chain is obviously not there yet.
There is also another quite big problem. The most widely used video codec for streaming, H.264, can store 10-bit video, but it is extremely rare to find H.264 decoder implementations with 10-bit decoding support. You can try this at home: take a 10-bit H.264 video and play it back on your TV set. It will most likely not work. Not even the QuickTime player on your computer will play the file without complaining or re-transcoding.
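One quick way to see what bit depth a file actually carries is to look at the pixel format name that inspection tools such as ffprobe report (for example `yuv420p` versus `yuv420p10le`). As a sketch, here is a small helper that guesses the per-channel bit depth from such a name; it is an assumption that only covers common planar YUV naming, not every format ffmpeg knows:

```python
import re

def pix_fmt_bit_depth(pix_fmt: str) -> int:
    """Guess per-channel bit depth from an ffmpeg-style planar pixel
    format name, e.g. 'yuv420p' (8-bit) or 'yuv420p10le' (10-bit).
    Simplified sketch: only handles names with a trailing depth suffix."""
    match = re.search(r"p(\d+)(?:le|be)$", pix_fmt)
    if match:
        return int(match.group(1))
    return 8  # ffmpeg omits the depth suffix for plain 8-bit formats

print(pix_fmt_bit_depth("yuv420p"))      # 8
print(pix_fmt_bit_depth("yuv420p10le"))  # 10
```

If your source reports a 10-bit format but your player stack only decodes 8-bit H.264, something in the chain is silently converting or failing.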
Codecs and artifacts
Modern video codecs don't just compress each frame individually with so-called spatial compression. They are also designed to compress efficiently over time, using temporal compression.
Encoding still pictures requires different amounts of data depending on the complexity of the picture. This is why a JPEG file ends up at different sizes depending on the motif you are compressing.
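The effect of content complexity on compressed size is easy to demonstrate. The sketch below uses lossless zlib compression as a simple stand-in for an image codec: two "frames" of identical raw size, one flat and one noisy, compress to wildly different sizes:

```python
import random
import zlib

random.seed(42)

# Two 'images' of identical raw size: a flat gray frame and a noisy one
# (stand-ins for a plain sky shot versus detailed smoke and snowflakes).
flat = bytes([128] * 10000)
noisy = bytes(random.randrange(256) for _ in range(10000))

print(len(zlib.compress(flat)))   # tiny: the flat frame compresses away
print(len(zlib.compress(noisy)))  # ~10000: the noise barely compresses
```

A real video encoder faces the same trade-off, except that at a fixed bitrate it cannot grow the output, so it throws detail away instead.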
Temporal compression, at its simplest, can be explained like this: areas of a video frame that don't change over time don't need to be stored in every frame. Instead, that information can be loaded once and simply remain in the player over the following frames, until something new appears in that area. This is a simplified explanation, and codecs can do much more, such as predicting the motion of moving objects in the film. But the important thing here is that temporal compression is much more efficient when there is very little movement in the film. That is why action movies need more data (a higher bitrate) to keep the picture quality up.
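The idea above can be sketched as a toy model: split each frame into blocks and store only the blocks that changed since the previous frame (real codecs do far more, with motion vectors and residuals, but the intuition holds):

```python
import random

def changed_blocks(prev_frame, curr_frame):
    """Return the indices of blocks that differ between two frames."""
    return [i for i, (a, b) in enumerate(zip(prev_frame, curr_frame)) if a != b]

# A static dialogue shot: 100 blocks, only 3 change -> very cheap to encode.
static_prev = [0] * 100
static_curr = list(static_prev)
for i in (10, 11, 12):
    static_curr[i] = 1
print(len(changed_blocks(static_prev, static_curr)))  # 3

# A battle full of smoke and snowflakes: essentially every block changes,
# so temporal prediction saves almost nothing and quality collapses.
random.seed(1)
busy_prev = [random.random() for _ in range(100)]
busy_curr = [random.random() for _ in range(100)]
print(len(changed_blocks(busy_prev, busy_curr)))  # 100
```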
The third episode of season eight of Game of Thrones really showed extreme encoding artifacts. Heavy motion combined with smoke and snowflakes, in a very dark grade, made it impossible for the streaming codecs to keep up the quality. Temporal compression could reuse almost no information from frame to frame, so quality dropped in the action scenes.
This, combined with the many details in the pictures, also made the spatial compression work really hard. The result was extreme encoding artifacts, which in turn made the banding look even worse.
What is the solution?
Game of Thrones has just pointed out a challenge for the industry: streaming needs to adopt HDR now! The productions are too good for many of today's streaming solutions.
There are streaming services that have already adopted HDR, with Apple, Netflix and Amazon pioneering it.
If we had to choose between HDR and 4K for the last season of Game of Thrones, we would prefer HDR, to get rid of banding and gain better dynamic range in the really dark scenes.
A higher bit depth also gives the encoders more precision to make smarter decisions during encoding, so a higher bit depth does not mean a higher bitrate – rather the opposite.
And to raise quality without raising the bitrate, add HEVC and VP9 ladders alongside the old H.264 encodings.
Last but not least, streaming services need to start doing quality-based video encodings and skip the old-style bitrate-based encoding ladders. Start measuring quality as well, since every film is unique.
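Quality-based ladder selection can be sketched as follows: instead of a fixed bitrate per rung, pick the lowest candidate bitrate whose predicted quality clears a target. The quality model below is purely hypothetical (real services use metrics such as VMAF measured on trial encodes):

```python
# Candidate rungs for the top of an ABR ladder (hypothetical values, kbps).
CANDIDATE_BITRATES_KBPS = [1500, 3000, 5000, 8000, 12000]

def predicted_quality(bitrate_kbps: float, complexity: float) -> float:
    """Toy quality score (0-100): harder content needs more bits.
    Purely illustrative, not a real quality metric."""
    return 100.0 * bitrate_kbps / (bitrate_kbps + 400.0 * complexity)

def pick_bitrate(complexity: float, target: float = 90.0) -> int:
    """Lowest candidate bitrate whose predicted quality meets the target."""
    for bitrate in CANDIDATE_BITRATES_KBPS:
        if predicted_quality(bitrate, complexity) >= target:
            return bitrate
    return CANDIDATE_BITRATES_KBPS[-1]  # cap at the top rung

print(pick_bitrate(complexity=0.4))  # 1500: an easy, static drama scene
print(pick_bitrate(complexity=2.5))  # 12000: dark, smoky battle footage
```

The design point is that easy titles stop wasting bits while hard titles, like this episode, automatically get the bitrate they actually need.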
What about linear broadcast?
In the Nordic market, linear broadcasting is not as sensitive to bitrate costs as streaming solutions are. This made the Game of Thrones broadcasts look better on linear TV than in the streaming services.
This is a perfect time for the streaming industry to outdo linear broadcasting by adding HDR, since the terrestrial television network in the Nordics will probably never be HDR compatible, due to the need for backward compatibility.
Colorist 调色师 / Founder of Black Peak 黛川
My first thoughts were: 1) covering for not-so-good VFX, 2) compression issues. Adding a bad trim pass (or just an automatic conversion) would also add to all of it. Probably a bit of each, although this being the 8th season I would still lean more towards VFX issues: at one point while watching the episode I turned my monitor to max brightness and the background mountains came out looking ugly. In that moment I thought it would have looked better without the mountain, almost pitch black, giving the foreground characters the much-needed brightness and contrast. In any case, I finished the episode more confused about the battle tactics being so childish and why so many characters were still breathing.
Principal Researcher
Assuming that the SDR was downgraded automatically from the HDR master, this is a perfect example of the following dilemma: either you use one fixed tone and color gamut mapping for the entire frame/scene/movie, or you adapt the mapping to the content. In the first scenario something will be sacrificed: you can't have perfect reproduction of speculars, diffuse white, gray and dark regions with one fixed mapping (otherwise why use HDR?), so a generic mapping approach will fail in some regions and that can't be avoided. In the second scenario we can mimic the HVS and adapt the mapping spatially and/or temporally to extract all meaningful information, but the creative intent will be changed, and the acceptable amount of such change is very subjective. I haven't seen the episode yet, but if it is really dark, maybe the intention was to be very dark and moody? Finally, SDR encoding is not well aligned with the more HVS-uniform PQ encoding, so even if the above mapping is done perfectly, BT.709-based 8-bit encoding won't be enough to preserve all details, not to mention video compression, which will add even more artifacts to the result.
Senior Product Manager Transcoding at Spotify
But I guess the directors wanted the episode to be really dark, and for that, 8-bit SDR doesn't fit the production. My guess is that the episode looks lovely in HDR, watched on the mastering display used. Maybe it's an up-sell for the upcoming Blu-ray release :-) It seems the Blu-ray will be 4K UHD HDR. But since GoT is HBO's flagship, they could at least have cranked up the bitrate a bit, or done quality-based encoding instead of a static bitrate ladder. That would of course have given a higher bitrate on the highest ABR streams, but H.264 in this case couldn't keep up the quality at that limited bitrate. In Sweden we were also able to watch GoT via Telia IPTV, and their streams looked much better than what came directly from the HBO app. The IPTV streams were somewhere around 12-15 Mbps, which is more than enough to get rid of the encoding artifacts. The banding was still present in the IPTV streams, though, since they are 8-bit.
Sr. Solutions Architect, Media and Entertainment, Global Accounts at Amazon Web Services - [Opinions are my own]
I think it's more complicated than just blaming the compression and bit depth. You can grade a master at the highest quality, in 4K and HDR, in a dark room with a grade 1 monitor, and the technology lets you get amazing results for that environment. But you then need a different HD SDR master for streaming/DTT (if your streaming service doesn't support 4K HDR). This multi-mastering has been done very well for years, from cinema to TV. This master fits the viewing parameters of your audience (a normal TV, not a dark environment): you grade it while watching on an HD SDR TV, and this is where you can solve the problems that compression is going to produce (banding, artifacts, etc.) by reducing the complexity of your image. There are two options to obtain that SDR master from the HDR: grade it again in SDR (not rocket science), or derive an SDR automatically from the HDR and 'trim' it (make some changes to it). Either one takes time. It seems that HBO 'forgot' to do this second, different master (maybe for time reasons) and used the 'cinema' 4K HDR master, converting it to HD SDR with some default adjustments. And here, in the HDR -> SDR down-conversion (tone mapping), is where they messed up, because this conversion requires advanced inverse tone mapping techniques and/or a human eye correcting to avoid these problems. Probably, once that clip was generated, the quality control people complained that it was 'very dark', and maybe someone said: 'It's the creative intention.' That someone did not see the episode as it was streamed, unlike many of the other 17 million viewers, many of whom still haven't seen what happened in the episode :-) You can join our conversation about this issue in the HDR Video group: https://www.dhirubhai.net/groups/13621216/