What's the Deal with Levels: Full vs. Video
Frank Glencairn
Cinematographer - Senior Editor/Colorist - Workflow Specialist and Smart-Production Evangelist.
Let’s say you have a full movie that you edited in Premiere. You watch it in the theatre or a festival. The whole time you’re thinking, this looks really washed out. But you can’t put your finger on why.
Or maybe you’re kicking out your final from Resolve with an Avid codec like DNxHR. You upload it to YouTube and it just doesn’t look quite right. The blacks are really crushed and the whites are blown out.
What gives??
Both of the scenarios above are potential levels issues.
So what are levels?
Like most things in video that are difficult to understand, levels come from the ancient past of video creation. Analog video hardware like tape decks and monitors were set to record and display video levels. Film scanners, as well as computer generated graphics, on the other hand usually recorded or used full range data.
Nowadays, levels aren't understood well. The demise of expensive video hardware and the turn towards software has made some of video's technical concepts seem obsolete, but these concepts still apply to professional video work.
So why learn about levels at all?
Since software and digital files are so much more ubiquitous now, levels are an important concept to understand, especially as a professional. By understanding levels, you can set up a correct signal path in a color bay, hand off correctly rendered files with the right color space and encoding to another artist, or export your film or commercial correctly for a film screening or the internet.
What are levels?
Levels refer to the range of values contained within an image file. Every image and video file is encoded within a specific range of values. At their most basic, there are two main distinctions: full range for computer displays and video range for video monitors.
Full level files encode their image data across the full range of the container. For full range 8-bit files, this means values from 0 to 255, 0 being pure black and 255 being pure white. Video range 8-bit files instead use values from 16 to 235, 16 being pure black and 235 being pure white.
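To make those numbers concrete, here is a minimal sketch of the scaling math between the two ranges, assuming the standard 8-bit quantization; the function names are just for illustration.

```python
# Illustrative sketch of 8-bit full range vs. video range encoding.
# Full range:  black = 0,  white = 255
# Video range: black = 16, white = 235 (a span of 219 code values)

def full_to_video_8bit(code):
    """Map a full range 8-bit value (0-255) into video range (16-235)."""
    return round(16 + code * 219 / 255)

def video_to_full_8bit(code):
    """Map a video range 8-bit value (16-235) back to full range (0-255)."""
    return round((code - 16) * 255 / 219)

print(full_to_video_8bit(0), full_to_video_8bit(255))    # 16 235
print(video_to_full_8bit(16), video_to_full_8bit(235))   # 0 255
```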
Many cameras only shoot in video range with values between 16-235. For values beyond video levels, some cameras offer options to capture extended values higher than 235 or lower than 16. These values are sometimes referred to as super brights or super blacks. In file encoding these values can also be referred to as YUV headroom or footroom. We’ll talk more about these files later in the article.
RAW video files on the other hand can be debayered into full or video ranges depending on how the files are interpreted in the software.
All digital graphics are encoded as one or the other. Generally, video files are encoded with video levels and graphics or image sequences are encoded with full levels.
The reason for this is that traditionally video files were watched on video monitors, which were designed to display video levels, while graphic files were viewed on a computer monitor connected to the output of a computer's graphics card, which outputs full levels.
Full Levels and Video Levels Terminology
So why is this concept so confusing and convoluted?
Cameras, software, displays, codecs, scopes, etc. each have different ways of describing levels. And sometimes the terminology overlaps with color space terminology.
WTF!
That’s an insane amount of terminology to describe the same thing.
While the nomenclature is confusing, the basic concept itself isn’t. What is confusing is knowing what levels your files are and how your software is treating them.
Most of this terminology comes from broadcast history. Tapes and files were usually kept within the legal broadcast range for video, which is 16-235 in 8-bit. Any values beyond this would be clipped, or your file or tape would be flagged during QC for out of range values. This is still the case for delivering broadcast compatible files.
Today, tapes have largely been replaced with files and software.
Software interprets the level designation based on the information in your files. If you're working with video files, it can be really tough to understand why a file looks different in different pieces of software, why it looks right inside your program but not when you export it, and many other situations.
Levels and Software
Color management has evolved very rapidly in the last few years. With the accessibility of DaVinci Resolve, other software like Nuke, Avid, FCPX, and Flame has also opened up its options for color management. Levels are an important part of those conversions.
Each piece of software handles color management very differently. Some software works in a full range environment, other software is video range by default. More and more software packages offer options for working in a multitude of color spaces and level designations.
A few examples from post production software:
The most important thing to understand about working with video files within post production software is that values are scaled back and forth between full and video levels based on the software interpretation and project settings.
DaVinci Resolve, for example, which is one of the most flexible programs for color space, works in a full range, 32-bit float ecosystem internally. Any video level files that are imported into Resolve are automatically flagged as video and scaled to full range values.
While the processing is 32-bit float, Resolve’s scopes display 10-bit full range values. With 10-bit values, full range is 0-1023 and video range is 64-940.
So Resolve works with full range data internally. If there is a video card attached, like an UltraStudio, and data levels aren't selected, Resolve will scale those internal full range values to video range values before sending a video signal to a display monitor.
The software is scaling values back and forth from video to full and back to video. This is an important concept to absorb.
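Here is a rough sketch of that round trip in 10-bit terms, assuming simple linear scaling with no clamping; the function names are hypothetical, not Resolve's actual API.

```python
# Hypothetical round trip: a video range source is scaled to full range for
# internal processing, then scaled back to video range for the monitor output.

def video_to_full_10bit(code):
    # 64-940 (video) -> 0-1023 (full)
    return (code - 64) * 1023 / 876

def full_to_video_10bit(code):
    # 0-1023 (full) -> 64-940 (video)
    return 64 + code * 876 / 1023

source_black, source_white = 64, 940                 # values in a video range file
internal = [video_to_full_10bit(v) for v in (source_black, source_white)]
monitor = [full_to_video_10bit(v) for v in internal]

print(internal)  # [0.0, 1023.0] -> what the full range scopes show
print(monitor)   # [64.0, 940.0] -> what goes out over the video signal
```

Note that the values survive the round trip unchanged; problems only appear when one of these conversions is applied with the wrong assumption about the source.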
That's how Resolve works. Avid, on the other hand, has also been re-designed with tons of color space options. You can interpret your source files as video or full levels even after you've imported them, and you can pick which type of space you're working in. You can work in a video range space if that's how your workflow is set up.
Premiere is a little more limited when it comes to color management. There aren't a lot of options for re-interpreting your source files into a project color space. You can use some color management settings, but compared with other major NLEs, Premiere Pro and After Effects are definitely not leading the pack.
Scopes and IRE values
The next thing we need to understand when working with levels is how our scopes work and something called IRE.
Here is a definition of IRE from wikipedia:
An IRE is a unit used in the measurement of composite video signals. A value of 100 IRE is defined to be +714 mV in an analog NTSC video signal. A value of 0 IRE corresponds to the voltage value of 0 mV.
So IRE refers to actual voltage in an analog system. Actual electricity.
Why is an analog measurement like IRE important for the modern age of video?
Post production software still uses IRE values for scopes. If you see a scope with measurement values from 0-100, they are most likely IRE measurements.
How do these IRE values correspond to levels?
With 8-bit encoded files, 0 IRE corresponds to code value 16 and 100 IRE to 235 in video range, or 0 and 255 in full range.
Depending on your editing software, your scopes could correspond to video levels or full levels, but IRE will remain the same.
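As a rough sketch, assuming the usual convention that 0 IRE sits at black and 100 IRE at white regardless of the underlying code values:

```python
# Illustrative mapping between IRE and 8-bit code values.
# The 0-100 IRE scale stays the same; only the code values underneath it change.

def ire_to_code_8bit(ire, video_range=True):
    if video_range:
        return round(16 + ire / 100 * (235 - 16))  # 0 IRE -> 16, 100 IRE -> 235
    return round(ire / 100 * 255)                  # 0 IRE -> 0, 100 IRE -> 255

for ire in (0, 50, 100):
    print(ire, ire_to_code_8bit(ire, video_range=True),
          ire_to_code_8bit(ire, video_range=False))
# 0   16   0
# 50  126  128  (rounded)
# 100 235  255
```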
Not confusing at all right??
This, among other reasons, is why levels are so convoluted.
If your NLE is running in a Rec709 environment, for example, the scopes could be represented as video levels, as they are in Avid. You'll see 16-235 on one side of an Avid scope and 0-100 on the other. Now you know why.
For a Rec709 environment like an Avid project, full level files will be scaled to video levels.
Scopes are relative. While IRE is a bit of an outdated concept, it’s helpful to have a scale from 0-100 to simply describe the video range. For your particular piece of software, it’s important to understand what type of levels environment you’re looking at to understand the scopes.
Encoding Files: Scaling Between Full Levels and Video Levels
When we start to talk about video files, we have to get some more terminology out of the way. This is where we start to hear terms like 4:4:4, 4:2:2, RGB, YUV, and YCbCr. ProRes444 for example or Uncompressed YUV.
In general, RGB refers to digital computer displays and YCbCr refers to digital video displays.
Historically, RGB values are converted to YCbCr values to save space, since video bandwidth used to be far more costly. We still live within this legacy to a certain extent. RGB is 4:4:4, full range values, which means there is no chroma subsampling in the encoding; all that color information is maintained. YCbCr on the other hand is typically 4:2:2 and video range.
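For the curious, here is a minimal sketch of that RGB to Y'CbCr conversion using the BT.709 luma coefficients, with normalized 0-1 inputs and 8-bit video range outputs (chroma subsampling is a separate step and isn't shown here):

```python
# Sketch: convert one normalized R'G'B' pixel (0.0-1.0) to 8-bit video range Y'CbCr
# using the BT.709 luma coefficients.

def rgb_to_ycbcr_709(r, g, b):
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b   # luma
    cb = (b - y) / 1.8556                       # blue-difference chroma
    cr = (r - y) / 1.5748                       # red-difference chroma
    # Quantize into the 8-bit video range containers
    return (round(16 + 219 * y),        # Y':  16-235
            round(128 + 224 * cb),      # Cb:  16-240, centered on 128
            round(128 + 224 * cr))      # Cr:  16-240, centered on 128

print(rgb_to_ycbcr_709(0, 0, 0))   # (16, 128, 128)  -> video black
print(rgb_to_ycbcr_709(1, 1, 1))   # (235, 128, 128) -> video white
```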
I won’t get into every description of these technical aspects of video encoding. The thing I want to focus on is what your files are doing and how your software is interpreting them based on what the file is.
There is a misconception that video files must be converted to full range to be viewed properly on a computer display.
This simply isn't true. Video range files can be displayed correctly on a computer monitor. What is important is that the software playing back your video file knows it is a video range file. Thankfully, most software is designed to recognize the levels based on the file's encoding.
For example, if you've exported a ProResHQ quicktime with video levels from Resolve (which Resolve will render by default for ProResHQ), that file will look correct if you play it back with a program that understands it is a video level file.
For the most part, video software is good at estimating the correct level designation for your video files based on information in your file.
However.
This is where things get tricky.
The line between full range and video range files has become much more muddy with the latest digital codecs.
Quicktime and MXF codecs like DNxHR, ProRes444, and Cineform can contain YUV values or RGB values.
ProRes Special
ProRes is always YUV internally. Any RGB data (e.g. 16-bit RGB) sent to the ProRes encoder is always converted to YUV (as 4:4:4, not 4:2:2, since that would be pointless, with a maximum of 12-bit precision) through its internal processing. If you think otherwise, read Apple's white paper carefully. It uses some wording which may suggest that data can be stored as RGB, but it's always YUV; it's just that the encoder can natively take RGB based pixel formats as input and also present RGB on output.

ProRes by its spec should always be limited levels, and this is why there are problems with full range files. Apple did not intend ProRes to be full range, and there is no dedicated header in ProRes frames to specify range (there is one in, for example, DNxHD/R). This still doesn't stop files from being exported as full range ProRes, but then there is NOTHING in the ProRes data to say that the file is full range. If the container doesn't carry that information either (a MOV has no standard range flag, but MXF does if I'm correct), then apps simply can't know how to treat such a file, and only manual interpretation (the user has to know whether the file is full or limited range) allows for proper handling. This is why any full range ProRes export should be treated with caution and may behave differently depending on the app. Manual interpretation may be needed, so people should always keep that in mind.

DNxHR (and also Cineform) is slightly different. Not only does it have two separate modes, YUV and RGB (so you can actually store input RGB data without any conversion), it also has more flags in its frame structure and allows for range specification: full vs. limited (there is even a way to flag an actual range that differs from the standard, e.g. 64-940). If apps are bug free and read DNxHR files properly, then automatic levels detection should work fine. Resolve used to have a bug in setting the proper range in DNxHR headers during export, but it was reported and fixed (although BM keeps reintroducing the same bugs in later versions). Resolve allows for manual interpretation of range, so you can import full range ProRes (or any other) files, but most likely with manual intervention. Premiere doesn't, so things will most likely go bad with full range ProRes files in Premiere.

As mentioned earlier in the article, Resolve always works in full range internally (scopes included). No one should ever try to limit their grading to the 64-940 range just because the project will later be exported for broadcast, which is typically limited levels. That is wrong thinking, but it looks like there are people doing it. If you want pure black, grade to 0 in the scopes; if the codec is YUV, Resolve will scale this internally to the correct value during export.

This brings up another point. Source files, processing and monitoring don't have to go through the same range setting. You can work with limited level files but monitor over full range. The same goes for export. All those elements are independent, and it's up to the app to properly handle the needed conversions.
If we think about ProRes or DNxHR/Cineform or any other intermediate codec as a high quality master/working format, then they should actually be RGB based and always full levels. This would make things easier, and we would not waste time on the YUV-to-RGB conversion that keeps happening in every app and monitoring chain, sometimes many times over. The limited levels concept is old and not needed these days; it's a leftover from the analog era.

YUV, on the other hand, is a different story. It's needed for heavily compressed codecs, as it allows luma and chroma information to be encoded differently, which is not possible with RGB. RGB is a compression-unfriendly way of storing image data because luma is linked to chroma. This is probably why ProRes is always YUV internally: it lets the codec be more efficient. The thing is, those intermediate codecs are high bitrate, so we could move to RGB (like DNxHR does for its 444 profile) and compensate for any loss by raising the bitrate. YUV can be kept for H.264/5 and the like.

YUV can be limited or full range, and there is no real rule that 4:4:4 must be full range and 4:2:2 limited; they can be any combination. It all just needs to be flagged properly, and then no guessing would be needed. The problem is that we have such a big problem with metadata and flags: we pay far too little attention to them, blindly focusing on the actual video data.
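If you want to see what a file is flagged as before it ever hits your NLE, one option is to read the stream metadata with ffprobe. A rough sketch follows; it assumes ffprobe is installed and on your PATH, the file name is hypothetical, and, as described above, the flag can be missing or simply wrong, especially for ProRes.

```python
# Sketch: query the color_range flag of the first video stream with ffprobe.
# "tv" means limited/video range, "pc" means full range, empty means unflagged.
import subprocess

def probe_color_range(path):
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=color_range",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True)
    return out.stdout.strip() or "unflagged"

print(probe_color_range("graded_master.mov"))  # hypothetical file name
```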
But there is a big BUT (and I like big buts an' I can not lie).
Even if you did encode RGB 4:4:4 data to a ProRes444 file, your software would still need to correctly interpret that data and not clamp values. This is why these newer 4:4:4 codecs are so confusing. Premiere in that case would clamp those full range values or assume the file was video range. You still might be able to access those values, but Premiere isn’t seeing them as intended.
DNxHR 444 is another codec that can confuse software. In my own tests with Resolve and Premiere, an auto-level DNxHR 444 file rendered from Resolve will be interpreted as a full level file in Premiere. But Resolve actually encodes it to video levels with Auto selected. Therefore the levels get scaled twice, leading to washed out luminance values in Premiere.
So here's one way to tell if you are having an issue with levels: if your black level is clamped on your scopes around 16 (8-bit scopes) or 64 (10-bit scopes), you probably have incorrect level interpretation going on in your software.
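A quick way to sanity-check a decoded frame for that symptom, sketched with NumPy; the frame array here is a made-up example, and the thresholds are just the nominal video range black points.

```python
# Sketch: flag a frame whose blacks never get below the video range black point,
# which usually points to a levels mismatch somewhere in the chain.
import numpy as np

def looks_level_mismatched(luma, bit_depth=10):
    video_black = 16 if bit_depth == 8 else 64
    # If the darkest pixels in a frame that should contain true black sit
    # right at (or above) the video black point, suspect a levels issue.
    return int(luma.min()) >= video_black

frame = np.full((1080, 1920), 64, dtype=np.uint16)   # hypothetical 10-bit luma plane
print(looks_level_mismatched(frame, bit_depth=10))    # True -> investigate your levels
```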
In practice, it makes sense that Premiere would assume that a 444 file would be full range, since full range files are typically 4:4:4. Resolve, on the other hand, should know what levels the DNxHR file carries, since it is the one rendering it. But it appears that it thinks a DNxHR quicktime should be video range, not full.
Files have levels information embedded within their headers. This is how the software knows how to scale the levels. Sometimes this information is wrong, or is read differently depending on which piece of software is interpreting the file, which makes things very confusing.
Since there are so many codecs, color spaces and different range values, it would be very time consuming to compare them all. What is most important is that you understand that 4:4:4 codecs can be tricky and it’s important to test out your workflows and file scalings before using them in production.
Especially for Windows users of Resolve and Premiere, it’s important to understand how to use DNx codecs to pass files back and forth properly since ProRes encoding isn’t possible.
Testing Levels with Color Bars
Generating color bars at the beginning of a shot or program is a great way to test out any issues with codecs or improper levels scaling. Then you’ll always know if what you’re seeing is correct.
The top example below is being scaled wrong, as you can see from the parade scope. This is a video level file being interpreted as full by the software. The levels end up at 64 and 940, which is a tell-tale levels scaling issue. You can test this yourself by exporting any file and looking at its scope in any NLE.
Full level files being interpreted as video have the opposite issue: the values will be pushed beyond 0 and 1023.
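You can also reason about the two failure modes numerically. Here is a small sketch using nominal 10-bit bar values, with simple linear scaling and no clamping, just to show where the levels end up.

```python
# Sketch: where black and white land when a file's levels are misinterpreted.
# Values are nominal 10-bit levels, not taken from any specific codec.

def scale(v, in_black, in_white, out_black, out_white):
    return out_black + (v - in_black) * (out_white - out_black) / (in_white - in_black)

# Case 1: a video range file (black 64, white 940) treated as full range.
# The software applies no expansion, so the values pass through untouched:
print(64, 940)            # blacks sit at 64, whites at 940 -> washed out

# Case 2: a full range file (black 0, white 1023) treated as video range.
# The software expands 64-940 to 0-1023, pushing real blacks and whites past the ends:
print(scale(0, 64, 940, 0, 1023), scale(1023, 64, 940, 0, 1023))
# -> roughly -75 and 1120, i.e. beyond 0 and 1023, which then clip
```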
Working with Hardware and Levels
Understanding levels is key to setting up a proper viewing environment for your content. Even if you’re only making web videos on your computer without a dedicated video card, it’s important to understand the choice that you’re making.
Things are changing quickly with display technology. What is true about signal paths today might change tomorrow. So I’ll talk about the options for signal paths and how levels fit into that.
There are two major schools of thought when it comes to displaying and monitoring your video output: a traditional video range signal chain through a dedicated video card and a video monitor, or a full range (RGB) chain straight off the computer's graphics output.
For most people doing video work today, it’s important to have a video monitor. Why is this important even if you’re making web videos? A few reasons:
So a video monitor and a dedicated video card are still important. Today at least.
Does this mean we should only build video level signal chains?
Many video monitors today can display full range signals. Video cards can kick out full range signals.
So why should we stick with video range if we can do full?
For a few reasons. Most video codecs and video software still use color spaces based in the video world like rec709. Files like ProResHQ and DNxHD are video codecs with video levels at their core. Many post facilities are based around ProRes or DNx of some kind. Introducing an RGB signal chain into the mix is exciting in theory, but perfecting that setup would require a lot more effort for a small return.
That being said, RGB based workflows are becoming more popular. They might be the standard soon with all computer based workflows.
Usually in higher end workflows, quicktimes aren't used. The files are most likely full range 10-bit DPX files or 16-bit float OpenEXR files, which are much bigger containers than any signal chain or display technology currently available can show.
For film scanners and projection, full range signal chains are standard. In scenarios like this, it makes sense to build an RGB pipeline to maintain an unconverted, unscaled signal the whole way through.
For CG heavy facilities or workflows, full range has its benefits.
For most editorial workflows, sticking with video range systems works the most painlessly for now. As codecs continue to increase in quality and drive speeds go up while prices come down, RGB, full range files and hardware might begin to replace more traditional video based signal chains.
Best Practices for Working with Full and Video Levels
Now the big question. How do we use levels on a day to day practical basis?
Here are some practical rules of thumb for working with levels in your post production workflow. Know whether each piece of software in your chain is treating your files as full or video range, and what it flags your exports as. Test your codec hand-offs, especially 4:4:4 codecs like ProRes444 and DNxHR 444, with color bars before using them in production. Keep an eye on your scopes: blacks clamped around 16 (8-bit) or 64 (10-bit) are a tell-tale sign of a levels mismatch. And grade to the full scale of your scopes rather than artificially limiting yourself to 64-940; the software will scale the export correctly as long as the levels are interpreted and flagged properly.
Happy grading :-)