
    What is HDR video and how does it work?


    HDR is the new hot thing in movies and TV, and it brings with it over five different competing standards. So let's break down what HDR is, what these standards mean, and what to know before you buy your next HDR-capable TV.

    HDR stands for High Dynamic Range video. With 4K now standard on most new TVs, this is the next visual technology that TV makers are pushing. Essentially, HDR makes your TV display a more vivid, life-like image, and we think it's a bigger deal than 4K was if you want the best-looking movies on a TV.

    But HDR itself isn't just one thing. All of the different variants of HDR out there are built on four underlying concepts: luminance, dynamic range, color space and bit depth. Let's break each of them down.


    HDR Concepts

    Luminance

    This is the measure of how much light something emits, measured in candelas per square meter; one candela per square meter is usually called a "nit". Nits are therefore a measure of how bright your screen can get. Your office monitor should be able to put out about 120 nits, and laptops can go up to 300-500 nits to help cope with bright sunlight. The old CRT TVs could give out about 100 nits, and incidentally 100 nits is also what Standard Dynamic Range (SDR) video was calibrated to.


    On a capable TV, HDR content can be much brighter; manufacturers usually refer to this as the peak brightness. In HDR video, things like reflections, lights in the dark and laser blasts should all look very bright.

    The OLED LG C9 with HDR

    A TV like the OLED LG C9 can top out at around 800 nits, and some LCD TVs like the Samsung Q9F put out over 1,700 nits. So HDR TVs can get way brighter, especially when playing HDR content, but how exactly that brightness is used brings us to our second concept: dynamic range.

    Dynamic Range


    This is basically what is known as contrast, in other words the difference between the brightest and darkest parts of a scene. Dynamic range is typically measured in "stops", a term borrowed from cameras. Each stop is twice as bright as the one before it. These days camera companies brag about how their new camera can capture 14-15 stops of dynamic range.

    Sony camera measuring dynamic range in stops

    So it might surprise you to know that Standard Dynamic Range content, which is still the vast majority of what we watch, tops out at about 6 stops. Manufacturers usually report this as a contrast ratio, for example 5,000:1 or 10,000:1, which is how much brighter the white a TV can display is than its black. But now, with OLED screens that can produce a true, super-dark black, TV makers are simply claiming infinite contrast, which isn't very meaningful.

    High contrast

    Dynamic range is partly dependent on the TV's contrast ratio, but at least one standard insists that a TV has to be capable of 13 stops of peak dynamic range to be considered HDR-ready, though most TVs max out at around 10 stops.
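    If you like the math, stops and contrast ratios are just two ways of saying the same thing: each stop is a doubling, so the number of stops is the base-2 logarithm of the contrast ratio. Here's a quick Python sketch (the ratios are just example figures, not measurements of any particular TV):

        import math

        def contrast_ratio_to_stops(ratio: float) -> float:
            """Each stop is a doubling of brightness, so stops = log2(white / black)."""
            return math.log2(ratio)

        def stops_to_contrast_ratio(stops: float) -> float:
            """The inverse: n stops of dynamic range span a 2**n : 1 contrast ratio."""
            return 2 ** stops

        print(contrast_ratio_to_stops(5_000))    # ~12.3 stops
        print(contrast_ratio_to_stops(10_000))   # ~13.3 stops
        print(stops_to_contrast_ratio(6))        # 64:1, roughly where SDR content tops out
        print(stops_to_contrast_ratio(13))       # 8192:1, the "HDR ready" bar mentioned above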

    Color Space

    Color space, also known as gamut, is simply the range of colors that a TV can produce. Most SDR video and digital content in general, including most JPEGs and media on the internet, use a color space called Rec. 709 (an ITU standard) or the closely related sRGB. To simplify a bit, a color space defines the maximum saturation of the primary light colors, red, green and blue, that a device can emit. The big HDR video standards all support a wider gamut called Rec. 2020, though a lot of content actually uses the slightly smaller DCI-P3 gamut, which has been the standard for Apple's displays for a few years now.

    Color Space - HDR

    The difference between a normal and a wide color gamut is kind of like luminance, but for color. Instead of brighter highlights, for example, an HDR screen should be able to show a greener green, with a purer hue and higher saturation.
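    If you want to see where those gamuts actually differ, the published chromaticity coordinates of each standard's red, green and blue primaries tell the story. The little Python sketch below compares the area of the triangle each set of primaries spans on the CIE 1931 diagram; it's a rough back-of-the-envelope comparison rather than a perceptually accurate one.

        # Published (x, y) chromaticity coordinates of the R, G, B primaries.
        GAMUTS = {
            "Rec. 709 / sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
            "DCI-P3":          [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
            "Rec. 2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
        }

        def triangle_area(points):
            """Shoelace formula for the area of the triangle the primaries span."""
            (x1, y1), (x2, y2), (x3, y3) = points
            return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

        base = triangle_area(GAMUTS["Rec. 709 / sRGB"])
        for name, primaries in GAMUTS.items():
            ratio = triangle_area(primaries) / base
            print(f"{name}: about {ratio:.1f}x the Rec. 709 triangle")
        # DCI-P3 comes out roughly 1.4x and Rec. 2020 roughly 1.9x on this crude measure.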

    Bit Depth

    This is the amount of data used to describe the brightness and color of each pixel. With the exception of some professional video, SDR content was all 8-bit. The easiest way to think about this is with RGB color, where each pixel gets a value for red, green and blue. Let's get down to some bit-math. In an 8-bit signal you have 256 possible values for each color, which is 2 to the power of 8, hence "8-bit". So 0,0,0 is black, 255,255,255 is white, 0,0,255 is pure blue, and so on.

    HDR uses 10 bits of data for each channel, expanding the possible values per channel to 2 to the power of 10, which is 1,024 (values 0 through 1,023). That means HDR has four times as many possible values to describe each channel of each pixel on your screen. The big advantage is more subtle gradients with less banding, plus enough precision to describe those super-bright highlights.
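    Here's the same bit-math as a minimal Python sketch, just to make the 8-bit versus 10-bit difference concrete:

        def shades_per_channel(bits: int) -> int:
            """An n-bit channel can hold 2**n distinct values (0 .. 2**n - 1)."""
            return 2 ** bits

        print(shades_per_channel(8))    # 256  -> values 0..255   (SDR)
        print(shades_per_channel(10))   # 1024 -> values 0..1023  (HDR)

        # The same "pure blue" pixel at each bit depth:
        sdr_blue = (0, 0, 255)      # 8-bit
        hdr_blue = (0, 0, 1023)     # 10-bit: four times as many steps per channel

        # Going from black to full blue, 8-bit has 255 steps while 10-bit has 1023,
        # so each step is a quarter the size and banding is far less visible.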

    The HDR standards

    Hybrid Log-Gamma (HLG) is kind of its own thing, but HDR10, HDR10+ and Dolby Vision are all based on something called PQ, the Perceptual Quantizer, and, sorry, before I can explain that we have to talk about the gamma curve.

    When a digital camera records an image, light photons hit the CMOS sensor and get turned into voltage, and the camera records that voltage. Twice the voltage and the pixel will be twice as bright. A gamma curve, at its simplest, is an equation that bends that linear response into something closer to how we actually see, taking data away from the bright end of the range and moving it into the darker end, where our eyes actually pick out more detail.

    The gamma curve

    Cameras these days frequently use this curve to capture more data in the shadows when you take an image, and your computer applies the opposite curve to make sure it displays correctly.
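    As a rough illustration, the sketch below uses a simple power-law gamma of 2.2, which is only an approximation (real standards like Rec. 709 use a slightly more complicated piecewise curve), to encode a linear camera value and then decode it again on the display side:

        GAMMA = 2.2  # common approximation; Rec. 709 actually uses a piecewise curve

        def encode(linear: float) -> float:
            """Camera side: bend the linear value so more code values land in the shadows."""
            return linear ** (1 / GAMMA)

        def decode(encoded: float) -> float:
            """Display side: apply the opposite curve so the image looks correct again."""
            return encoded ** GAMMA

        # A dark scene value of 0.05 (5% of maximum light) gets pushed up to about
        # 0.26 of the signal range, so far more code values describe the shadows.
        for linear in (0.05, 0.18, 0.5, 1.0):
            e = encode(linear)
            print(f"linear {linear:.2f} -> encoded {e:.2f} -> decoded {decode(e):.2f}")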

    The Perceptual Quantizer (PQ) is basically a gamma curve on steroids. It was designed by Dolby and is based on how people actually perceive contrast at various luminance levels. It comes with a lot of complicated math that is beyond the scope of this article.

    But in usual Techjaja tradition, let me explain how it works. PQ uses roughly twice the data of all of SDR, over 500 of its code values, just to describe brightness from 0 to 100 nits, but then it keeps going, with headroom built in to extend all the way to 10,000 nits.
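    The PQ curve itself is public (it's standardized as SMPTE ST 2084), and a direct Python translation is short enough to show here. Encoding absolute brightness in nits into a 10-bit code value, you can see that 100 nits, i.e. all of SDR, already lands around code 520 out of 1,023, with everything above reserved for highlights up to 10,000 nits:

        # SMPTE ST 2084 (PQ) constants
        M1 = 2610 / 16384
        M2 = 2523 / 4096 * 128
        C1 = 3424 / 4096
        C2 = 2413 / 4096 * 32
        C3 = 2392 / 4096 * 32

        def pq_encode(nits: float) -> float:
            """Map absolute luminance (0..10,000 nits) to a normalized PQ signal (0..1)."""
            y = nits / 10_000
            return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

        def pq_code_value(nits: float, bits: int = 10) -> int:
            """Quantize the PQ signal to an integer code value (0..1023 for 10-bit)."""
            return round(pq_encode(nits) * (2 ** bits - 1))

        for nits in (0.1, 1, 100, 1_000, 10_000):
            print(f"{nits:>7} nits -> code {pq_code_value(nits)}")
        # 100 nits (all of SDR) is already around code 520; 10,000 nits maps to 1023.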

    So what are the differences between these three PQ-based HDR standards?

    HDR10

    This is an open standard and the most popular form of HDR. If a TV says it supports HDR, you can bet it's basic HDR10. The downside is that it's not rigidly defined; it's more a collection of suggested technology than a true standard, so the quality of HDR10 devices and content can vary widely.

    Still, at a minimum HDR10 should include 10-bit video, a wide color gamut (Rec. 2020) and a peak brightness of 1,000 nits.

    Dolby Vision

    Dolby Vision, by contrast, is completely controlled by Dolby. It supports up to 12-bit video and a wide color gamut (Rec. 2020), and it is usually mastered to a peak brightness of 4,000 nits. So if a company wants to support Dolby Vision on its devices, it has to pay a license fee and work with Dolby to make sure its TVs or monitors will properly and accurately render and play back Dolby Vision content.

    The downside of PQ

    This leads to one of the issues with PQ. Gamma curves were relative: they modified the brightness of an image but had no absolute values attached to them. PQ is absolute: every value between 0 and 1023 corresponds to a specific brightness no matter the TV. For example, a code value of around 519 should always display a brightness of 100 nits no matter where you are showing it. But this leads to a big problem: remember, Dolby Vision is mastered to 4,000 nits and no consumer TV can come close to that. So how do you solve this? Metadata!

    HDR10 video carries metadata that describes the average scene brightness and the maximum pixel brightness in whatever movie or show you are watching. TV makers then include algorithms that try to map that maximum brightness onto the brightest shade your TV can actually produce; it's essentially a gamma curve on top of the PQ curve. The downside: if you imagine a generally dark show (think the final season of Game of Thrones and The Battle of Winterfell), one really bright flash in a scene will cause the TV to account for that single bright highlight, and it may push the rest of the content darker to compensate.
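    To see why that single static value is a problem, here's a deliberately over-simplified Python sketch of static tone mapping. It is not what any particular TV actually does; it just shows how one "maximum content brightness" number forces a compromise for the whole program:

        def tone_map(nits: float, max_content_nits: float, display_peak_nits: float) -> float:
            """Toy static tone mapper: pass through what the display can show comfortably,
            then squeeze everything up to the metadata's maximum into the last bit of headroom."""
            knee = 0.75 * display_peak_nits            # start rolling off at 75% of display peak
            if nits <= knee:
                return nits
            span_in = max_content_nits - knee
            span_out = display_peak_nits - knee
            return knee + (nits - knee) / span_in * span_out

        # A 600-nit TV playing content whose static metadata says "mastered up to 4,000 nits":
        for nits in (100, 400, 1_000, 4_000):
            print(f"{nits:>5} nits in the master -> {tone_map(nits, 4_000, 600):.0f} nits on screen")
        # Because one bright flash pushed the metadata to 4,000 nits, everything above
        # 450 nits gets crushed into the 450-600 range for the entire program.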

    Dolby Dynamic Metadata

    To try to fix this, Dolby Vision adds dynamic metadata, which shifts scene by scene or even shot by shot instead of being set once for the entire program.

    HDR10+

    Similarly, HDR10+ is an attempt to add dynamic metadata to the HDR10 standard. Since PQ is absolute, the idea is that content from any of these three standards (especially Dolby Vision) should look the same in your living room as it did to the show's creators when they were editing it. This is great in theory, but the problem is that it doesn't take into account how bright the room your TV sits in is, and by default Dolby Vision is intended to be watched in near darkness.

    The downside to any of these standards is that they generally look washed out and grey on older non-HDR TVs. Streaming services like Netflix can usually identify what your device supports and serve up an SDR version as needed, but that's not possible with normal broadcast TV, which is where HLG comes in.

    HLG

    HLG, or Hybrid Log-Gamma, is our last HDR standard and was cooked up by the UK's BBC and Japan's NHK in 2016 with the aim of delivering HDR content over broadcast. HLG supports 10-bit video, a wide color gamut (Rec. 2020) and a modified gamma curve.

    HLG takes the standard gamma curve that has been used in broadcast for nearly a century and modifies it so that, towards the brighter end, it smoothly transitions into a logarithmic curve, which compresses the highlights and allows for a much larger dynamic range. The idea is that an older TV will just interpret the part of the picture covered by the standard gamma curve, with the extra highlights clipped away, so normal SDR content will look much the same. TVs that support HLG, on the other hand, will pull extra detail out of the brightest parts of the image.
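    The HLG transfer curve that does this is also published (ITU-R BT.2100), and it's short enough to write out. In the sketch below, the bottom twelfth of the scene brightness range uses a square-root curve, essentially conventional gamma, and everything above it switches to the logarithmic curve that packs the bright highlights into the top half of the signal:

        import math

        # HLG OETF constants from ITU-R BT.2100
        A = 0.17883277
        B = 1 - 4 * A                    # 0.28466892
        C = 0.5 - A * math.log(4 * A)    # 0.55991073

        def hlg_encode(scene_linear: float) -> float:
            """Map normalized scene light (0..1) to an HLG signal (0..1): a square-root
            (gamma-like) curve for dark-to-mid tones, a log curve for highlights."""
            if scene_linear <= 1 / 12:
                return math.sqrt(3 * scene_linear)
            return A * math.log(12 * scene_linear - B) + C

        # The bottom twelfth of scene light already fills half the signal range, roughly
        # what SDR gamma would do; the log part squeezes the remaining highlights into
        # the top half, which is where HLG-capable TVs find the extra detail.
        for e in (0.01, 1 / 12, 0.25, 0.5, 1.0):
            print(f"scene {e:.3f} -> signal {hlg_encode(e):.3f}")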

    Which streaming providers support HDR?

    So there you have it: the various standards and how they work. There are a few newer technologies like Advanced HDR by Technicolor, but for now HDR10, HLG and Dolby Vision are the ones to watch. If a TV says it's HDR-ready it almost certainly supports HDR10 and probably HLG, but Dolby Vision and HDR10+ are more of an either-or situation, at least for now.

    In terms of providers, Amazon offers HDR10 and HDR10+ content, with some shows in Dolby Vision. Netflix has pretty much skipped HDR10+; it has a lot of Dolby Vision and HDR10 content. Blu-ray is a mix, with some studios opting to include every format on the same disc, so check before you buy.

    So, has HDR been a reason for you to upgrade? Have there been any shows or movies in HDR that have blown you away? Let us know in the comments below.
