HDR vs SDR, HDR10, HDR10+ Adaptive vs Dolby Vision… What are the differences?

HDR is awesome. In this article I want to teach you what it does and why HDR is such an important (r)evolution in reproducing realistic imagery. It is a big deal when you choose your next TV or monitor.

“SDR” with 8- or 10-bit color

Let’s start with SDR (Standard Dynamic Range), which is the standard for most content and typical screens. These screens work with RGB (red/green/blue) color depths of 8 or 10 bits.

  • SDR at 8-bit color gives you 256 different shades of red (2 to the power of 8), 256 shades of green and 256 shades of blue.
    This means that every individual pixel can have one of 16.7 million different values (256 × 256 × 256), covering everything from black to white and every color shade in between. That seems like a lot, but it can still cause ‘banding’ in subtle gradients, where you can actually see the colors step from one value to the next.
  • SDR at 10-bit color gives you 1024 different shades of red (2 to the power of 10), 1024 shades of green and 1024 shades of blue.
    This results in 1.073 billion(!) possible colors per pixel, which means much smoother gradients and tons of extra color nuance in high-end imagery. The short sketch after this list verifies both numbers.
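If you want to check the arithmetic yourself, here is a minimal Python sketch of the shades-per-channel and total-color math above (plain Python, no libraries):

```python
# Quick check of the bit-depth arithmetic above.
def shades_per_channel(bits: int) -> int:
    """Distinct values one color channel can take at a given bit depth."""
    return 2 ** bits

def total_colors(bits: int) -> int:
    """All RGB combinations: shades cubed, one factor per channel."""
    return shades_per_channel(bits) ** 3

for bits in (8, 10):
    print(f"{bits}-bit: {shades_per_channel(bits)} shades per channel, "
          f"{total_colors(bits):,} total colors")

# 8-bit:  256 shades per channel,  16,777,216 total colors (~16.7 million)
# 10-bit: 1024 shades per channel, 1,073,741,824 total colors (~1.073 billion)
```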

How does HDR introduce even more colors than SDR?

Let’s think outside the color box now. The problem with most screens is still that they’re backlit by one big backlight (LED or CCFL), so the whole screen and all its pixels have pretty much the same brightness. Remember, SDR images, video content and video games just render images with RGB values.

This is where HDR kicks in. Screens that truly support HDR can significantly brighten or dim individual pixels or pixel clusters (sometimes called dimming zones, a term from the days when their only purpose was achieving better black levels). The maximum brightness that can be achieved is called “peak brightness” and is expressed in ‘nits’ (1 nit = 1 candela/m², or simply put: the amount of light 1 candle emits, spread over 1 m²).
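As a thought experiment, here is a toy sketch of how a full-array local dimming backlight might drive its zones; the zone grid, the content luminance values and the 1000-nit peak are all made-up illustration numbers, not any real TV’s algorithm:

```python
# Toy local-dimming sketch: each backlight zone is driven only as hard as the
# brightest content inside it, so dark zones stay nearly black while a small
# highlight elsewhere can hit peak brightness. All values are hypothetical.
PEAK_NITS = 1000.0  # assumed panel peak brightness

# Luminance the image content asks for in each zone, in nits (made up):
zone_targets = [
    [2.0, 2.0, 5.0, 900.0],   # bright highlight (a neon sign) top-right
    [1.0, 3.0, 4.0,  40.0],
    [0.5, 1.0, 2.0,   5.0],
]

for row in zone_targets:
    # Drive level per zone as a fraction of peak brightness (clamped to 1.0).
    drive = [min(nits / PEAK_NITS, 1.0) for nits in row]
    print(" ".join(f"{d:.3f}" for d in drive))
```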

The secret of HDR is the intensity of light

Most HDR TVs and monitors offer about 12 stops of brightness, where each stop doubles the light intensity. This means that one specific shade of red (out of 16.7 million or even 1.073 billion colors) can now be shown anywhere along a huge brightness range, from very dark to super bright. The UHD Premium standard offers up to 15 stops and up to 1000 nits of ‘peak brightness’.

Of course I can’t show this to you in an image, but imagine that each of those ‘brightness stops’ contains the entire 10-bit RGB spectrum, going from almost zero nits to whatever the maximum brightness of your TV/monitor is (e.g. 1000 nits). Picture a neon sign in a dark alley: all the fidelity remains, both in the very dark areas and in and around the neon sign. The sign will actually shine into your living room, and the darkness will be as dark as it should be. It’s a pretty awesome experience.
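To put numbers on those stops: each stop is a doubling of light, so the usable contrast range grows exponentially with the stop count. A quick sketch (the 1000-nit peak is just an assumed example):

```python
# Each photographic "stop" doubles the light, so n stops span a 2**n ratio
# between the darkest and brightest levels a display can reproduce.
for stops in (10, 12, 15):
    print(f"{stops} stops -> {2 ** stops:,}:1 contrast range")

# With an assumed 1000-nit peak, a 12-stop display would reach down to about:
peak_nits = 1000
print(f"darkest level ~ {peak_nits / 2 ** 12:.2f} nits")  # ~0.24 nits
```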

Now that you understand what HDR can really do, you can see why it’s impossible to show an accurate HDR image or video on a non-HDR screen: the colors and contrast will often look washed out.
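To give a feel for why it looks washed out: all that extra luminance has to be squeezed into the small SDR range somehow. Below is a toy global tone-mapping curve (the classic Reinhard operator, x / (1 + x)); real TVs and renderers use more sophisticated curves, so treat this purely as an illustration:

```python
# Why HDR looks flat on SDR: the wide HDR luminance range must be squeezed
# into SDR's narrow range. The Reinhard curve x / (1 + x) is one simple
# global tone-mapping operator, used here only as a sketch.
def reinhard(nits: float, sdr_white: float = 100.0) -> float:
    """Map an HDR luminance in nits to a [0, 1) fraction of SDR output."""
    x = nits / sdr_white        # normalize against an assumed SDR white level
    return x / (1.0 + x)

for nits in (1, 50, 100, 500, 1000, 4000):
    print(f"{nits:>5} nits -> {reinhard(nits):.3f} of the SDR range")
# Everything above a few hundred nits lands in the top few percent of the
# SDR range: bright highlights get crushed together and look washed out.
```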

What are HDR10, HDR10+ and HDR10+ Adaptive, and what are the differences?

It’s pretty simple: HDR10 aims to produce up to 1000 nits of peak brightness, whereas HDR10+ supports up to 4000 nits (which is very, very bright). Both standards support 10-bit color depth. On top of that, HDR10 uses static metadata (one set of brightness values for the entire movie), while HDR10+ adds dynamic metadata that can change per scene or even per frame.

HDR10+ Adaptive means your TV has a built-in ambient light sensor and will dynamically increase or decrease the HDR effect and brightness based on its surrounding environment; a dark and cozy room is a whole different viewing experience than a bright living room on a sunny day. The TV combines that sensor reading with the dynamic metadata to optimize the image for each frame and scene.
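To make the static-vs-dynamic distinction concrete, here is a simplified sketch in Python; the field names are stand-ins inspired by HDR10’s MaxCLL/MaxFALL-style values, and the scenes and numbers are made up, so don’t read it as an actual file format:

```python
# Static metadata (HDR10): one record for the whole title.
# Dynamic metadata (HDR10+): a record per scene (or frame), so the TV can
# tone-map a dark alley differently from a sunlit exterior. Values made up.
from dataclasses import dataclass

@dataclass
class HdrMetadata:
    max_content_light_nits: float   # brightest pixel (akin to MaxCLL)
    max_frame_average_nits: float   # brightest frame average (akin to MaxFALL)

hdr10_static = HdrMetadata(max_content_light_nits=1000, max_frame_average_nits=400)

hdr10_plus_dynamic = {
    "scene_01_dark_alley": HdrMetadata(80, 20),
    "scene_02_neon_sign":  HdrMetadata(950, 120),
    "scene_03_daylight":   HdrMetadata(600, 350),
}
```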

What is Dolby Vision and how does it compare to HDR10?

Dolby Vision tries to be the absolute peak of HDR performance: it can show up to 12-bit color (4096(!) shades per channel) at up to 10,000 nits of peak brightness. But more importantly, it tries to dynamically optimize image quality for your specific device and platform to deliver the best possible picture. Obviously, there are very few TVs that support 12-bit color and can reach nit values of 10,000, but nonetheless, many standard HDR televisions are able to play Dolby Vision certified content (though there can be issues in Windows).
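For comparison with the 8- and 10-bit numbers earlier, the same shades-and-colors arithmetic at 12 bits:

```python
# Dolby Vision's 12-bit color depth, using the same math as before.
bits = 12
shades = 2 ** bits        # 4096 shades per channel
colors = shades ** 3      # all RGB combinations
print(f"{shades} shades per channel, {colors:,} total colors")
# -> 4096 shades per channel, 68,719,476,736 total colors (~68.7 billion)
```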

Alright, that’s it for now. I hope you learned something from this article! Personally, I use an OLED TV (LG C9) for all my media consumption; it has excellent HDR since each pixel is a light source of its own. What kind of monitor do you use, and do you often use HDR? I’d be happy to read about it in the comments!

