"8-bit HDR" Video

By David Meyer posted 03-08-2018 01:36

  


[Image: 8-bit cat]

I’ve written about High Dynamic Range (HDR) before, but I couldn’t help myself as I’ve just seen another article circulating which suggests “8-bit HDR” is a thing. It’s not. It’s a contradiction. But if it does exist out there in content land, it got me wondering whether this is why some people have seen HDR and been underwhelmed. Have you experienced that? If so, it might have been 8-bit video, in which case it wasn’t really HDR at all.

So why can 8-bit and HDR not coexist?

HDR is largely based on the work of Peter Barten in the Netherlands in the late 1990s. He found that human visual perception is capable of around 10,000:1 contrast in any given scene, which equates to roughly 13 to 14 stops (factors of 2). The Canadians then built upon his work to create a video system which became known as HDR.

By comparison, legacy video systems (now referred to in hindsight as SDR) are only capable of around 64:1, or 6 stops. Technical folks at the BBC have even reported it to be as low as 32:1, or 5 stops. Let’s just say it’s somewhere in that vicinity. Either way, it’s not at all exciting.

So the purpose of HDR is to deliver at least what human vision is capable of perceiving: a range of at least 10,000:1. This equates to approximately 700 levels from black to white. Anything at or above this can be considered HDR. 8-bit video offers 2⁸ = 256 levels, not even remotely close to HDR. 10-bit offers 2¹⁰ = 1,024 levels. Now we’re talking!
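If you want to check the arithmetic yourself, here’s a minimal Python sketch (my own illustration, using only the contrast figures and bit depths quoted above):

```python
import math

# Stops are factors of 2, so stops = log2(contrast ratio).
for label, contrast in [("HDR target", 10_000), ("typical SDR", 64), ("BBC's SDR figure", 32)]:
    print(f"{label}: {contrast}:1 is about {math.log2(contrast):.1f} stops")

# Code levels available at each bit depth: 2 ** bits.
for bits in (8, 10, 12):
    print(f"{bits}-bit video: {2 ** bits:,} levels")
```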

Bottom line: 8-bit video is completely incapable of delivering HDR, because it is inherently limited in dynamic range regardless of what special metadata accompanies it. Claiming 8-bit HDR is like boasting of 5.1-channel Atmos audio. A 10-bit signal is the essential minimum for HDR video, while 12-bit buys a ton of headroom and an even greater dynamic range.


Comments

03-09-2018 18:45

I'm glad you brought up the 4:2:0 issue, Damien. I did cover this in a previous blog, albeit more in the context of bandwidth. 4:2:0 certainly compromises color to some degree, but this matters way less than grayscale for the reasons covered by Dave Pedigo.

At 4K with 4:2:0, yes the color resolution is effectively 1080p. That's a good way to look at it, by the way - thank you. But this 1080p color res is pasted over a complete 2160p grayscale, so what we see is still full 4K. It then follows that 1080p 4:2:0 content in effect only has 540p color resolution. It's all relative.

4:4:4 is of course the best, but there aren’t many native sources; gaming and computer graphics... not much else. Most of what we watch is (unfortunately) 4:2:0. Personally I’d like to see 4:2:2 become the norm. That way we could get 12-bit without a bandwidth premium, and no reduction to vertical color resolution!
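To put rough numbers on that, here’s a quick sketch of my own (standard J:a:b subsampling definitions, nothing specific to this post):

```python
# Effective chroma (color) resolution for common subsampling schemes.
# 4:2:2 halves the horizontal chroma resolution; 4:2:0 halves it both
# horizontally and vertically; 4:4:4 keeps full resolution.

def chroma_resolution(width: int, height: int, scheme: str) -> tuple[int, int]:
    horizontal_divisor = {"4:4:4": 1, "4:2:2": 2, "4:2:0": 2}[scheme]
    vertical_divisor = {"4:4:4": 1, "4:2:2": 1, "4:2:0": 2}[scheme]
    return width // horizontal_divisor, height // vertical_divisor

for width, height, label in [(3840, 2160, "2160p"), (1920, 1080, "1080p")]:
    for scheme in ("4:4:4", "4:2:2", "4:2:0"):
        print(label, scheme, "->", chroma_resolution(width, height, scheme), "chroma samples")
```

Running it shows 2160p 4:2:0 carrying 1920 × 1080 chroma samples and 1080p 4:2:0 carrying 960 × 540, which is the “1080p color in a 4K picture” and “540p color in a 1080p picture” relationship described above.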


03-09-2018 07:21

Damien, while humans can perceive a 10,000:1 contrast, we do not do so with color. The eye has over 100 million rods, which see in black and white, and only roughly 7 million cones, which see color. Thus, while it is true that we are reducing the color resolution with chroma subsampling (4:2:2 or 4:2:0), the human eye typically does not detect the difference, particularly at 4:2:2.

03-09-2018 04:19

A nice, clear bit of info there, David.

Now to muddy it: 4:2:0 vs 4:4:4. If 4:2:0 is a grouping of 4 pixels, are you any better off than 1080p resolution, other than the fact that you’ll have more colour gradients from the 10-bit?