I’ve written about High Dynamic Range (HDR) before, but I couldn’t help myself: I’ve just seen another article circulating which suggests “8-bit HDR” is a thing. It’s not. It’s a contradiction. But if it does exist out there in content land, it got me wondering whether this is the reason some people have seen HDR and come away underwhelmed. Have you experienced that? If so, it might have been 8-bit video, in which case it wasn’t really HDR at all.
So why can 8-bit and HDR not coexist?
HDR is largely based on the work of Peter Barten in the Netherlands in the late 1990s. He found that human visual perception can handle a contrast of around 10,000:1 within a given scene, which works out to a little over 13 stops (a stop being a factor of 2). The Canadians then built upon his work to create the video system that became known as HDR.
By comparison, legacy video systems — now referred to in hindsight as SDR — are only capable of around 64:1, or 6 stops. Technical folks at the BBC have even reported it to be as low as 32:1, or 5 stops. Let’s just say it’s somewhere in that vicinity. Either way, it’s not at all exciting.
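If you want to check the arithmetic yourself, the stop count is just the base-2 logarithm of the contrast ratio. Here’s a quick Python sketch using the figures quoted above (the labels are mine, not anything from a standard):

```python
import math

def stops(contrast_ratio: float) -> float:
    """Convert a contrast ratio (brightest:darkest) into photographic stops,
    i.e. the number of doublings between the darkest and brightest level."""
    return math.log2(contrast_ratio)

for label, ratio in [("Human vision per Barten", 10_000),
                     ("Typical SDR", 64),
                     ("BBC's lower SDR estimate", 32)]:
    print(f"{label}: {ratio}:1 is about {stops(ratio):.1f} stops")

# Human vision per Barten: 10000:1 is about 13.3 stops
# Typical SDR: 64:1 is about 6.0 stops
# BBC's lower SDR estimate: 32:1 is about 5.0 stops
```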
So the purpose of HDR is to deliver at least what human vision is capable of perceiving: a range of at least 10,000:1. That equates to approximately 700 levels from black to white, and anything at or above this can be considered HDR. 8-bit video gives 2⁸ = 256 levels, not even remotely close to HDR. 10-bit gives 2¹⁰ = 1,024 levels. Now we’re talking!
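To make that comparison concrete, here’s a small Python sketch that lines up the code values each bit depth provides against the roughly 700 levels mentioned above (700 is the approximation I’m quoting here, not a figure from a formal standard):

```python
def levels(bit_depth: int) -> int:
    """Number of distinct code values a given bit depth can represent."""
    return 2 ** bit_depth

HDR_LEVELS_NEEDED = 700  # approximate level count for a 10,000:1 range, as quoted above

for bits in (8, 10, 12):
    n = levels(bits)
    verdict = "enough" if n >= HDR_LEVELS_NEEDED else "not enough"
    print(f"{bits}-bit: {n:,} levels -> {verdict} for HDR")

# 8-bit: 256 levels -> not enough for HDR
# 10-bit: 1,024 levels -> enough for HDR
# 12-bit: 4,096 levels -> enough for HDR
```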
Bottom line: 8-bit video is completely incapable of delivering HDR, because its dynamic range is inherently limited, regardless of what special metadata accompanies it. Claiming 8-bit HDR is like boasting about 5.1-channel Atmos audio. A 10-bit signal is the essential minimum for HDR video, while 12-bit buys a ton of headroom and an even greater dynamic range.