Last year I wrote a blog about HDR and the implications of delivering it through HDMI. I recently received a great message from Simon Fulstow of Sona Projects in the UK, which prompted the need for further clarification. He asks:
“I wonder if you could guide me to a resource (or resources) where I could really clarify the various video formats and data rates specifically as applicable to HDR and Dolby Vision. I understand the concept of resolution, refresh rate, colour depth and chroma sampling but am lost how this applies with HDR added in and how to apply this to video distribution products ... which claim to support video formats and resolutions that don't seem possible based on the data rates required.
We have a project at the moment where the client wants to ensure his system is Dolby Vision compatible - but trying to establish products that are, is proving to be an absolute minefield. [Some claim to] support 4K60, 4:4:4, HDR (which if I assume is 10bit has a data rate of just under 23Gbps) but not Dolby Vision (even though we could have a 4K60, 4:2:0: 12bit signal with a data rate of under 14Gbps) - Does Dolby Vision require a certain chroma sampling..?”
Thank you Simon! We both thought the fundamentals of our resulting conversation were worth sharing. Since that conversation, I've seen two more products highlighted in industry newsletters claiming to support 4K/60 4:4:4 with HDR, one of which included Dolby Vision. The problem with such claims is that they imply each of these parameters - 4K, 60fps, 4:4:4 chroma subsampling and HDR - is available together, but the current generation of products claiming such support can only deliver 3 of the 4 features at any given time. That is, 4K/60 4:4:4 but drop HDR, or 4K/30 4:4:4 with HDR (drop to 30fps), or 4K/60 4:2:2 or 4:2:0 with HDR (drop the 4:4:4).
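Simon's figures can be checked with the standard HDMI data-rate arithmetic. Here is a minimal sketch (assuming the CTA-861 total video timing of 4400x2250 pixels for 4K/60, i.e. active pixels plus blanking, and the 10/8 TMDS encoding overhead used by HDMI 2.0 and earlier):

```python
# Effective samples carried per pixel for each chroma subsampling scheme:
# 4:4:4 sends 3 full samples per pixel, 4:2:2 effectively 2, 4:2:0 effectively 1.5.
CHROMA_SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def hdmi_rate_gbps(h_total, v_total, fps, bit_depth, chroma):
    """Raw HDMI link rate in Gbps, including the 10/8 TMDS coding overhead."""
    pixel_clock = h_total * v_total * fps           # pixels per second, incl. blanking
    payload = pixel_clock * bit_depth * CHROMA_SAMPLES[chroma]
    return payload * 10 / 8 / 1e9                   # 8b/10b TMDS encoding

# 4K/60 total timing per CTA-861 is 4400 x 2250 (active 3840 x 2160 plus blanking)
print(hdmi_rate_gbps(4400, 2250, 60, 10, "4:4:4"))  # ~22.3 Gbps: "just under 23"
print(hdmi_rate_gbps(4400, 2250, 60, 12, "4:2:0"))  # ~13.4 Gbps: "under 14"
```

So Simon's arithmetic checks out: 4K/60 4:4:4 at 10-bit lands just under 23Gbps, while 4K/60 4:2:0 at 12-bit lands well under 14Gbps.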
So what to do?
Firstly, recapping the previous blog: there is no 8-bit HDR. The minimum bit depth required for true HDR is 10-bit. That's the "10" in HDR-10 or Samsung's HDR-10+. Dolby Vision is optimized for 12-bit, but can also work (really well, in fact) at 10-bit. Secondly, HDR including Dolby Vision can work with 4:4:4, 4:2:2 or 4:2:0 chroma subsampling. Update June 11, 2018 - There is some HDR content being delivered on an 8-bit stream, but this is to allow backwards compatibility. It's transitional, but not true HDR. Also, many current displays are dithering an 8-bit signal to produce an HDR-ish image (thanks to @Daniel Adams for that tip!).
- HDMI products supporting up to 9Gbps (often stated as 10.2Gbps "High Speed") can support up to 4K/30 HDR with 4:2:0 or 4:2:2 only. HDBaseT is fundamentally compatible at these rates too.
- HDMI products supporting up to 18Gbps (those advertised as "4K/60 4:4:4") can still only support 4K/60 HDR with 4:2:0 or 4:2:2.
- There are currently no products that support 4K/60 4:4:4 with HDR.
- Formats operating above 9Gbps but less than 18Gbps require compression to work through HDBaseT. And here's the catch - the VESA Display Stream Compression (DSC) system employed in some products can't (yet) support 4:2:0, while in others it can. It depends on the version of DSC: version 1.1 only supports 4:4:4, but DSC 1.2 adds 4:2:2 and 4:2:0. Confusing, huh?
- Update June 11, 2018 - I had previously stated that Dolby Vision can be supported over some compression methods. This was based on anecdotal info from some manufacturers that were claiming to do it, so that's what I reported. Technically it is possible, but the only method currently available uses H.265 (aka HEVC), and is limited to 10-bit. This is intended for internet streaming, where video and audio are delivered concurrently, so the latency of H.265 compression is not an issue. But for an HDMI extender (say, from an AVR to a display), where lip sync becomes a major consideration, that latency may prove impractical. If you find such a product, you'd have to try it and see, or at the very least have access to accurate specs in order to predict and prepare for such eventualities.
- At the time of writing, no compressed streams can support 12-bit Dolby Vision. HDBaseT systems with DSC enabled cannot support Dolby Vision in either 10- or 12-bit form. Check with the vendor if other proprietary data reduction methods are employed.
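Pulling the points above together, a short sketch can classify which 4K/60 combinations fit under the 18Gbps ceiling of today's "4K/60 4:4:4" products (again assuming CTA-861 total timing of 4400x2250 and TMDS 10/8 overhead):

```python
# Effective samples per pixel for each chroma subsampling scheme
CHROMA_SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def hdmi_rate_gbps(bit_depth, chroma, fps=60):
    # 4K total timing per CTA-861: 4400 x 2250 pixels including blanking,
    # with 10/8 TMDS encoding overhead (HDMI 2.0 and earlier)
    return 4400 * 2250 * fps * bit_depth * CHROMA_SAMPLES[chroma] * 10 / 8 / 1e9

for bits in (10, 12):
    for chroma in ("4:4:4", "4:2:2", "4:2:0"):
        rate = hdmi_rate_gbps(bits, chroma)
        verdict = "fits 18Gbps" if rate <= 18.0 else "needs HDMI 2.1 or compression"
        print(f"4K/60 {chroma} {bits}-bit: {rate:5.2f} Gbps -> {verdict}")
```

Running this shows exactly the pattern described above: every 4K/60 HDR combination at 4:2:2 or 4:2:0 squeezes under 18Gbps, while 4:4:4 at 10- or 12-bit does not.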
I've created a Google sheet to show data rates with different combinations. See the updated screen grab below, or check out the original sheet here.
Conclusion
Update June 11, 2018 - HDMI 2.1 products will come out in due course, which will resolve this issue somewhat. But until then, if you have a client asking for Dolby Vision support, it's important for you to know the capabilities and possible limitations of all options. Standard CATx, and particularly fiber infrastructure, can provide a scalable upgrade path if and when required, as it may entail just swapping out the boxes at each end. However, be aware that CATx options will inevitably involve compression, whereas we're only scratching the surface of the uncompressed bandwidth potential of fiber.
If pulling long active cables, including HDMI AOC, be aware of their data rate limits. The number and speed of the lasers used in an AOC determine its capability and future scalability. Those with 3x optical lanes at 6Gbps each produce 18Gbps total, which can support up to a maximum of 4K/60 12-bit 4:2:2 (but not 4:4:4) with Dolby Vision. However, many AOC solutions already in the market comprise 4x optical lanes: 3 are currently used for AV data, the 4th for the clock. If there are 4x 6Gbps lasers, the cable could theoretically support the upcoming 24Gbps level of HDMI 2.1, as HDMI 2.1 will re-purpose the clock channel as a 4th AV data lane and embed the clock. The good news is that the 24Gbps tier will be the most relevant for a long time yet, able to support up to 4K/60 4:4:4 with 12-bit Dolby Vision, all uncompressed. Brilliant! Some AOCs even use lasers up to 12Gbps per lane, which may prove to be fully compatible with the top tier of HDMI 2.1 at 48Gbps, unlocking further formats such as 8K up to 120fps. Please check with your vendor for details.
As for hardware, up to 30fps UHD Blu-ray with Dolby Vision is no problem, as long as your UHD-BD player doesn't convert the output to 4:4:4 but instead outputs the native 4:2:0 as it is on the disc. Gaming systems such as the Xbox One S/X do have a menu option to change the output to 4:2:2, giving a free ride (from 8- to 12-bit) for HDR-10 or Dolby Vision content. Stay tuned regarding Dolby Vision, as uptake of HDMI 2.1 and its inherent support for dynamic HDR metadata may make our lives easier. We'll have to wait and see....
If anyone has any questions, feel free to post a message here on Community, or contact me directly.