In my last blog we looked into the possible trip wires of 4K/60 4:4:4 HDR. Now the rabbit hole is a little deeper with HDBaseT, with its native bandwidth equivalent to 10Gbps in HDMI. To support formats requiring up to 18Gbps in HDMI, HDBaseT introduces compression to reduce the data load. That can actually be a good thing, as long as you know what you're dealing with and can design systems accordingly. Read on for the technical low-down...
Recapping the 4K 60Hz 4:4:4 HDR dilemma: it's currently a matter of picking any three of the four featured elements. Yes, we can get 4K/60 with HDR, but only if chroma is dropped to 4:2:2 or 4:2:0. Or we can have 4:4:4 if the frame rate is 30fps or less. But at 4K/60 4:4:4 it's only 8-bit, and that means no HDR. Something's got to give. Yet in just the two weeks since writing that blog, I was shown an advertisement for a product utilizing HDBaseT and claiming 4K/60 4:4:4 HDR. To further complicate things, it was also accompanied by a third-party endorsement as "Bandwidth Certified to 18Gbps." Hmmm.
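To put some numbers on that dilemma, here's a back-of-envelope sketch (my own arithmetic, not from any vendor spec sheet) of the aggregate TMDS rate HMDI needs for common 4K/60 formats, using the standard CTA-861 4K/60 timing (4400 x 2250 total, 594MHz pixel clock) and HDMI's 10/8 character-encoding overhead:

```python
# Approximate aggregate HDMI TMDS rate for a given format.
# Payload = pixel clock x bits per pixel; TMDS 8b/10b adds a 10/8 overhead.
def tmds_gbps(pixel_clock_mhz, bits_per_pixel):
    return pixel_clock_mhz * 1e6 * bits_per_pixel * 10 / 8 / 1e9

# 4K/60 per CTA-861: 4400 x 2250 total pixels x 60 Hz = 594 MHz
PCLK_4K60 = 4400 * 2250 * 60 / 1e6  # 594.0 MHz

print(round(tmds_gbps(PCLK_4K60, 24), 2))  # 4:4:4 8-bit  -> 17.82 (just fits 18Gbps)
print(round(tmds_gbps(PCLK_4K60, 12), 2))  # 4:2:0 8-bit  -> 8.91  (fits 10.2Gbps)
print(round(tmds_gbps(PCLK_4K60, 30), 2))  # 4:4:4 10-bit -> 22.28 (exceeds HDMI 2.0)
```

That last line is the dilemma in a nutshell: add the two extra bits per channel that HDR wants and 4K/60 4:4:4 blows past what an 18Gbps link can carry.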
So what's the problem?
Well, to answer that requires digging into some fundamentals of HDBaseT. Firstly, it's an 8Gbps transmission line which doesn't need the 20% encoding overhead that HDMI works with. Therefore, HDBaseT supports a data rate which equates to 10Gbps of HDMI. They state it as 10.2Gbps just to align with the marketing nomenclature of High Speed HDMI, which is totally legitimate as there's nothing going on in that top little bit anyway. All good. But what about 18Gbps?
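The 8-to-10 equivalence above is simple arithmetic; a one-liner (my sketch of the reasoning, nothing more) makes it explicit:

```python
# HDBaseT carries ~8Gbps of raw payload on the CATx link, with no 8b/10b
# encoding overhead. HDMI quotes its rate *after* that 10/8 overhead, so:
hdbaset_payload_gbps = 8.0
hdmi_equivalent_gbps = hdbaset_payload_gbps * 10 / 8
print(hdmi_equivalent_gbps)  # 10.0 -- stated as "10.2" for High Speed HDMI parity
```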
Valens, inventors of HDBaseT technology, are working on a native 16Gbps solution to support uncompressed 18Gbps HDMI, for which they anticipate a 2019 release. In the meantime they've ratified a compression solution to allow video formats which would otherwise require 10-18Gbps of HDMI bandwidth to fit through the current bandwidth capability of HDBaseT. The chosen solution is VESA's Display Stream Compression (DSC), a lightweight, very high performance codec with visually lossless image quality and imperceptible latency (I've calculated it in the order of 18μs). I'm all for that.
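For the curious, here's one way to arrive at a latency figure of that order. This is my own assumption about the derivation, not a published method: DSC is a line-based codec, so its latency amounts to buffering a small number of video lines.

```python
# Back-of-envelope, assuming DSC buffers on the order of 2-3 video lines
# (an assumption for illustration, not a spec value).
# CTA-861 4K/60 timing: 4400 total pixels per line at a 594 MHz pixel clock.
line_time_us = 4400 / 594e6 * 1e6   # ~7.4 microseconds per line
latency_us = line_time_us * 2.5     # ~18.5 us for ~2.5 lines of buffering
print(round(latency_us, 1))
```

Either way, a handful of microseconds is orders of magnitude below anything a viewer could perceive.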
But let's be clear: compression alone makes the signal load smaller; it does NOT make the pipe bigger. When employed in HDBaseT, the original 18Gbps HDMI signal is decoded at the start of the HDBaseT system, the video extracted and compressed, then sent over the CATx cable using regular HDBaseT transmission. The far-end receiver then decompresses the signal and builds what is effectively an all-new 18Gbps HDMI signal to output to its final destination. So if the system is tested for bandwidth at the far end, it's actually only the last connection that's being measured, not the whole transmission line.
It does, however, seem a great idea to validate support for 4K/60 4:4:4 or 4K/60 4:2:2 with HDR by testing the integrity and quality of the video itself, particularly after compression. What's it doing to the picture, and will we notice? Personally, I'd really like to see delivery of specific HDR formats being validated too, including HDR-10/10+, HLG and Dolby Vision.
The HDBaseT Alliance fully acknowledge the challenge of how to communicate the use of compression, and currently have a Working Group to define product feature and capability labelling guidelines. However, until they mandate and enforce such labelling, it’s at the discretion of manufacturers. Any HDBaseT Alliance adopters who are interested in participating in this working group should contact the Alliance for details.
Another consideration is that some manufacturers may choose to implement an alternative compression method to DSC, which may impact image performance, latency and HDR compatibility. While these different approaches can't be certified by the HDBaseT Alliance, manufacturers are free to innovate. A related approach to data reduction is color space conversion (CSC). Quick overview: 4K/60 4:2:0 fits through 9Gbps of HDMI, while 4K/60 RGB or 4:4:4 requires 18Gbps. Both are 8-bit. So converting an RGB/4:4:4 source to 4:2:0 is the easy way to cut the data enough to fit within the confines of HDBaseT, but it also limits the signal to 8-bit, which means no HDR.
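That halving from 18Gbps to 9Gbps falls straight out of the sampling structure; a quick sketch (my own illustration, counting samples per 2x2 pixel block) shows why:

```python
# Samples carried per 2x2 pixel block, relative to full 4:4:4 (12 samples).
# 4:4:4 keeps Y, Cb, Cr for all 4 pixels; 4:2:2 halves chroma horizontally;
# 4:2:0 halves chroma both ways, sharing one Cb/Cr pair per block.
def relative_rate(y_samples, cb_samples, cr_samples):
    return (y_samples + cb_samples + cr_samples) / 12

print(relative_rate(4, 4, 4))  # 4:4:4 -> 1.0
print(relative_rate(4, 2, 2))  # 4:2:2 -> ~0.67
print(relative_rate(4, 1, 1))  # 4:2:0 -> 0.5, i.e. 18Gbps drops to ~9Gbps
```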
It seems solutions are only limited by imagination. But then system design requires knowledge, and knowledge requires disclosure.
- Native uncompressed HDBaseT = 10Gbps HDMI.
- HDMI sources up to 18Gbps can be compressed for compatibility with HDBaseT bandwidth, i.e. up to 4K/60 4:4:4 8-bit SDR or up to 4K/60 4:2:2 12-bit HDR. But the transmission bandwidth is still equivalent to 10Gbps HDMI.
- Certifying bandwidth to 18Gbps is referring to the last HDMI hop after the HDBaseT stage, not the whole transmission line. I'll leave that to the reader to interpret.
Please don't think I'm critical of the compression approach; I'm not. In fact, I'm a fan of good compression and regard it as an inevitability. Even SMPTE and the Hollywood Post Alliance (HPA) are embracing it. But what I would like to see is disclosure of such information. Is compression employed, and if so, which type? The more transparent the information we get, the better the chances of designing and integrating the product into systems successfully. It's win-win: no surprises, and we can be comfortable with the "fear of obsolescence" conversation with clients.
For a comprehensive overview of HDBaseT technology, check out the 3hr HDBaseT Master Integrator course, on August 22 during Integrate Expo, Sydney, Australia, and Friday Sept 7 during CEDIA Expo, San Diego - details here