**UPDATE August 30th** For those who have read this before, more readers were interested in the 8K info below than they were in the HDMI 2.1 CTS announcement, so I simply changed the title.
On August 1 HDMI announced the release of the HDMI 2.1 compliance test specification (CTS). They did so with zero fanfare: no press release, just a short few lines on hdmi.org. So why does this announcement matter? Well, HDMI devices need to be compliant before they're released to market, and that can't happen if there isn't a compliance spec against which to test them. That's what's out now. It's the gateway to seeing HDMI 2.1 enabled products released.
I tweeted this news on Friday, expressing my excitement, and got a response saying "especially if you can't wait for 8K displays". The sarcasm wasn't lost on me. But it's not 8K that I'm excited about. Why, you may ask? Well, I crunched some numbers on visual acuity and viewing distance, which we'll take a look at before circling back to what it is about HDMI 2.1 that I am excited about.

**8K and Visual Acuity**
Say we're installing a 65" display. At 4K its pixels equate to just under 68 DPI, or around 135 DPI for 8K. How much of that we can see depends on two factors: how good our vision is, and how far away we're sitting. CEDIA CEB23A recommendations work out to a viewing distance of 8ft/2.4m for a 65" display. THX standards call for a 40° viewing angle, which actually works out a little closer, at 6.5ft/2m.
Normal human vision is expressed as 20/20. The very best vision is 20/8. Think fighter pilot. At the closest recommended distance (THX), a viewer with normal vision can resolve 44 DPI on the TV, whereas the superhuman with 20/8 vision can see up to 110 DPI. Move to the more pragmatic CEDIA distance and 20/20 vision can see 35.8 DPI, or 89.5 DPI with 20/8 vision. So even with 20/20 vision at the VERY close sitting distance recommended by THX, we can only see about half of the DPI required to fully resolve 4K. That is, it's closer to 1080p. In practice, many living spaces have a smaller TV (55"?), place a display of that size a little further away than proposed here, and many people have poorer than 20/20 vision. So in normal home viewing, any benefit purported for 8K really is questionable.
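The numbers above follow from one rule of thumb: 20/20 vision resolves detail subtending about one arcminute, and sharper acuity scales that angle down proportionally. Here's a small sketch reproducing the figures (the function name and structure are mine, not from any standard):

```python
import math

ARCMIN_RAD = math.radians(1 / 60)  # one arcminute in radians

def max_resolvable_dpi(distance_inches, snellen_denominator=20):
    """Highest pixel density a viewer can resolve at a given distance.

    20/20 vision resolves ~1 arcminute of detail; 20/8 resolves
    8/20 of an arcminute, and so on.
    """
    acuity_rad = ARCMIN_RAD * snellen_denominator / 20
    # A pixel is resolvable while it subtends at least `acuity_rad`,
    # so the limiting DPI is 1 / (distance * tan(acuity_rad)).
    return 1 / (distance_inches * math.tan(acuity_rad))

# THX distance for a 65" display: 6.5 ft = 78 inches
print(round(max_resolvable_dpi(78), 1))      # ~44.1 DPI with 20/20
print(round(max_resolvable_dpi(78, 8), 1))   # ~110.2 DPI with 20/8
# CEDIA distance: 8 ft = 96 inches
print(round(max_resolvable_dpi(96), 1))      # ~35.8 DPI with 20/20
```

Plugging in the 8ft CEDIA distance with 20/8 vision likewise gives the ~89.5 DPI quoted above, still short of the ~135 DPI an 8K 65" panel delivers.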
8K will benefit three main applications:
- VERY large screens, i.e. video walls.
- Large interactive screens. Turn the 65" display mentioned above into a touch screen and stand within arm's reach. Now we're talking!
- VR and AR, i.e. very small screens with ultra-high pixel density.
No hurry.

**HDMI 2.1 Benefits**
So back to the HDMI 2.1 CTS, and the two things I am excited about are eARC and a broadening of HDR.
Firstly, eARC. This is a complete re-imagining of the audio return channel (ARC). eARC will enable full hi-res immersive audio to travel upstream through HDMI from a display back to the AVR, for example from the Netflix or Prime app in a smart TV. It doesn't use CEC at all either, instead self-discovering and configuring with its own autonomous "heartbeat" that it issues system-wide. Expect this to be one of the first available features in HDMI 2.1 products, with Teledyne LeCroy and other test-instrument manufacturers already boasting test protocols to aid in the development of eARC-enabled products.
Secondly, HDR. HDMI 2.0 limits HDR to 4K/60 4:2:2, but seeing as there's not really any 4:2:2 content out there, in real terms it's 4K/60 4:2:0. To get 4K/60 HDR with 4:4:4 (e.g. gaming), HDMI 2.1 is a must, as the data rate reaches 24Gbps. This will comprise 4 channels at 6Gbps each, whereas existing 18Gbps HDMI cables are 3x 6Gbps. Any that are constructed with 4 identical lanes should, in theory, be able to support the 24Gbps level no problem. This simply means having the clock channel built the same as the main AV data channels. Many decent cables have that already.
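To see why 10-bit HDR pushes 4K/60 4:4:4 past HDMI 2.0's 18Gbps ceiling, a back-of-envelope rate calculation helps. This sketch assumes the standard CTA-861 4K/60 total timing of 4400 x 2250 (active pixels plus blanking) and uses TMDS's 10b/8b coding overhead as a rough proxy; HDMI 2.1's FRL signalling actually uses a different, more efficient line code, so treat this as illustrative, not a spec calculation:

```python
def link_rate_gbps(h_total, v_total, fps, bits_per_component,
                   subsampling="4:4:4"):
    """Rough uncompressed link rate, TMDS-style (10b/8b coding)."""
    components = {"4:4:4": 3, "4:2:2": 2, "4:2:0": 1.5}[subsampling]
    bits_per_pixel = bits_per_component * components
    raw_bps = h_total * v_total * fps * bits_per_pixel
    return raw_bps * 10 / 8 / 1e9  # 10b/8b coding overhead

# 4K/60, 8-bit 4:4:4: just fits within HDMI 2.0's 18 Gbps
print(round(link_rate_gbps(4400, 2250, 60, 8), 1))   # ~17.8 Gbps
# 4K/60, 10-bit 4:4:4 (HDR): beyond 18 Gbps, hence 4 x 6 Gbps lanes
print(round(link_rate_gbps(4400, 2250, 60, 10), 1))  # ~22.3 Gbps
```

The 10-bit figure lands between 18 and 24Gbps, which is why the 4-lane, 24Gbps tier is the first HDMI 2.1 speed that carries it.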
The other aspect of HDR that is VERY important is the introduction of support for dynamic metadata, where HDMI 2.0b was static only. Dynamic metadata allows the scene's dynamic range to swing up and down the absolute range to optimize scene-by-scene rendering, rather than fixing it for the full program as static metadata does. Dolby Vision has always been dynamic, but as HDMI didn't support it, Dolby instead embedded the metadata in a proprietary manner. While effective, it caused some interoperability challenges at times, particularly with any intermediary device which processed the video without the ability to handle the embedded magic. This includes AVRs, HDBaseT extenders with compression, scalers, etc. It will be interesting to see if Dolby reverts to standardized metadata.
Watch this space.