As the world shifts to streaming video, we might not care about HDMI in the future. But today we still need it to deliver pristine video from Blu-ray players, cable/satellite boxes, gaming systems and other HDMI-connected devices … sometimes over long distances.
Extending these native signals over category cable has been pretty straightforward over the years, thanks especially to technologies like HDBaseT. But the game has changed now that 4K content has arrived with HDR imaging.
The video signal itself requires more bandwidth, and HDR piles on further with greater bit depth plus metadata that has to survive the trip intact. How can you be sure that a video-distribution system supports all of this?
Integrators can expect to see many “4K with HDR” products at CEDIA Expo 2018, but this year it will be more important than ever to ask the right questions about their claims and certifications.
Here's why:
- HDBaseT has grown to dominate long-length HDMI connectivity.
- 4K/60 content has moved beyond incubation to become something we seriously need to support.
- HDR has hit the stage, and we all need to be aware of what constitutes an HDR signal.
The challenge is that the first two only get along if the signal is 4K/60 8-bit 4:2:0, but that combo inherently precludes HDR. Why? Because HDBaseT (at least until its third-generation chip comes out next year) is limited to the equivalent of 10Gbps HDMI, which often isn't enough for the richest video formats.
In fact, the only 4K HDR combinations that fit under 10Gbps are 4K/24-30fps at 4:2:0 or 4:2:2. Add a higher frame rate and/or 4:4:4 chroma and the required pipe jumps to 18Gbps or more.
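To see why, it helps to run the arithmetic. Below is a rough sketch in Python, assuming the nominal 4K pixel clocks, HDMI's 8b/10b TMDS overhead and the deep-color clock multipliers (standard HDMI 2.0 figures, nothing vendor-specific):

```python
# Rough HDMI TMDS link-rate estimator. Assumes nominal 4K pixel clocks
# (blanking included), 8b/10b TMDS encoding (10 bits per 8-bit byte),
# HDMI's half-rate 4:2:0 mode and deep-color clock multipliers.

PIXEL_CLOCK_MHZ = {24: 297.0, 30: 297.0, 60: 594.0}  # nominal 4K timings
DEPTH_MULT = {8: 1.0, 10: 1.25, 12: 1.5}             # deep-color scaling

def tmds_gbps(fps, chroma, bits):
    clock = PIXEL_CLOCK_MHZ[fps]
    if chroma == "4:2:0":
        clock *= 0.5 * DEPTH_MULT[bits]  # 4:2:0 rides at half pixel rate
    elif chroma == "4:4:4":
        clock *= DEPTH_MULT[bits]
    # 4:2:2 packs 10/12-bit into the same 24-bit container: no multiplier
    return clock * 3 * 10 / 1000         # 3 channels x 10 bits, in Gbps

for fmt in [(60, "4:2:0", 8), (60, "4:2:0", 10),
            (30, "4:2:2", 12), (60, "4:4:4", 8), (60, "4:4:4", 10)]:
    print(fmt, f"~{tmds_gbps(*fmt):.2f} Gbps")
```

Only the ~8.91Gbps results clear a 10Gbps-class link; everything else forces the jump to an 18Gbps-class (or better) connection.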
World's Quickest Primer on Chroma Subsampling
4:4:4 is the mathematical equivalent of RGB with full color information.
4:2:2 halves the horizontal color resolution to reduce bandwidth, and 4:2:0 halves both horizontal and vertical color resolution to reduce it even further.
4:2:0 is what’s used by DVD, Blu-ray (including 4K), TV broadcast, cable/satellite and streaming sources.
4:4:4 is available from gaming consoles and graphics adapters.
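To put numbers on those savings, count the samples carried for each 2x2 block of pixels. A quick illustrative sketch (the counts follow directly from the definitions above):

```python
# Samples per 2x2 block of pixels: always 4 luma (Y) samples, plus one
# Cb and one Cr sample for each chroma position the scheme retains.
CHROMA_POSITIONS_PER_2X2 = {"4:4:4": 4, "4:2:2": 2, "4:2:0": 1}

for scheme, positions in CHROMA_POSITIONS_PER_2X2.items():
    samples = 4 + 2 * positions  # 4 Y + (Cb + Cr) per retained position
    print(f"{scheme}: {samples} samples per block, "
          f"{samples / 12:.0%} of the full 4:4:4 payload")
```

In other words, 4:2:2 carries two-thirds of the raw pixel data and 4:2:0 carries half, which is why bandwidth-constrained sources lean on it.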
Compression Isn’t a Dirty Word
So it follows that when using HDBaseT for 4K/60 HDR, some manipulation of the signal is required to reduce the data load to the smaller sub-10Gbps pipe. There’s more than one way to do this, the most obvious being compression.
You may notice that the term “compression” is conspicuously avoided in marketing. For years we’ve been taught that it’s a bad thing, with one of HDMI’s hallmarks being its ability to deliver uncompressed video. The main concerns about compression are picture-quality degradation and latency, which throws off lip sync.
It’s 2018, however, and compression isn’t the dirty word it used to be. Modern light compression schemes, sometimes referred to as mezzanine compression, can deliver incredible image quality and imperceptible latency. It’s simply not an issue anymore. Well, with the good ones, that is.
The HDBaseT Alliance has ratified a compression scheme developed by VESA, called Display Stream Compression (DSC). It’s one of the best forms of visually lossless compression (VLC) available — the same as employed in DisplayPort 1.4 and HDMI 2.1.
Unfortunately, DSC has taken some time to productize into HDBaseT. Plus, it kills Dolby Vision’s embedded metadata.
In case you’re wondering what metadata is, entertainment technology commentator Debra Kaufman describes it aptly as being the digital version of a sticky note on a film canister: instructions for how the TV should render the image. In time, Dolby may well change to standardized metadata packaging to be compatible with all modes of HDMI 2.1, but until then DSC remains a problem for Dolby Vision.
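For a sense of scale, here's a back-of-the-envelope sketch of the compression ratio at stake. The ~8Gbps link payload is the figure commonly quoted for HDBaseT's video channel, and 3:1 is DSC's commonly cited visually lossless rating; treat both as illustrative assumptions:

```python
# Back-of-the-envelope: how hard must DSC work to fit an 18Gbps-class
# source through an HDBaseT link? Figures are illustrative assumptions.
source_gbps = 17.82  # 4K/60 8-bit 4:4:4, the "18Gbps" format
link_gbps = 8.0      # assumed HDBaseT video payload

print(f"required ratio: ~{source_gbps / link_gbps:.1f}:1")  # ~2.2:1
# DSC is commonly rated visually lossless up to about 3:1 (24bpp -> 8bpp),
# so a ~2.2:1 requirement sits comfortably inside that envelope.
```

That comfortable margin is a big part of why light, mezzanine-style compression can be transparent in practice.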
Compression’s Wild West
Many HDBaseT product manufacturers have sidestepped compression in favor of their own “data-reduction” schemes. The result is something of a free-for-all, resulting in many different, seemingly proprietary solutions for “4K/60 4:4:4”, which by the way we should interpret to mean 18Gbps HDMI.
One such data reduction method for HDBaseT is commonly called color space conversion (CSC), which in fact is really just chroma subsampling. It may also go by other names or variations, depending on the manufacturer.
Most manufacturers swear their implementations don’t “technically” use compression, and don’t kill Dolby Vision metadata. Sounds great! BUT there’s a catch, and boy is it a big one.
CSC works like this: Any 4K/60 signal at 4:4:4 or 4:2:2 gets converted to 4:2:0, and if it’s 10- or 12-bit it’s also down-converted to 8-bit. Similarly, at 4K/24-30fps 4:4:4, any 10- or 12-bit signal is down-converted to 8-bit. The result is the following maximum formats over the link:
- 4K/30 8-bit 4:4:4, which HDBaseT natively supports anyway without CSC
- 4K/60 8-bit 4:2:0, which HDBaseT also natively supports anyway
Down-sampling with CSC ensures that high-bandwidth formats can squeeze through an 8Gbps HDBaseT pipe. At the end of the line, the content is re-inflated for transmission as an all-new HDMI signal which (in theory) emulates the original 18Gbps source.
In the process, however, while HDR metadata may be preserved, the original 10- or 12-bit HDR signals are not.
Put another way, CSC keeps the recipe but throws away the ingredients. Once you’ve scrapped three quarters of the color and dynamic range, there’s no getting it back!
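To make the trade-off concrete, here's a minimal sketch of the down-conversion rules described above (the function and its name are hypothetical, not any vendor's actual implementation):

```python
# Minimal sketch of the CSC rules described above. Formats are
# (fps, chroma, bits) tuples; the logic mirrors the rules in the article.
def csc_link_format(fps, chroma, bits):
    if fps > 30:                        # 4K/50-60 sources
        return (fps, "4:2:0", 8)        # all collapse to 8-bit 4:2:0
    if chroma == "4:4:4" and bits > 8:  # 4K/24-30 deep-color 4:4:4
        return (fps, chroma, 8)         # keeps chroma, loses bit depth
    return (fps, chroma, bits)          # already fits: passed through

print(csc_link_format(60, "4:4:4", 12))  # (60, '4:2:0', 8)  HDR depth gone
print(csc_link_format(30, "4:4:4", 10))  # (30, '4:4:4', 8)  HDR depth gone
print(csc_link_format(30, "4:2:2", 12))  # (30, '4:2:2', 12) fits natively
```

However it's branded, for any of the 18Gbps-class formats the 10- and 12-bit samples that actually carry the HDR picture never make it to the other end.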
Certification Caveats, Best Practices
Certification can be a wonderful thing in this Wild West of 4K video distribution. Some third-party organizations will verify that HDBaseT extenders support 18Gbps transmission by measuring bits in and bits out.
It’s a great start, but keep in mind that these tests simply measure speed and throughput, not the quality of video transmitted.
Does that make the tests worthless? Quite the contrary. They are important because they validate the native ability of products to deliver a picture where otherwise there might not be one.
Where “survival of the image” is the objective, 18Gbps transmission tests are highly useful; however, it’s a different story for primary displays where HDR performance is expected. Something better is needed.
Best practice for connecting a primary display is a transmission line that can natively support at least 18Gbps without needing compression or alternative manipulation. Examples are plentiful:
- Native HDMI cable, passive or active — Check with the vendor about how far they can go at 18Gbps. Cables with the clock channel constructed the same as the three primary data channels may even be compatible with the 24Gbps level of HDMI 2.1 … when it comes along. That could unlock 4K/60 4:4:4 with 12-bit Dolby Vision.
- HDMI active optical cable (AOC) — Similar to above, with the bonus of being thinner and longer. Again, check with the vendor.
- Optical fiber infrastructure cable with adapters on each end — Unlike HDMI-terminated cable, which has definitive bandwidth limitations, optical multimode (OM) fiber can be terminated with adapters that provide an almost infinite upgrade path. Run a couple of OMx strands (duplex) and terminate them with LC-type connectors (for example), and then just swap out the boxes on each end as the need arises. The technology already exists to potentially run up to 400Gbps over a single OM3 link (16 channels of CWDM at 25Gbps each), so it won’t run out of steam.
Ask the Right Questions at CEDIA
It’s the integrator’s job to question and interpret what marketing and spec sheets claim, and to make informed decisions about the products going into a system.
When strolling the aisles at CEDIA Expo, don’t be shy about asking vendors to qualify their specs. A fat pipe won’t necessarily deliver rich images and important metadata, and compression isn’t necessarily a terrible thing. It can be a good and necessary tool in every integrator’s bag of tricks.