Networking & Cables

HDMI 2.1 Spec Officially Released: A Primer

HDMI 2.1 spec calls for 48Gbps bandwidth supporting HDR, wider color gamut, 8K @ 60Hz, 4K @ 120Hz and 10K for commercial applications.



Robert Archer · November 29, 2017

The much-anticipated HDMI 2.1 specification has been released by HDMI Forum, Inc. The new spec, which HDMI Forum announced was coming during CES 2017 with only a few details, is now available to all HDMI 2.0 adopters. The new version supports a range of A/V formats, including 8K video at 60Hz and 4K video at 120Hz. The 48Gbps bandwidth, one of the few details leaked earlier this year, remains in the final spec.

“The HDMI Forum’s mission is to develop specifications meeting market needs, growing demands for higher performance, and to enable future product opportunities,” says Robert Blanchard of Sony Electronics, president of the HDMI Forum.

New HDMI 2.1 Specification: Backwards Compatibility

Backwards compatible with earlier versions of the HDMI format, the new HDMI 2.1 specification was developed by the HDMI Forum’s Technical Working Group.

The HDMI 2.1 Compliance Test Specification (CTS) will be published in stages throughout the first three quarters of 2018. HDMI Forum says it will notify adopters as CTS materials are released.


Increased High-Definition Compatibility

HDMI 2.1 supports 4K at 120Hz, 8K at 60Hz, and resolutions up to 10K for commercial A/V applications.

The format includes high dynamic range (HDR) support to help ensure improved video performance, with greater depth, detail, brightness, contrast and wider color gamut capabilities on a scene-by-scene or frame-by-frame basis.

Other highlights include:

  • Ultra High Speed HDMI Cable supports 48Gbps bandwidth for uncompressed HDMI 2.1 support, with low EMI emission and backwards compatibility with earlier HDMI specifications.
  • The format’s Enhanced Audio Return Channel (eARC) simplifies connectivity and helps ensure compatibility between audio devices and upcoming HDMI 2.1 products.
  • Improved refresh rates for smoother images and more seamless transitions for gaming, movies and video content.
  • Quick Media Switching (QMS) for movies; this feature eliminates delays that result in blank screens before content is displayed.
  • Quick Frame Transport (QFT) reduces latency for an improved gaming experience.
  • Auto Low Latency Mode (ALLM) enables the ideal latency setting to be selected automatically for smooth, lag-free, uninterrupted viewing.
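As a rough sanity check on these figures, the uncompressed pixel rate implied by each format can be estimated. This is only a back-of-envelope sketch: it counts active pixels at assumed bit depths and ignores blanking intervals and link-encoding overhead, so real link budgets run higher.

```python
# Back-of-envelope uncompressed video bandwidth (active pixels only).
# Real HDMI links also carry blanking intervals and encoding overhead,
# so these figures understate the actual required line rate.

def raw_gbps(width, height, fps, bits_per_pixel):
    """Active-pixel payload rate in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

# 4K @ 120Hz, assuming 10-bit RGB (30 bits/pixel)
uhd4k_120 = raw_gbps(3840, 2160, 120, 30)   # ~29.9 Gbps

# 8K @ 60Hz, assuming 8-bit RGB (24 bits/pixel)
uhd8k_60 = raw_gbps(7680, 4320, 60, 24)     # ~47.8 Gbps

print(f"4K @ 120Hz, 10-bit: {uhd4k_120:.1f} Gbps")
print(f"8K @ 60Hz,   8-bit: {uhd8k_60:.1f} Gbps")
```

Even before blanking and overhead, 8K at 60Hz sits near the 48Gbps ceiling, which is why higher bit depths at 8K lean on chroma subsampling or stream compression rather than fully uncompressed transport.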

To learn more, visit the HDMI Forum at the upcoming 2018 Consumer Electronics Show (CES) in Las Vegas on Jan. 9-12 in LVCC South Hall 1 booth 20542.



  About the Author

Bob is an audio enthusiast who has written about consumer electronics for various publications within Massachusetts before joining the staff of CE Pro in 2000. Bob is THX Level I certified, and he's also taken classes from the Imaging Science Foundation (ISF) and Home Acoustics Alliance (HAA). Bob also serves as the technology editor for CE Pro's sister publication Commercial Integrator. In addition, he's studied guitar and music theory at Sarrin Music Studios in Wakefield, Mass., and he also studies Kyokushin karate at 5 Dragons in Haverhill, Mass. Have a suggestion or a topic you want to read more about? Email Robert at [email protected]





Comments

Posted by Eyal Kattan on December 3, 2017

Just like @NickJ wrote, the human eye cannot really see the difference between 1080 and 4K.
The benefit of higher resolution really shows on larger screens. Same as the megapixel myth in photography.
Taking a picture with a higher-resolution camera makes no difference if you are looking at the picture on a small screen or print. You only start to see the difference as you blow the picture up to a larger size; only then do you need the additional pixels to fill the larger frame. But then again, larger prints are meant to be viewed from further away, so you end up with basically the same experience.

That being said, I am not sure why HDMI continues to be developed while other, more advanced and less limited technologies such as fiber or HDBaseT (including IP) are at the forefront of commercial applications. Maybe it’s time for manufacturers to start implementing these technologies inside the electronic equipment rather than creating higher specs for expensive cables that are limited to 50ft in the best case…
(HINT: Look at high-end projectors…they already do this…)

Posted by NickJ on December 3, 2017

It can be mathematically proved (take the arc tangent of the angular maximum resolution of the human retina with one side of the triangle being the distance to the screen) that two lines of a 4K resolution image on a 60 inch screen will fall within the “same” smallest spot the eye can see as one line of a 1080 image at 8 feet.

So, no one can see a difference in 1080 and 4K at 8 feet.  And they won’t see any difference for 8K either. (There will just be 4 lines within that same maximum resolution spot.)

Closer than 8 feet, sure.  At 2 feet 4K looks better (and 8k will look even better) but surely no one watches TV from 2 feet.

I’m an engineer and have learned calculations work but only when you haven’t overlooked something so I bought a 4K TV and placed it overlapping my 1080 TV (same size and brand) and my wife and I watched it for months and never saw any difference from a normal viewing distance (7 to 8 feet).  Of course it looked better if we walked up close.

As engineers we used to use LOGIC when designing things for consumers but “marketers” have taken over as our managers and now tell us to produce “absurd” products just because they can convince consumers to WANT them even though they provide zero benefit to the consumer.  48 Gbs is absurd and actually hurts consumers since those data rates don’t benefit the quality of the image they will actually “see” (at a normal viewing distance) but taxes the technology to the point some worthwhile things can’t be done.

Example: My company has been producing lip-sync correction products for home cinema for over 10 years and all previous products align lipsync by delaying the audio.  We can’t help if the audio arrives delayed or a speaker system delays the audio more than the video so we were working on adding video delay capability to our next generation.  We determined we could do it cost effectively for 1080 but 4K drove the speed required past most FPGA’s and 8K isn’t even possible with today’s technology. So, I ask you do you think consumers benefit more from 8K images that look the same to them at normal viewing distances while putting up with mind boggling lip-sync error?  I’d take 1080 with perfect lip-sync any day but we’ve had a 1080 HDMI version of our product for almost 3 years that we haven’t announced because we think the market has moved on to 4K and views 1080 as obsolete and our chip vendor still doesn’t have their 4K transceiver in production.  I don’t even know if they will choose to produce an 8K version and we (Felston) may not ever produce even a 4K version if the market has moved to 8K before we even get a 4K chip in production.

Who benefits from this? Certainly not the consumer who is watching TV from a normal viewing distance.
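NickJ’s viewing-distance geometry above can be sketched numerically. The script below is a rough model using his figures (a 16:9 60-inch panel viewed from 8 feet) and the commonly cited ~1 arcminute limit for normal visual acuity; it computes the angle a single pixel subtends at the eye.

```python
import math

def pixel_arcmin(diag_in, horiz_pixels, distance_in, aspect=(16, 9)):
    """Angle subtended by one pixel (in arcminutes) at the given distance."""
    w, h = aspect
    screen_width = diag_in * w / math.hypot(w, h)   # 60" diagonal -> ~52.3" wide
    pixel_width = screen_width / horiz_pixels
    return math.degrees(math.atan(pixel_width / distance_in)) * 60

# 60-inch 16:9 screen viewed from 8 feet (96 inches)
p1080 = pixel_arcmin(60, 1920, 96)   # ~0.98 arcmin -- right at the ~1' acuity limit
p2160 = pixel_arcmin(60, 3840, 96)   # ~0.49 arcmin -- below what the eye resolves

print(f"1080p pixel: {p1080:.2f} arcmin")
print(f"4K pixel:    {p2160:.2f} arcmin")
```

At that distance a 1080p pixel already sits at roughly the acuity limit, so the extra 4K pixels fall inside the same resolvable spot — consistent with the comparison NickJ describes.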

Posted by jmcdermott1678 on November 30, 2017

It’s refreshing to see these specs now and a plan for compliance testing, versus the fiasco that was the introduction of 4K and developing standard over the course of a few years.  Great article and slideshow materials.  Now on to waiting for solutions so we can properly prepare homes for the future.

Posted by Robert Archer on November 30, 2017

Hi Kscheitler, we incorrectly used the term GHz, it should be Gbps. We have corrected the mistake and the story should reflect the edit shortly. Sorry for the inconvenience.

Posted by kscheitler msdhometheater.com on November 30, 2017

I’ve seen a lot of articles lately that quote both Ghz and Gbps, this one included. I would assume that they are referring to the Gbps potential of the cable and Ghz is used incorrectly. Is there a reason for this? Can anyone shed any light?
