Networking & Cables

HDMI 2.1 Spec Officially Released: A Primer

The HDMI 2.1 spec calls for 48Gbps bandwidth supporting HDR, wider color gamut, 8K @ 60Hz, 4K @ 120Hz, and 10K for commercial applications.

5 Comments
Posted by kscheitler (msdhometheater.com) on November 30, 2017

I’ve seen a lot of articles lately that quote both GHz and Gbps, this one included. I would assume they are referring to the Gbps potential of the cable and that GHz is used incorrectly. Is there a reason for this? Can anyone shed any light?

Posted by Robert Archer on November 30, 2017

Hi Kscheitler, we incorrectly used the term GHz; it should be Gbps. We have corrected the mistake, and the story should reflect the edit shortly. Sorry for the inconvenience.

Posted by jmcdermott1678 on November 30, 2017

It’s refreshing to see these specs now, along with a plan for compliance testing, versus the fiasco that was the introduction of 4K, whose standard developed over the course of a few years. Great article and slideshow materials. Now on to waiting for solutions so we can properly prepare homes for the future.

Posted by NickJ on December 3, 2017

It can be shown mathematically (take the arctangent of the line pitch over the viewing distance and compare it to the maximum angular resolution of the human retina, about one arcminute) that two lines of a 4K image on a 60-inch screen fall within the same smallest spot the eye can resolve as one line of a 1080p image at 8 feet.

So no one can see a difference between 1080p and 4K at 8 feet, and they won’t see any difference with 8K either. (There will just be four lines within that same smallest resolvable spot.)

Closer than 8 feet, sure. At 2 feet 4K looks better (and 8K will look even better), but surely no one watches TV from 2 feet.
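
For anyone who wants to check the numbers, here is a rough sketch of that calculation in Python, assuming about one arcminute of visual acuity (the usual 20/20 figure) and a 16:9 panel:

import math

diagonal_in = 60.0                               # 60" screen
distance_in = 8 * 12                             # 8 feet, in inches
height_in = diagonal_in * 9 / math.hypot(16, 9)  # ~29.4" tall for 16:9

for name, rows in [("1080p", 1080), ("4K", 2160), ("8K", 4320)]:
    pitch_in = height_in / rows                  # height of one scan line
    arcmin = math.degrees(math.atan2(pitch_in, distance_in)) * 60
    print(f"{name}: one line subtends {arcmin:.2f} arcmin")

# Prints roughly 0.98 arcmin for 1080p (right at the acuity limit),
# 0.49 for 4K, and 0.24 for 8K -- i.e. two 4K lines (or four 8K lines)
# fall inside one just-resolvable spot at 8 feet, as claimed above.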

I’m an engineer, and I’ve learned that calculations work only when you haven’t overlooked something, so I bought a 4K TV, placed it overlapping my 1080p TV (same size and brand), and my wife and I watched for months without ever seeing a difference from a normal viewing distance (7 to 8 feet). Of course it looked better if we walked up close.

As engineers we used to use LOGIC when designing things for consumers, but “marketers” have taken over as our managers and now tell us to produce “absurd” products just because they can convince consumers to WANT them, even though they provide zero benefit. 48 Gbps is absurd and actually hurts consumers: those data rates don’t improve the quality of the image viewers will actually “see” at a normal viewing distance, but they tax the technology to the point that some worthwhile things can’t be done.
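
For what it’s worth, 48 Gbps is roughly the raw pixel payload of 8K at 60Hz. A back-of-the-envelope sketch, ignoring blanking intervals, audio, and line-coding overhead, and assuming 8-bit RGB:

def raw_gbps(width, height, fps, bits_per_pixel=24):
    # Uncompressed pixel payload only; a real HDMI link carries more
    # (blanking, audio, encoding overhead) or less (chroma subsampling,
    # DSC compression) than this figure.
    return width * height * fps * bits_per_pixel / 1e9

print(raw_gbps(1920, 1080, 60))    # ~3.0 Gbps  (1080p60)
print(raw_gbps(3840, 2160, 120))   # ~23.9 Gbps (4K120)
print(raw_gbps(7680, 4320, 60))    # ~47.8 Gbps (8K60)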

Example: My company has been producing lip-sync correction products for home cinema for over 10 years, and all previous products align lip-sync by delaying the audio. That can’t help when the audio arrives late, or when a speaker system delays the audio more than the video, so we were working on adding video delay capability to our next generation. We determined we could do it cost-effectively for 1080p, but 4K drove the speed required past most FPGAs, and 8K isn’t even possible with today’s technology.

So I ask you: do consumers benefit more from 8K images that look the same to them at normal viewing distances while putting up with mind-boggling lip-sync error? I’d take 1080p with perfect lip-sync any day, but we’ve had a 1080p HDMI version of our product for almost 3 years that we haven’t announced, because we think the market has moved on to 4K and views 1080p as obsolete, and our chip vendor still doesn’t have their 4K transceiver in production. I don’t even know if they will choose to produce an 8K version, and we (Felston) may never produce even a 4K version if the market has moved to 8K before we get a 4K chip into production.
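
To put rough numbers on the video-delay problem, here is a sketch with hypothetical figures (the 100 ms delay and uncompressed 8-bit RGB frames are assumptions for illustration, not Felston specs):

def delay_buffer_mb(width, height, fps, delay_ms, bits_per_pixel=24):
    # Delaying video by delay_ms means holding fps * delay_ms / 1000 whole
    # frames while simultaneously writing and reading at full pixel rate.
    frame_mb = width * height * bits_per_pixel / 8 / 1e6
    return frame_mb * fps * delay_ms / 1000

print(delay_buffer_mb(1920, 1080, 60, 100))   # ~37 MB  for 100 ms at 1080p60
print(delay_buffer_mb(3840, 2160, 60, 100))   # ~149 MB for 100 ms at 4K60
print(delay_buffer_mb(7680, 4320, 60, 100))   # ~597 MB for 100 ms at 8K60

Each resolution step quadruples both the buffer size and the memory bandwidth the hardware has to sustain, which is consistent with NickJ’s point about 4K outrunning most FPGAs.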

Who benefits from this? Certainly not the consumer who is watching TV from a normal viewing distance.

Posted by Eyal Kattan on December 3, 2017

Just like @NickJ wrote, the human eye cannot really see the difference between 1080p and 4K at normal viewing distances.
The benefit of higher resolution really shows only on larger screens. It’s the same as the megapixel myth in photography.
Taking a picture with a higher-resolution camera makes no difference if you are looking at it on a small screen or print. You only start to see the difference as you blow the picture up to a larger size; only then do you need the additional pixels to fill the larger frame. But then again, larger prints are meant to be viewed from farther away, so you end up with basically the same experience.
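
The same arithmetic works for prints. A rough sketch, again assuming about one arcminute of visual acuity:

import math

ARCMIN = math.radians(1 / 60)   # ~1 arcminute of visual acuity

def max_useful_ppi(viewing_distance_in):
    # Pixels finer than the smallest resolvable spot at this distance
    # are wasted on the eye.
    return 1 / (viewing_distance_in * math.tan(ARCMIN))

for dist in (12, 24, 60):
    print(f'{dist}" away: ~{max_useful_ppi(dist):.0f} PPI')

# ~286 PPI at 12" (the familiar ~300 PPI print rule), ~143 PPI at 24",
# ~57 PPI at 60" -- a larger print viewed proportionally farther away
# needs no more megapixels than the small one.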

That being said, I am not sure why HDMI continues to be developed while other, more advanced and less limited technologies such as fiber or HDBaseT (including IP) are at the forefront of commercial applications. Maybe it’s time for manufacturers to start implementing these technologies inside the electronic equipment rather than creating higher specs for expensive cables that are limited to 50 ft at best…
(HINT: Look at high-end projectors…they already do this…)
