One of the vital signs for HDMI is the onboard supply voltage used to run the interface. It has been part of the specification since its inception and will likely never go away. As discussed countless times, the power on the bus is there for system operation; it was never intended to be a power source for ancillary products such as switchers, distribution devices, extenders and, of course, cables.
Yet time and again we're finding products that, as some in legal circles say, "touch your nose to the law." In other words, some HDMI transmission products can breach supply boundaries and slide by without being detected.
This sounds silly, but it is true, and one reason is that until now there has been no HDMI Rev 2.0 specification for active cable devices.
Without a spec to follow, HDMI cable producers can pretty much do what they want, with no real governing body to oversee them.
Why Is This So Important Now?
As the data rate increased from 10.2Gbps to 18Gbps, active electronics to compensate for insertion loss became a necessity for longer runs, beyond about 7 to 8 meters.
Any active component requires some source of supply power. Now throw Rev 2.1 into the mix, with its 48Gbps data rate and higher insertion loss numbers that limit passive distance by a huge amount.
In the past, some products had high supply-current demands, north of 250 milliamps (a quarter of an amp). The minimum current the HDMI specification requires is very small: only 55mA.
It becomes pretty obvious that any ancillary HDMI device pulling more than 55mA off the bus poses a high degree of risk: it could starve the interface and shut it down, leaving the user with a blank screen.
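The risk here is simple current arithmetic. A minimal sketch, using the 55mA spec-minimum figure cited above; the device draws in the examples are illustrative, not measured values:

```python
# Minimal sketch of an HDMI +5V bus power-budget check.
# 55mA is the spec-minimum source supply current cited in the article;
# the device draws below are illustrative examples only.

HDMI_5V_MIN_SUPPLY_MA = 55  # minimum current a source must provide

def bus_is_starved(device_draw_ma: float,
                   source_supply_ma: float = HDMI_5V_MIN_SUPPLY_MA) -> bool:
    """True if the attached device pulls more than the source can supply."""
    return device_draw_ma > source_supply_ma

# A legacy active cable pulling 250mA against a spec-minimum source:
print(bus_is_starved(250))        # True  -> blank-screen risk
# The same cable on a more generous source supplying 500mA:
print(bus_is_starved(250, 500))   # False
```

The point of the sketch is that a source meeting only the bare minimum leaves no room for any active product that harvests its power from the bus.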
We see a wide variety of products being made for HDMI, some active and some passive. The ones that are active with no option for external power can pull some very high currents. This goes for both copper and fiber products.
Then there are those that have the option to bring in external power, eliminating any risk of starving the interface.
Historically, the saving grace in all this has been the manufacturers that build source products (Blu-ray players, set-top boxes, AVRs, etc.), many of which provide more current than the HDMI minimum calls for, and that's a good thing for everyone.
But the manufacturers don't have to do that. More power spells more money, so there are limits to how much power any company builds into its device. As long as the supply current is more than 55mA, every manufacturer is well within the specified limits.
We’ve really been pretty spoiled with this aspect, but times are changing.
DPL receives support calls on a daily basis, but recently they've increased with the influx of AOC (active optical cable) or fiber transmission products. These are no different from any other active device in that they too require some supply voltage.
Many such products use a power-harvesting circuit design that reduces the current needed to operate, typically to less than 100mA.
So Where Are the Problems Coming From?
A closer look at these failed cable products shows that some newer devices in the HDMI source market have reduced the current available on the bus to as low as 150mA. This was a time bomb waiting to go off.
If source manufacturers know they can pull their supply currents down to as low as 55mA, why add more cost to the product?
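The headroom math behind these failures can be sketched in a few lines, using the figures mentioned here (150mA newer sources, roughly 100mA harvesting AOCs, 250mA legacy active products); actual products vary:

```python
# Sketch of +5V bus headroom, using the article's illustrative figures.

def headroom_ma(source_supply_ma: float, device_draw_ma: float) -> float:
    """Remaining +5V bus current once the attached device takes its share.
    Negative headroom means a starved interface and, likely, a blank screen."""
    return source_supply_ma - device_draw_ma

# A ~100mA harvesting AOC on a newer 150mA source just squeaks by:
print(headroom_ma(150, 100))   # 50
# A legacy 250mA active device on that same source starves the bus:
print(headroom_ma(150, 250))   # -100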
Why the sudden change? One reason may be HDMI Rev 2.1. Since it requires a huge increase in data rate, one would think an increase in power would be only natural, because cable lengths will shrink substantially and the need for active products will grow.
But no: HDMI hung in there and did not change it. The issue is invisible to the user, since supply current is one spec that is seldom published. It is one to keep an eye on.