Stop Using HDMI
Updated: Feb 16, 2018
By: Matt Sakatos
If Brick Tamland were a video standard, he would be HDMI. In this post, I'll give some interesting history on how HDMI was created, outline its myriad issues, and offer ways to solve (or prevent) your HDMI woes, hopefully saving you time, money, and stress.
It was a dark and stormy night... What? Isn't that how all horror stories begin?
HDMI is, in a way, a technical horror story. To understand why, we need to start with the standard it was created from: one of the worst video standards in history, DVI.
Back when the computer industry was beginning to transition from analog to digital, it needed a new digital video standard to replace the analog VGA (RGBHV) connection that was popular at the time. So, a group of organizations and individuals got together to create a new standard called the Digital Visual Interface (DVI).
To make a long story short, DVI was the result of death... I mean “design” by committee. To keep everyone happy, it took on many different forms. It could run analog video, digital video, and even a mixture of both analog and digital. There were FIVE different types of connectors, and it was difficult to tell which ones worked with which components. It was quickly apparent that DVI was created with an identity crisis, and it only got worse.
The different connectors for various DVI formats
When designing DVI, one major issue was that the engineers and designers did not consider longer cable runs. In their world, computer monitors lived just a few feet away from the computer, so longer distances didn't become a factor in the process. Their solution for longer cable runs, which was basically just an afterthought, was simply to run DVI over twisted-pair cable (also known as Cat 5).
Then, in 2002, along came the content providers and movie studios. They needed a digital standard for consumer televisions and media players that included audio (DVI did not have embedded audio options) and copyright protection for their content. This was a perfect time to create a new and better digital video standard. Alas, they decided to simply improve upon DVI to create their new format, the High-Definition Multimedia Interface (HDMI).
This new group, called the HDMI founders, actually took big strides to improve DVI. They added embedded audio and simplified it to a purely digital signal with a single type of port/connector. While that sounds like a decent improvement, all they really did was put some lipstick on the pig and make it lose a few pounds. The underlying issue was how the physical signal traveled down the wires: it had to run on 100 ohm twisted pairs, and that proved to be one of the biggest problems.
To understand the situation fully, it helps to look at a contrasting standard the broadcasting industry created to solve the same need. In the late 1980s, broadcasters also needed a new digital standard to send video and audio reliably from point to point. The result was called the Serial Digital Interface (SDI). Using one coaxial cable, various forms of SDI could send video, audio, and data hundreds of feet without the need for any additional hardware repeaters or boosters. It was robust, simple (one cable, one connector), and versatile.
The technical details involved with HDMI, and running uncompressed video over twisted pairs, can get nerdy and detailed. So if you would like to read more, check out the notes at the bottom of this article. I won't go into all those details here, but I will provide a simple illustration to help you understand the differences between HDMI (video over balanced twisted pairs) and SDI (video over unbalanced coax).
Imagine a long tunnel. This is your video cable. Now, imagine a bunch of boxes. Each one holds a single frame of video. Put all of those boxes on a train, where box 1 (frame 1) goes in car 1, box 2 (frame 2) goes in car 2, and so on. As you send that train through the tunnel, it comes out the other end in the same order and condition as it entered the tunnel. The links between the train cars assure the cars can't get ahead or behind, and that they always stay the correct distance apart. This is an example of how SDI and serial signals work. The cars carrying the boxes are the main signal, and the links between the cars are the “clock signal”, which keeps everything in sync.
Now imagine another tunnel, but instead of a train track it's a two-way road. Also, this time divide up each frame of video into 4 separate boxes: 3 color channels (red, green, blue) and a clock signal. Put those 4 boxes on the back of 4 individual bicycles, piloted by couriers being paid minimum wage, and send them through the tunnel side by side in their own lanes. Couriers 1-4 (frame 1) leave first, couriers 5-8 (frame 2) leave next, and so on. Most of the time, everything runs smoothly. Couriers 1-4 come out the other end first, followed by couriers 5-8, etc. However, the chances of problems happening in the tunnel go way up. On their way through the tunnel, maybe a courier slides on some gravel or loses their balance momentarily, jostling a box or even losing a box in the process. Or maybe the tunnel is a longer distance than the couriers are used to riding and they begin to get tired, causing some to get out of timing with the others or simply giving up altogether. This is an example of HDMI (or HDMI over Cat 5) and how parallel signals work.
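If you prefer the analogy in code, here's a toy model of that parallel link (the delay numbers below are invented purely for illustration): a symbol only decodes correctly when every lane arrives within the receiver's skew tolerance, while a serial link has a single lane and can't suffer lane-to-lane skew at all.

```python
def frame_arrives_intact(lane_delays_ns, skew_tolerance_ns):
    """A parallel link decodes a symbol correctly only if all lanes
    arrive within the receiver's skew tolerance of each other."""
    skew = max(lane_delays_ns) - min(lane_delays_ns)
    return skew <= skew_tolerance_ns

# A serial link is a single lane, so lane-to-lane skew can't happen.
serial = [12.0]

# Short parallel run: the four pairs stay tightly matched.
short_run = [10.00, 10.10, 10.05, 10.02]

# Long parallel run: small per-pair differences grow with cable length.
long_run = [100.0, 101.2, 100.5, 102.0]
```

With a 0.5 ns tolerance, the short run decodes fine and the long run doesn't. That's the couriers drifting out of step.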
Now, I know the courier illustration was perhaps a little over the top. However, it helps show how much has to go right for HDMI signals to work flawlessly, especially over distances longer than a few meters. Failures happen more often when factors like length, cable quality, and interference come into play.
Now that we understand the flawed past of HDMI, let's move on to what these flaws actually mean in real world applications.
As I mentioned in the history above, DVI/HDMI was not designed to run long distances. It was designed for connecting computers to monitors and media players to televisions. The longer the cable, and the more bandwidth the signal requires, the more the signal degrades by the time it reaches the other end.
Another factor is the classification of the cable speed. HDMI currently classifies cables into two separate categories:
Standard (or “Category 1”) HDMI cables have been tested to perform at speeds of 75 MHz, or up to 2.25 Gbps, which is the equivalent of a 720p/1080i signal.
High Speed (or “Category 2”) HDMI cables have been tested to perform at speeds of 340 MHz, or up to 10.2 Gbps. This is the highest bandwidth currently available over an HDMI cable, and it can handle 1080p and 4K signals, including those with increased color depth and/or refresh rates from the source.
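Those two figures aren't arbitrary. HDMI carries pixels on three TMDS data channels, each moving 10 bits per pixel clock (standard HDMI plumbing, not something unique to this article), so the category bandwidths fall straight out of the clock rates:

```python
TMDS_CHANNELS = 3        # red, green, and blue data channels
BITS_PER_CLOCK = 10      # TMDS encodes each 8-bit value as 10 bits

def hdmi_bandwidth_gbps(pixel_clock_mhz):
    """Raw link bandwidth in Gbps for a given TMDS pixel clock."""
    return pixel_clock_mhz * 1e6 * TMDS_CHANNELS * BITS_PER_CLOCK / 1e9

standard = hdmi_bandwidth_gbps(75)     # Category 1: 2.25 Gbps
high_speed = hdmi_bandwidth_gbps(340)  # Category 2: 10.2 Gbps
```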
This is what the official HDMI website says about long cable runs:
We have seen cables pass "Standard Cable" HDMI compliance testing at lengths of up to a maximum of 10 meters without the use of a repeater.
HDMI.org Knowledge Base
Notice that they only mention Standard cables, not High Speed ones; a 10-meter cable can only be counted on for 720p or 1080i signals, not 1080p or higher. Further down that same HDMI.org page, they say this about cable lengths:
Q: What is the current Category 1, Type A maximum cable length?
A: 15 meters for AWG22, 12 meters for AWG24, and 10 meters for AWG26. For Category 2, the maximum seems to be 5-8 meters.
HDMI.org Knowledge Base
I find it funny that they use the phrase “seems to be” when describing cable lengths and reliability. The HDMI people themselves don't even know for sure what lengths are reliable!
When a manufacturer wants to sell HDMI cables, any product bearing the HDMI trademarks must have passed Authorized Testing Center (ATC) compliance testing at the longest length placed on the market. In practice, however, many manufacturers claim to have compliant products without ever actually receiving ATC certification.
The biggest issue with non-compliant/non-certified cables is length. As stated earlier, the longest certified Standard speed cables are somewhere around 15 meters. Any cable longer than that is most likely not compliant/certified.

EDID & HDCP
Extended Display Identification Data (EDID) is data provided by digital displays (TVs, monitors, projectors, etc.) to describe what resolutions and formats they can accept. This is actually a nice thing to have in many situations, especially for consumers who know nothing about video formats. However, as I will explain below, this can be a nuisance (or even detrimental) to the reliability of the video signal.
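For the curious, a base EDID block is just 128 bytes with a fixed 8-byte header and a checksum byte chosen so that all 128 bytes sum to zero mod 256 (that layout comes from the VESA EDID standard, not this article). Here's a sketch of the sanity check a source performs when it reads a display's EDID:

```python
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_is_valid(block):
    """Check the fixed header and checksum of a 128-byte base EDID block."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

# Build a minimal dummy block (not a real display's EDID) to exercise it.
dummy = bytearray(128)
dummy[:8] = EDID_HEADER
dummy[127] = (-sum(dummy[:127])) % 256  # checksum byte
```

If any device in the chain mangles even one byte of this block, the source may fall back to a lowest-common-denominator format, or refuse to output anything at all.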
High-bandwidth Digital Content Protection (HDCP) is a form of digital copy protection to prevent copying of digital audio and video content as it travels across connections. It provides for an encrypted connection between a source (Blu-ray player, computer, cable box, streaming device, etc.) and a destination (TV, monitor, projector, AV receiver, etc.).
EDID and HDCP are the two leading causes of HDMI connectivity issues. Because they are two-way communications, there are many factors that can cause disruption in these connections. Splitters, distribution amplifiers, and extenders can all cause EDID and/or HDCP signals to get stripped, lost, or broken. This causes all kinds of problems like flashing images, missing colors, distorted or misaligned images, and even no image at all.
This is what makes SDI the preferred standard in nearly all broadcast studios, film sets, and professional media applications. The SDI signal is a one-way solution. There is no communication required between sources and destinations, whether for formatting or copyright protection. The source sends a compliant serial signal, the cable transmits it efficiently, and the destination receives it. Plain and simple.

Splitting & Distribution
Now that we've established just how flawed HDMI is, it gets even worse if you want to split an HDMI signal or distribute it to multiple locations. Because of the issues listed above, splitting the signal becomes very tricky.
EDID is the biggest factor in the splitting process. Because it is only a single two-way communication between one display and one HDMI source, once you split the signal you then have to make sure each separate connection maintains its own EDID handshake, or all bets are off. For example, a single TV connected to a single HDMI source requires one EDID signal. However, once you add one more TV, it then requires THREE separate EDID signals to be maintained: one between the HDMI source and the splitter or distribution amplifier, and a separate EDID connection between the splitter or distribution amplifier and each TV. Many cheaper HDMI splitters and distribution amplifiers have no (or very poor) EDID management, which can cause the issues I mentioned back in the EDID/HDCP section.
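The handshake arithmetic from that example is trivial but worth writing down: one display talking straight to one source needs one exchange; put a splitter in the middle and you need one exchange per display, plus one between the source and the splitter.

```python
def edid_handshakes(num_displays):
    """EDID exchanges that must all succeed for every screen to light up."""
    if num_displays <= 1:
        return 1             # direct source-to-display connection
    return 1 + num_displays  # source<->splitter, plus splitter<->each display
```

Every one of those exchanges is a separate point of failure, which is why cheap splitters with poor EDID management cause so much grief.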
One of the biggest issues with running uncompressed video over balanced twisted pairs is bandwidth. Uncompressed video takes up a lot of space. As the amount of information (resolution, frame rate, color depth, etc.) goes up, each bit gets less time on the wire, and consequently the window for each bit to be registered at the receiving end shrinks. This goes back to Standard speed vs High Speed cables, and it is why many Cat 5/6 extenders are only rated up to a certain distance and resolution.
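To put numbers on that shrinking window (assuming the standard 10-bits-per-clock TMDS encoding): each data pair's bit period drops from about 1.3 nanoseconds on a Standard-speed link to under 300 picoseconds on a High Speed one. That's how little time the receiver gets to register each bit.

```python
def per_pair_bit_period_ps(pixel_clock_mhz, bits_per_clock=10):
    """Time slot for one bit on a single TMDS pair, in picoseconds."""
    bit_rate_hz = pixel_clock_mhz * 1e6 * bits_per_clock
    return 1e12 / bit_rate_hz

ui_standard = per_pair_bit_period_ps(75)   # ~1333 ps per bit
ui_high = per_pair_bit_period_ps(340)      # ~294 ps per bit
```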
Most protocols that also run over twisted-pair wire are two-way communications that have error correction. A packet that doesn't arrive on a computer network connection can be re-sent without problems. However, uncompressed video is like that train in the tunnel. It is a one-way stream of data that doesn't stop. And because video needs to be real-time, it doesn't have the luxury of being able to repair its mistakes. It simply runs non-stop, regardless of what's happening at the other end of the cable.
If there is any impedance mismatch between the source, load, and cable, it can cause parts of the signal to be reflected back and forth in the cable. Variations in impedance within cables cause significant degradation in the signal, and they degrade in a way that can't easily be fixed, EQ'd, or amplified.
As an example, high quality coax cables have a specified impedance tolerance of +/- 1.5 ohms, just 2% of the 75 ohm spec. And that tolerance is conservative: the actual impedance of the cables is rarely more than 0.5 ohm (less than 1%) off-spec. Twisted pair comes nowhere close to those stats. Cat 5/6 that stays within 10-15% impedance tolerance is considered excellent, and even the best bonded-pair cables only get within about 8 ohms of the 100 ohm spec.
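Those tolerances translate directly into reflections through the standard transmission-line formula, gamma = (Z1 - Z0) / (Z1 + Z0). The sketch below just plugs in the numbers from the paragraph above:

```python
def reflection_fraction(z_actual, z_nominal):
    """Magnitude of the reflection coefficient at an impedance step."""
    return abs(z_actual - z_nominal) / (z_actual + z_nominal)

coax_worst = reflection_fraction(75 + 1.5, 75)   # coax spec limit: ~1% reflected
cat_good = reflection_fraction(100 + 8, 100)     # best bonded pair: ~4%
cat_worst = reflection_fraction(100 + 15, 100)   # typical tolerance: ~7%
```

Even the best twisted pair reflects several times more of the signal at each impedance step than worst-case coax does.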
To put it simply, Cat 5/6 cable is 4 to 30 TIMES more prone to accidental signal degradation or loss than SDI.
“I have an active 200 foot such-and-such HDMI cable and it works great.”
“I use these $20 HDMI over Cat 5 extenders and they work perfectly.”
I hear things like this all the time. And yes, it is possible to get HDMI signals over longer distances using active cables, HDMI over Cat5/6 extender baluns, and HDMI over fiber cables. However, there are two major flaws with these arguments:
1. What people are really saying is “If you have the exact same equipment, formats, and circumstances as my situation, the method I use will work perfectly for you.” The problem is that HDMI is highly circumstantial. Change one piece of that puzzle and there's a good chance it will cease to work.
2. Even an untouched setup can fail over time. Many times people come to me, or post on forums/groups, saying they've had an HDMI system that worked flawlessly for a few years. Then, seemingly out of nowhere, it quits working and never works properly again. As we've gone over in this post, many stars have to align for HDMI to work flawlessly, and all it takes is one star slightly out of alignment for the whole thing to fail.
Hopefully you get the picture after reading everything up until now. When you use HDMI, you are automatically starting out with a huge disadvantage. You can bring a rubber chicken to a battle, or you can bring a sharpened sword. The choice is yours.

What About HDBaseT?
HDBaseT is a connectivity standard for transmitting uncompressed HD video, audio, power, Ethernet, USB, and some control signals over a common Cat 5e (or better) cable.
You probably know my answer already. HDBaseT is a standard built upon sending uncompressed video over twisted pairs, which was built upon HDMI, which was built upon DVI. It's a generational curse, unfortunately. While HDBaseT methods and equipment have gotten better over the last few years, they're still just a band-aid for the underlying issues I've explained above.
If you are using resolutions of 1080i or less, you can use most certified HDMI cables up to 30' in length.
If you are using resolutions of 1080p or more, you can use most certified HDMI cables up to 15' in length.
I understand that HDMI is unavoidable in most situations. Many computers, and now even a lot of broadcast equipment, have HDMI ports. We can't make HDMI go away, but we can make it work with us. So, given everything that's been discussed here, there is only one dependable way to send HDMI over distances longer than 15': convert it to a broadcast-standard SDI signal.
If you are in a situation where you need HDCP compliance over distances further than 15', then staying in HDMI is going to be your only choice. There are some decent HDMI over fiber cables like the SlimRun AV cables from Monoprice or the Digital Ribbon hybrid cables from FSR.
Stop using HDMI cables longer than 15', ever. It's that simple. If you need to run HDMI sources further than 15', there are great options out there, and they're no more expensive than solutions like HDBaseT extenders. I always say: Do it right, do it once.