In the 1956 sci-fi movie thriller Invasion of the Body Snatchers, residents of a small California town awaken one day to discover that humans are rapidly being replaced with emotionless, look-alike aliens hatched from giant seed pods – and there's nothing they can do to stop it from happening.
Many people who work in the professional AV world feel the same way about the transition to digital. Ten years ago, we were getting comfortable with the red, green, and blue RCA component video connections for HDTV receivers and regularly skinning our knuckles attaching VGA plugs to computer monitors, switchers, and distribution amplifiers.
Today, that's all changed. Go into your local Best Buy or Wal-Mart and check out the latest Blu-ray or DVD players. Notice anything unusual on the back panel, such as the absence of component video jacks?
Now, head over to the TV aisle and scope out the input jacks on a 55-inch LCD. You'll probably find one set of red, green, and blue RCA input jacks at most.
What you will find on both the Blu-ray player and the TV is a 19-pin HDMI jack – in fact, as many as four of them on the TV. And you'll also spot it on PlayStation and Xbox gaming consoles, cable TV and satellite set-top boxes, digital point-and-shoot cameras, and camcorders.
Go next door to Staples or OfficeMax and check out their line of business projectors. Chances are, you'll spot at least one HDMI connector on the input panel. Is there an Apple store nearby? Give the MacBooks and Mac Minis a close-up look, and you'll see nary an analog video connection – just a small, rectangular jack marked DisplayPort. Or pick up an Apple TV box and look at its TV connection. Surprise – it's HDMI!
At the end of Body Snatchers, Kevin McCarthy manages to escape the town of ‘pods' and makes his way to a busy highway where he runs through traffic yelling, “They're already here! You're next!”
If you feel the same way, it's time to calm down. Like it or not, digital display interfaces are here to stay. But unlike the movie, we don't need to fear them. We do, however, need to better understand them so we can manage, switch, and distribute these signals reliably, just like we've done in the good old days of analog.
Believe it or not, there are plenty of good reasons to move to digital interfaces, such as automatic setup of displays and monitors, higher bandwidth for 2K and 4K image resolution and beyond, consolidation of different signal types into fewer connectors for lighter and thinner computers (i.e. tablets), and easier interfacing with structured wire and optical fiber. Let's face it – the VGA connector has had a good run, but it's long past time to put it out to pasture.
WHERE IT ALL STARTED
The roots of today's digital display interfaces go back to 1987, when the familiar 15-pin D-sub ‘VGA' connector was introduced. VGA stands for Video Graphics Array and was the first high-resolution display system for the IBM personal computer, offering a ‘staggering' 640x480 pixels that were progressively scanned.
In just a few years, video card resolutions increased to SVGA (800x600 pixels) and XGA (1,024 x 768 pixels). CRT monitors got bigger, and the first flat-screen LCD and plasma monitors made an appearance. Workstation displays pushed an even higher standard known as Super XGA (SXGA, 1,280 x 1,024 pixels). That was soon trumped by SXGA+ (1,400 x 1,050).
As a result, the Video Electronics Standards Association (VESA) decided in the early 1990s that adding some 'smarts' to the VGA connection was long overdue. This it did by devising a system known as Extended Display Identification Data, or EDID, which would be programmed into every new monitor sold after 1994.
EDID is a table of information saved in non-volatile memory that tells the computer what signal timings and pixel resolutions are needed to drive a computer monitor correctly. The computer's video card reads EDID and in theory, generates a ‘standard timing' that the monitor will be happy with. In the meantime, you don't need to fiddle with menu adjustments to center the image or size it correctly – it's all done automatically.
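To make that table of information concrete, here's a minimal Python sketch of how a video card (or a diagnostic utility) might read a few fields out of a 128-byte EDID base block. The offsets follow the VESA EDID 1.3 base-block layout; error handling is pared down, and the sample data in the usage below is invented for illustration.

```python
def parse_edid(edid: bytes):
    """Pull a few identifying fields out of a 128-byte EDID base block."""
    if len(edid) < 128 or edid[0:8] != b'\x00\xff\xff\xff\xff\xff\xff\x00':
        raise ValueError("not an EDID base block")      # fixed 8-byte header
    if sum(edid[0:128]) % 256 != 0:
        raise ValueError("bad EDID checksum")           # all 128 bytes sum to 0 mod 256
    # Manufacturer ID: three letters packed as 5-bit codes into bytes 8-9.
    word = (edid[8] << 8) | edid[9]
    mfr = ''.join(chr(((word >> s) & 0x1F) + ord('A') - 1) for s in (10, 5, 0))
    # Preferred resolution lives in the first detailed timing descriptor (bytes 54-71).
    dtd = edid[54:72]
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)          # low 8 bits + high 4 bits
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return mfr, h_active, v_active
```

Fed a valid block from a 1080p monitor, this returns the maker's three-letter ID plus the 1,920 x 1,080 preferred mode – exactly the data the video card uses to pick a 'standard timing' the monitor will be happy with.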
So far, so good. But VESA wasn't done yet. The next order of business was to develop a 100% digital signal interface from computer to screen – one that could handle higher resolutions without the enormous variations in red, green, and blue channel voltage levels that an analog interface would require.
And thus was born the Digital Visual Interface (DVI) in 1999, also known later as the Digital Video Interface. DVI kept the EDID system for automatic monitor set-up, but used a clever way to minimize voltage changes in red, green, and blue signal levels while still achieving a full range of luminance values from black to white.
This signaling system is known as Transition Minimized Differential Signaling (TMDS), and allows small diameter cables to transport display signals with very high resolutions – a challenge that was becoming increasingly more difficult for the VGA connector.
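To see how the transition minimizing works, here is a rough Python sketch of the first stage of TMDS encoding, which turns each 8-bit color value into a 9-bit word by chaining XOR or XNOR operations, whichever produces fewer voltage transitions on the wire. (The real encoder adds a tenth bit for DC balancing, which is omitted here for brevity.)

```python
def tmds_stage1(byte):
    """Transition-minimized 9-bit word for one 8-bit value (DC balancing omitted)."""
    d = [(byte >> i) & 1 for i in range(8)]           # bits, LSB first
    # Chain XNORs when the input is 1-heavy, XORs otherwise -- this picks
    # whichever chaining yields fewer 0-to-1 / 1-to-0 transitions.
    use_xnor = sum(d) > 4 or (sum(d) == 4 and d[0] == 0)
    q = [d[0]]
    for i in range(1, 8):
        q.append(1 - (q[-1] ^ d[i]) if use_xnor else q[-1] ^ d[i])
    q.append(0 if use_xnor else 1)                    # flag bit tells the decoder which
    return q

def transitions(bits):
    """Count level changes between adjacent bits."""
    return sum(a != b for a, b in zip(bits, bits[1:]))
```

Feed in 0xAA – alternating bits, the worst case for a raw serial wire with seven transitions – and the encoded word toggles only three times, which is what lets small-diameter cables carry very high resolutions.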
DVI also unveiled an entirely different connector than we'd seen before: pins arranged in three rows of eight, plus a large, solitary flat grounding blade for orientation. Because we love backwards compatibility, the DVI connector came in two versions – DVI-D (100% digital) and DVI-I (hybrid analog and digital). The DVI-I connector added four tiny pins to provide the usual red, green, and blue analog display signals, along with a composite (horizontal and vertical) sync signal.
NO STOPPING NOW
The DVI connector made its debut as flat-screen LCD monitors were starting to replace the bulky CRT monitors of the past. But it was soon adopted for a myriad of other uses, including LCD and plasma HDTV sets, DVD players, game consoles, and set-top boxes for cable and satellite TV reception.
For consumer applications, DVI had a major drawback – it couldn't transport audio signals, only display signals. As a result, Silicon Image (the company that developed much of the underlying DVI architecture) convened another working group to come up with a next-generation, consumer-friendly digital video connector; one that would not support analog signals in any way, but could transport digital video and audio, streams of data, and even control signals.
This new connector wouldn't require any screws to attach. It would be smaller than the DVI connector and have a much flatter profile. And it would come in several different styles to work with a wider range of consumer electronics devices. Most importantly (at least, to TV networks and Hollywood), the new connector would support an elaborate copy-protection system to cut down on pirated movies and TV shows.
The result of this working group was the High Definition Multimedia Interface, or HDMI. It launched in 2002 and like kudzu, spread rapidly to a wide range of consumer electronics and also professional devices. Its standard has been updated numerous times (the latest is version 1.4b), and it can support 3D and Ethernet as well as video, audio, and control signals.
In its present iteration, the HDMI connector – which retains the TMDS signaling system from DVI – is capable of streaming display, audio, and control data at a maximum rate of 10.2 gigabits per second (Gb/s). That's easily fast enough to show video at 1,920 x 1,080 resolution, progressively scanned, with a refresh rate of 60 times per second. And those images can have as many as 16 bits of color per channel (48-bit color), representing literally trillions of color shades.
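That headroom is easy to sanity-check with back-of-the-envelope math. The sketch below estimates the aggregate TMDS line rate for a video mode; it assumes the standard 1080p/60 raster totals with blanking included (2,200 x 1,125 pixels, a 148.5 MHz pixel clock) and the 10-bits-per-channel-per-clock nature of TMDS coding:

```python
def tmds_rate_gbps(h_total, v_total, refresh_hz, bits_per_channel=8):
    """Aggregate line rate across HDMI's three TMDS data channels."""
    pixel_clock = h_total * v_total * refresh_hz       # Hz, blanking included
    # 10 bits travel per channel per pixel clock; deeper color scales
    # the effective clock up proportionally.
    return pixel_clock * 3 * 10 * (bits_per_channel / 8) / 1e9

# 1080p/60 uses a 2,200 x 1,125 total raster (148.5 MHz pixel clock)
rate_8bit = tmds_rate_gbps(2200, 1125, 60, 8)      # standard 24-bit color
rate_16bit = tmds_rate_gbps(2200, 1125, 60, 16)    # 48-bit Deep Color
```

The 8-bit case works out to roughly 4.5 Gb/s, and even the full 48-bit case lands near 8.9 Gb/s – comfortably inside the 10.2 Gb/s ceiling, just as the spec promises.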
HDMI also incorporates a copy-protection system that requires a constant exchange of 56-bit 'keys' between the video source and the monitor or TV showing the video. This system, known as High-bandwidth Digital Content Protection (HDCP), is used by DVD and Blu-ray players, set-top boxes, and gaming consoles anytime a movie or TV show is played back. Copy protection is the primary reason that those familiar red, green, and blue RCA jacks have largely disappeared from consumer electronics equipment.
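One small, verifiable piece of that key exchange makes a tidy example: every HDCP 1.x device carries a 40-bit Key Selection Vector (KSV) that must contain exactly twenty 1s and twenty 0s, and a device presenting a malformed KSV fails authentication. A quick sanity check in Python:

```python
def ksv_is_valid(ksv: int) -> bool:
    """An HDCP 1.x Key Selection Vector is 40 bits wide with exactly 20 bits set."""
    return 0 <= ksv < (1 << 40) and bin(ksv).count("1") == 20
```

This is only the first gate in the handshake – the full protocol derives shared secrets and continuously verifies link integrity – but it's the kind of check that, when it fails anywhere in a signal chain, produces the dreaded blank screen.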
AN INTERFACING PUZZLE
The inventors of HDMI never intended it to be used for anything other than a simple, peer-to-peer connection between a Blu-ray player (known as a ‘source' in the HDMI language) and a TV (also known as a ‘sink'). To accommodate home theater enthusiasts, a ‘repeater' circuit was added to AV receivers to allow switching of multiple HDMI inputs to a single TV or a projector, decrypting and re-encrypting content as it passed through the receiver. But the repeater design was intended for one connected display, and no more.
In the past few years, we've seen commercial AV matrix switchers and distribution amplifiers come to market to move around and distribute HDMI just like component video or VGA signals. But it's not quite that simple: Successfully connecting HDMI ‘sinks' and ‘sources' requires active management of two things – (1) EDID, and (2) HDCP keys.
Those early attempts at matrixing and distributing HDMI expanded on the original source-repeater-sink architecture. Unfortunately, repeaters don't work very well with two or more connected displays. Copy protection keys might not be read correctly, or might be dropped, resulting in a blank screen. And EDID information for one connected display wouldn't be at all correct for another.
MANAGING EDID
The first step in taming the HDMI ‘beast' was to design a way for switchers and distribution amplifiers to capture EDID from connected displays and store it in memory. Having EDID memorized makes it possible to switch with faster connect times. In addition, the switcher or DA can look up the EDID for each connected display and determine the highest common display resolution for all displays, ensuring every projector, monitor, and/or TV will show a picture.
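The 'highest common resolution' logic boils down to simple set arithmetic. Here's a hedged sketch, assuming each display's stored EDID has already been reduced to a set of (width, height) modes it supports:

```python
def highest_common_mode(mode_sets):
    """Pick the largest video mode that every connected display can accept.

    mode_sets: one set of (width, height) tuples per connected sink,
    as read from each display's EDID.
    """
    common = set.intersection(*mode_sets)              # modes ALL sinks support
    # Rank by pixel count; return None if the displays share no mode at all.
    return max(common, key=lambda m: m[0] * m[1], default=None)
```

So a 1080p projector paired with an older 720p monitor gets driven at 1,280 x 720 – not the prettiest picture on the projector, but every screen in the room shows an image.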
Another reason to store EDID is to keep connected computers from going to sleep, which many laptops do in the absence of a display hookup. A low-voltage logic line on one pin of the HDMI connector signals the computer when a monitor or TV is connected. This toggling voltage tells the signal source to read EDID and make the display connection.
As long as that logic line is in a ‘high' state, any connected computer will think there is an active display at the other end of the cable, and won't power down or go into low-power mode even if it is switched out of the circuit for an indefinite period. (EDID detection with DVD and Blu-ray players happens only once when the monitor, projector, or TV is connected. The DVD/Blu-ray player will play out a movie even if the display is subsequently disconnected.)
When designing an HDMI distribution system, it's an absolute must to identify all of the places in the circuit where an EDID handshake occurs and to make sure that EDID data makes it all the way back from every display to a connected source. This applies even when HDMI extenders are employed. EDID is transmitted on pins 12 and 15 of the HDMI connector and those pins should be active everywhere in the switching or distribution system.
One extremely useful tool for reading EDID is a freeware program available from Entech. It's called Monitor Asset Manager and will read EDID from any connected monitor, TV, projector, AV receiver, or signal interface through DisplayPort, HDMI, DVI, and even VGA connections. You can download it free at http://www.entechtaiwan.com/util/moninfo.shtm.
MANAGING HDCP
The joke goes: “Why do they call it HDMI? Because all of the other four-letter words were taken.” Well, there's no doubt that the HDCP layer of the HDMI specification has created plenty of headaches for systems integrators and end-users. Examples would be blank screens (or screens with digital ‘hash') and no audio, all because copy-protected content was played back through the system but one or more displays did not support HDCP or could not provide valid keys.
HDCP also screwed up early attempts at matrixing and distributing HDMI because the repeater configuration doesn't work with multiple sets of keys. If one key is bad, the entire system shuts down. Yes, you might get lucky and get a connection one time, but drop it the next time. There had to be a better way.
The solution was to make every connection to and from an HDMI matrix switcher or distribution amplifier into a ‘source-sink' connection. This way, the source of the video content would see a maximum of one sink. And each output of the switch or distribution amplifier would see only one sink at the other end. Repeaters, begone.
If a non-compliant display (or one with corrupted keys) was connected to the switch, only that display would be shut down. All other compliant displays would work correctly, as keys were being exchanged normally. Displays could even be ‘hot swapped' while live and still function correctly.
Apple computers present a special challenge to matrixing and distribution. (Apple products largely use DisplayPort connections and not HDMI.) Their HDCP circuitry is turned on and off based on the connected display, and once set, can only be re-set by rebooting the computer and disconnecting the external monitor.
Here's an example: Your MacBook has a copy of a brand-new movie you've downloaded from iTunes. The movie plays back just fine on your LCD TV and also directly to your computer monitor. Now, you take that same MacBook into a conference room and connect to an HDMI distribution system, through a DisplayPort to HDMI adapter.
Now, the iTunes movie won't play back. While the connected video monitor(s) or projector(s) might support HDCP correctly, the HDMI switching system may not. As a result, nothing will appear on screen. While this will certainly create lots of aggravation, the system is working exactly as it is supposed to. What's more, nothing will be seen from an analog format-converted display connection on the HDMI switching system.
The solution is to make sure that all components in the system are HDCP-compliant if there is even the slightest chance of copy-protected content playback. On some models of presentation switchers, you can also set the HDCP function to follow the output (i.e. a connected display or displays) or follow the input. You can also disable HDCP completely when no copy protection is required.
DEEP TROUBLE
HDMI version 1.3 brought along a new set of headaches, chief among them Deep Color. This is a way to transmit many more shades of color, which the newest LCD and plasma displays – not to mention projectors – can easily support. (The current standard for Blu-ray discs is 8-bit color.)
Pushing that many more colors through an HDMI system means greater demands on bandwidth, which can easily exceed the capacity of HDMI extenders using Cat 5 cable. Unfortunately, a Blu-ray player may be told to transmit Deep Color by the EDID it reads from the connected display. The only way to get around this problem is to disable Deep Color mode in the player's Settings menu, or use a slower frame rate such as 1080p/24.
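Running the numbers shows why the slower frame rate helps. This rough sketch assumes standard raster totals for 1080p/60 (2,200 x 1,125) and 1080p/24 (2,750 x 1,125), 12-bit Deep Color, and a hypothetical extender ceiling of about 4.5 Gb/s – roughly what a Cat 5 extender rated for 1080p/60 at 8-bit color can pass:

```python
def tmds_gbps(h_total, v_total, hz, bpc):
    """Aggregate rate over HDMI's three TMDS channels, 10 bits each per clock."""
    return h_total * v_total * hz * 3 * 10 * (bpc / 8) / 1e9

EXTENDER_LIMIT = 4.5  # Gb/s -- hypothetical Cat 5 extender ceiling

deep_color_60 = tmds_gbps(2200, 1125, 60, 12)   # 1080p/60 at 12-bit: ~6.7 Gb/s
deep_color_24 = tmds_gbps(2750, 1125, 24, 12)   # 1080p/24 at 12-bit: ~3.3 Gb/s
```

At 60 frames per second, 12-bit Deep Color blows past the extender's limit; drop to 24 frames and the same color depth sails through with room to spare.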
Another way to get around the problem is to capture EDID from an older but similar display; one that does not support Deep Color. A gadget known as an EDID emulator can then be connected between the source and sink and ‘spoof' the source out of Deep Color mode.
VANILLA VS. CHOCOLATE?
DisplayPort was mentioned earlier in this article. This is another digital display/audio/data interface that is aimed primarily at computers and not at the video marketplace. DP, which launched in 2006, is a standard developed by VESA and offers a much higher data rate (17.2 Gb/s) than HDMI. At CES 2012, VESA was able to drive a 4K display (3,840 x 2,160 pixels) with a 120 Hz refresh rate, using 30-bit color. That's a lot of data.
The smaller version of the DisplayPort connector, Mini DisplayPort, is quite common on MacBook computers and is starting to make inroads into Windows-based laptops, too. Two years ago, Intel and AMD both announced that they would begin phasing out support for the VGA connector in 2012, shifting instead to DisplayPort (and/or HDMI, if laptop manufacturers wanted it).
The advantage of DisplayPort is that it packs a lot of punch in a tiny connector. This connector density enabled a new generation of super-slim, ultra lightweight computers called ‘ultrabooks' that were exhibited at CES 2012. DisplayPort can also carry Ethernet and control signals, and a souped-up version with the moniker Thunderbolt doubles as a 10 Gb/s data bus, using the PCI Express standard.
That means that one connector can now do everything – data, video, audio, and Ethernet. At CES and NAB, several companies showed Thunderbolt breakout boxes, extracting (among other things) USB, FireWire, Ethernet, HDMI, and audio, all from the same tiny connector. In fact, Intel showed a concept of a mobile 4K video acquisition and editing system, interconnecting all components through daisy-chained Thunderbolt cables. That included several RAID video storage drives and a pair of high-resolution monitors.
THE WRAP-UP
Which system will prevail, HDMI or DisplayPort? That's entirely up to the computer, TV, Blu-ray, set-top box, and console manufacturers. But it appears that laptop and ultrabook brands will rapidly switch over to DisplayPort, leaving HDMI to home entertainment products and digital cameras / camcorders.
That means you will have to start interfacing HDMI and possibly DisplayPort in the near future. Manufacturers are already making the decision for you by removing analog video connections from their products. The trend towards using super-large LCD TVs instead of projectors for some conference room and classroom applications is only accelerating the move to digital.
There's good news in all of this. Silicon Image, which collects royalties on the use of the HDMI connector through its subsidiary HDMI Licensing LLC, announced a new HDMI Forum in 2011 to research and develop the next version of HDMI (V2.0).
Most of the listed participants in this forum come from the consumer electronics world, but at least one company manufactures commercial AV switchers and distribution systems. As a result, we may yet see locking HDMI connectors, higher data speeds, and faster connections through EDID and HDCP handshakes in the near future. Stay tuned.