It’s happening. After years of tedious technical groundwork, the gorgeous future of PC displays finally—finally—materialized at CES 2017. At this year’s gadget show, a wave of fresh standards emerged to bring luscious high dynamic range image support to computers.
So what’s the big deal? A quick glance at any current HDR TV—like the Samsung 9800—should make the technology’s benefits instantly apparent. High dynamic range greatly expands a display’s contrast and color range, resulting in vibrant, more accurate colors that “pop” against deeper, richer blacks. Indeed, to most people’s eyes, the visual impact of HDR is far more impressive than the sheer pixel mass of a 4K resolution—not that the difference matters much, as industry sources hint that many (but not all) PC-bound HDR monitors will sport 4K resolutions as well.
Before we dive into the hardware, let’s dig into the software and other technical groundwork that’s finally making HDR on PCs possible.
Laying the pipe
You could see this coming if you were paying attention.
Before HDR monitors could happen, graphics cards needed to be able to actually display HDR images.
HDR first appeared in the tightly integrated world of TVs long ago, but the technology’s arrival in the messier, wide-open world of PCs is a more recent development. Both Nvidia’s GeForce GTX 10-series and AMD’s Radeon RX 400-series graphics cards baked in HDR rendering capabilities when they launched last summer. The new generation of graphics cards was quickly followed by Shadow Warrior 2, which launched in October as the first-ever PC game with HDR support (it also included Nvidia’s wonderful multi-res shading technology). Then, in early December, AMD’s Crimson ReLive software unlocked the rival Dolby Vision and HDR-10 standards in Radeon hardware.
Thus the stage was set for HDR’s arrival on PCs at CES 2017. But an important part of the debut is, well, yet more groundwork.
An HDMI Forum illustration of the differences between HDR types.
Crucially, the HDMI Forum revealed the HDMI 2.1 specification, which includes support for dynamic metadata. (The HDMI Forum calls it “dynamic HDR,” which means “dynamic high dynamic range,” which makes my brain hurt, so the more-accurate “dynamic metadata” it is.)
Whereas HDMI 2.0a sticks to using a single HDR grade for a video, dynamic metadata allows displays to optimize individual scenes and even frames to the capabilities of your specific hardware—meaning you’ll always see the best brightness, contrast, color gamut, et cetera rather than a one-size-fits-all HDR implementation. It makes gorgeous displays even more beautiful, in other words.
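To see why per-scene metadata matters, here’s a toy Python sketch of the idea. Everything in it is a simplification for illustration: the function names and numbers are invented, and real HDR pipelines use perceptual tone-mapping curves (such as PQ), not the linear scaling shown here. The point is only the logic: with a single movie-wide grade, a dim scene gets compressed against the worst-case peak brightness; with per-scene metadata, it can pass through untouched.

```python
def map_pixel(pixel_nits: float, grade_peak_nits: float, display_peak_nits: float) -> float:
    """Tone-map one pixel's brightness to a display's capability.

    Toy model: if the grade's peak exceeds what the panel can show,
    linearly compress everything to fit; otherwise pass through.
    """
    if grade_peak_nits <= display_peak_nits:
        return pixel_nits  # the scene already fits this display
    return pixel_nits * display_peak_nits / grade_peak_nits  # squeeze to fit

DISPLAY_PEAK = 600.0  # hypothetical panel: 600-nit peak brightness

# A dim scene with a 250-nit highlight, from a movie mastered at 4,000 nits.
pixel = 250.0

# Static metadata (HDMI 2.0a-style): the whole movie shares one 4,000-nit
# grade, so even this dim scene is compressed against it.
static_result = map_pixel(pixel, grade_peak_nits=4000.0, display_peak_nits=DISPLAY_PEAK)
# 250 * 600 / 4000 = 37.5 nits — needlessly dark

# Dynamic metadata (HDMI 2.1-style): this scene carries its own 300-nit
# grade, which fits the display, so the highlight survives intact.
dynamic_result = map_pixel(pixel, grade_peak_nits=300.0, display_peak_nits=DISPLAY_PEAK)
# 250 nits — shown as mastered

print(static_result, dynamic_result)
```

In this toy model the same 250-nit highlight lands at 37.5 nits under the one-size-fits-all grade but keeps its full 250 nits when the scene describes itself.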