When Choosing an HDR-Capable TV or Monitor, You Need to Choose Wisely

HDR, or High Dynamic Range, is something that TV and monitor buyers are starting to want on their next purchase. A display with HDR gives you deeper blacks and more vibrant colors, which means that a movie that supports HDR will look amazing. But you need to be careful, because HDR isn’t a uniform standard, if you want to call it that. Only a handful of TVs and monitors can actually do true HDR, and if you’re not careful, you might buy something that can’t. To save you the effort of figuring out which is which, I’ve created this primer to help you spot a good HDR monitor or TV. Let’s start with what a display needs to have to even be in the conversation as an HDR display:

  • HDR images are brighter than SDR, so a true HDR display needs to hit at least 1000 nits of brightness.
  • It needs to support very high contrast so that high brightness elements can be displayed alongside deep, rich shadow detail on screen simultaneously.
  • It needs to support a wide color gamut, allowing for a greater range of colors to be displayed.
  • It needs to support HDR encoding systems, such as the PQ (Perceptual Quantizer) gamma curve, with a minimum of 10-bit processing.
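To make that last point concrete, here’s a minimal Python sketch of the PQ curve from the SMPTE ST 2084 specification, which maps a 10-bit code value to an absolute brightness in nits. The function names are my own; the constants come straight from the spec:

```python
# PQ (SMPTE ST 2084) EOTF constants, as defined in the specification.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal (0.0 to 1.0) to luminance in nits."""
    p = signal ** (1 / M2)
    numerator = max(p - C1, 0.0)
    denominator = C2 - C3 * p
    return 10000.0 * (numerator / denominator) ** (1 / M1)

def code_to_nits(code: int, bit_depth: int = 10) -> float:
    """Convert a 10-bit code value (0 to 1023) to nits via the PQ curve."""
    return pq_eotf(code / (2 ** bit_depth - 1))
```

A full-scale 10-bit code (1023) maps to PQ’s 10,000-nit ceiling; real displays tone-map anything above their own peak brightness, which is why the 1000-nit minimum above matters.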

If the display doesn’t check all those boxes, look elsewhere. Now, not all TV and monitor companies disclose all that info. So there’s another way to check whether the display you want is HDR capable: the display panel technology it uses. Let’s go from best to worst:

  • OLED: Organic Light Emitting Diode displays are the top tier of displays. Because each individual pixel is self-lit, the display can show perfect HDR images. So if the TV or monitor that you want has an OLED panel, you’re going to get HDR by default. There are two catches. Cost is the biggest one, as OLED monitors and TVs are not cheap. The second is burn-in: static images can create image retention issues (AKA burn-in) that can make your really expensive TV or monitor look bad. But if you have the cash for an OLED display, this is the way to go for a great HDR experience.
  • Mini-LED/Full Array Local Dimming LED: This is the next best option, as this technology is much cheaper than OLED and produces great HDR visuals. It does that by dividing the backlighting system into a number of “zones” that can individually dim or brighten to match the content on screen. The number of zones is what you need to pay attention to. I would say that if a display doesn’t have something north of 500 zones, you won’t get the same level of HDR quality as OLED. Put another way: the more zones, the better the HDR performance. One issue that some people bring up is “blooming,” where light from a bright zone leaks into an adjacent dark zone, making the dark zone not as dark as it could be. Now, I have a MacBook Pro with a Mini-LED display and I have never noticed this. But others have. Thus I would advise testing the display with a variety of content to ensure that this isn’t a potential issue for you.
  • Edge Lit LED: This is a very low cost LED technology that promises good HDR performance but mostly doesn’t deliver. Much like Mini-LED, edge lit displays have zones that can be individually darkened or brightened. The problem is that those zones run the full height or width of the screen, and there are typically only somewhere between four and sixteen of them. There’s just no way to get good HDR performance from that setup, so this is a type of display that I would avoid.
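To illustrate how local dimming zones work, here’s a toy Python sketch that splits a frame’s luminance map into a grid of zones and drives each zone’s backlight from the brightest pixel it covers. This is only an illustration, not any vendor’s actual algorithm, but it shows why a coarse zone grid causes blooming:

```python
# Toy model of full-array local dimming: each backlight zone is set to
# the brightest pixel it covers. Real dimming algorithms are far more
# sophisticated; this is only an illustration.

def zone_backlights(lum, zone_rows, zone_cols):
    """lum: 2D list of pixel luminance values (0.0 to 1.0).
    Returns a zone_rows x zone_cols grid of backlight levels."""
    h, w = len(lum), len(lum[0])
    zh, zw = h // zone_rows, w // zone_cols
    zones = []
    for zr in range(zone_rows):
        row = []
        for zc in range(zone_cols):
            patch = [lum[y][x]
                     for y in range(zr * zh, (zr + 1) * zh)
                     for x in range(zc * zw, (zc + 1) * zw)]
            row.append(max(patch))
        zones.append(row)
    return zones

# One bright pixel in an otherwise black frame: the entire zone that
# contains it lights up, washing out the dark pixels around it. That
# light leak is blooming; more (smaller) zones confine it.
frame = [[0.0] * 8 for _ in range(8)]
frame[1][1] = 1.0
print(zone_backlights(frame, 2, 2))  # only the top-left zone is lit
```

With a 2x2 grid, a single bright pixel forces a quarter of the screen’s backlight on; with hundreds of zones, the lit area shrinks dramatically, which is why the zone count matters so much.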

Hopefully that helps you avoid the potential pitfalls of buying an HDR monitor or TV. If you still have questions, please leave a comment below and I will do my best to answer them.
