4K televisions have dropped dramatically in price since their arrival on the market. However, many consumers who’ve been tempted by a good deal have been disappointed to discover that the picture isn’t significantly better than the one on the 1080p TV they just replaced. What gives?
Short answer: home theater enthusiasts are learning what early-adopter digital photographers learned a decade ago: more pixels do not automatically equal a better image.
Remember when camera ads screamed “8 megapixels,” “12 megapixels,” and so on? Most people shared photos via email or printed them on 4×6 or 5×7 photo paper, and didn’t see any difference in quality, but they sure paid for those extra pixels. The same concept applies to television screens.
The truth is that from 10 feet away, on a screen of 50” or under, the human eye can’t discern much, if any, difference between 1080 lines of visual data and 2160 lines. You’d have to jump to a 90” screen (or walk up to around 4’ from a 50” screen) before you’d see a difference in resolution. And even then, the perceived improvement may not match the price you’d pay for the new display.
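If you want to sanity-check that claim, here’s a back-of-the-envelope sketch in Python. It leans on the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree of visual angle; the function and the seating-distance figures are illustrative assumptions, not from any standard.

```python
import math

# Rule-of-thumb assumption: 20/20 vision tops out around 60 pixels
# per degree of visual angle; past that, extra pixels are invisible.
EYE_LIMIT_PPD = 60

def pixels_per_degree(diagonal_in, horizontal_px, distance_ft, aspect=16 / 9):
    """How many of the screen's pixels fall within one degree of visual angle."""
    # Screen width derived from the diagonal and the aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    px_per_inch = horizontal_px / width_in
    # Width of one degree of visual angle at the screen, in inches.
    one_degree_in = 2 * (distance_ft * 12) * math.tan(math.radians(0.5))
    return px_per_inch * one_degree_in

# 50" screen from 10 feet away:
print(round(pixels_per_degree(50, 1920, 10)))  # ~92 -- 1080p already beats the eye
print(round(pixels_per_degree(50, 3840, 10)))  # ~185 -- detail you simply can't see
# Move up to 4 feet, and 1080p falls below the limit, so 4K starts to matter:
print(round(pixels_per_degree(50, 1920, 4)))   # ~37
```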
So why are some 4K sets astoundingly superior to HDTV, while others are just meh? The difference is HDR: High Dynamic Range technology.
HDR is truly revolutionary! It allows an almost unbelievable amount of fine color gradation, much closer to what the human eye perceives. How does HDR achieve this? Hang on, we’re about to get technical.
The reason you can tell the difference between similar shades of red (for example) while watching a sunset on your TV is how much, or how little, light your TV sends to your eye at each point in the image. The range between the most and the least light it can produce is known as “contrast.”
Contrast is measured by comparing the whitest whites your TV can display to its blackest blacks, and it’s expressed as a ratio. The brightness of each is measured in candelas per square meter (cd/m²), a unit commonly called the “nit.” (Don’t ask me why; just go with it.)
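To make that concrete, here’s a tiny sketch of the arithmetic. The peak-white and black-level figures are illustrative assumptions, not measurements of any particular set.

```python
# Contrast ratio is simply peak white divided by black level, both in nits.
def contrast_ratio(peak_white_nits, black_level_nits):
    """Ratio of the brightest white to the darkest black a display can show."""
    return peak_white_nits / black_level_nits

# Illustrative figures: a typical LCD HDTV, ~300-nit peak, ~0.1-nit blacks.
print(f"{contrast_ratio(300, 0.1):,.0f}:1")     # 3,000:1
# An HDR set with local dimming: ~1,000-nit peak, ~0.005-nit blacks.
print(f"{contrast_ratio(1000, 0.005):,.0f}:1")  # 200,000:1
```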
So, back to 4K TV: that bargain 4K doorbuster special on Black Friday may have higher resolution, but its peak brightness might still top out at maybe 300 nits. So the color rendition isn’t any better than most 1080p HDTVs. In fact, it could even be worse! Some high-end OLED HDTVs already top out at around 500 nits.
But 4K TVs with HDR can display up to 4,000 nits! And at 12-bit color depth, every red pixel (back to our sunset scene) can be rendered at any of 4,096 distinct brightness levels across that range. So can every yellow pixel. And every orange one. You get the idea.
Of course, as with any new and emerging technology, there are different (and competing) standards, and HDR is no exception. So it pays to read the fine print on the box before you plunk down that credit card. Dolby’s standard, Dolby Vision, is the most capable (up to 4,000 nits at 12-bit depth), but not every TV supports it. There’s also an industry coalition standard, HDR10, that maxes out at 1,000 nits at 10-bit color depth. While not as theoretically capable as Dolby’s tech, it still delivers an obviously superior picture compared to 1080p HDTV, regardless of screen size. It’s commensurately cheaper than the Dolby standard, too.
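If you like seeing the numbers worked out, here’s a quick sketch comparing the brightness steps each format’s bit depth allows. The nit ceilings are the ones discussed above; the 100-nit figure for standard HDTV is the usual SDR reference white, added here for comparison.

```python
# Distinct brightness steps per color channel at each bit depth,
# alongside each format's peak-brightness ceiling.
formats = [
    ("1080p HDTV (8-bit)",    8,  100),   # 100 nits = typical SDR reference white
    ("HDR10 (10-bit)",        10, 1000),
    ("Dolby Vision (12-bit)", 12, 4000),
]

for name, bits, peak_nits in formats:
    steps = 2 ** bits  # each extra bit doubles the number of gradations
    print(f"{name}: {steps:,} steps per channel, up to {peak_nits:,} nits")

# 256 steps is where smooth gradients (that sunset sky) show visible banding;
# 4,096 steps is fine enough to look continuous to the eye.
```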
There’s a great rundown of all the technical details here.
Whew, that was a lot of info! But hopefully you now have what you need to make a smart purchasing decision. And we’re not done! While your TV is the centerpiece, a home theater has many pieces. We have several upcoming blog posts to help you select the perfect components for your ultimate home theater. Stay tuned for our next installment: cables!
In the meantime, mount that beautiful new set, and see what it can do!
Adam Best is a digital marketing professional with over 20 years in the industry, from start-ups to Fortune 500 companies. A self-professed geek, he enjoys taking complex technical concepts and breaking them down into easily understood terms.