
What is HDR? High-dynamic-range TVs and formats, explained


HDR TVs, like the Samsung S90D (pictured), can produce images with enhanced contrast and colors.

High dynamic range (HDR) is a feature found on many TVs sold today, and it can dramatically enhance picture quality. HDR allows a display to produce images with higher brightness, better contrast, and more vivid colors. In other words, HDR can make images pop more intensely and appear more realistic and closer to how content creators intended them to look. 

However, though HDR is now a common feature on many of the best TVs, performance varies greatly across different models. Just because a display says it supports HDR doesn't mean it can really take advantage of the feature. To get the most impressive HDR viewing experience, you'll need a top OLED display, like the LG G4, or a bright QLED set, like the TCL QM7.

To help you sort through all the ins and outs of high-dynamic-range display technology, we've put together a handy guide detailing everything you need to know about HDR TVs and the different HDR formats.

What is HDR?

An HDR video displayed on an LG G4 OLED TV.

High dynamic range (HDR) is a display technology that enables a TV or monitor to produce enhanced contrast levels when playing HDR-encoded videos. This allows bright elements of an image to look brighter and dark elements to look darker while preserving more detail in all the steps between both ends of the spectrum. 

The enhanced contrast of HDR can create an added sense of depth with more visible details and realistic intensity in specular highlights and shadows. A good example would be a scene featuring sunlight glimmering off the ocean. Without HDR, a sequence like this could look comparatively dull with lost detail and clipped highlights, but with the expanded range that HDR allows, extreme highlights like this are given room to breathe and really pop. The reflecting light can look more radiant and more detailed simultaneously. 

Brightness levels for HDR are measured in a unit called nits. The more nits a TV can produce, the brighter its HDR highlights can look. In general, most HDR movies and TV shows are mastered with a max of 1,000 to 4,000 nits in mind, though HDR videos can technically be mastered for up to a whopping 10,000 nits. However, very few consumer displays can hit such a high number. Most premium HDR TVs currently max out at around 1,000 to 3,500 nits, while midrange models can achieve 700 to 1,000 nits, and entry-level options offer around 400 to 700 nits.
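That 10,000-nit ceiling comes from the SMPTE ST 2084 "PQ" (perceptual quantizer) transfer function, which HDR10 and Dolby Vision signals use to map encoded values to absolute brightness. As a rough illustration (a simplified sketch, not production color-pipeline code), the curve can be evaluated like this:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF, which maps a normalized
# signal value (0.0-1.0) to absolute luminance in nits, topping out
# at 10,000 nits. Constants are from the ST 2084 specification.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_to_nits(signal: float) -> float:
    """Convert a normalized PQ signal value to luminance in nits."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10_000 * (num / den) ** (1 / M1)

print(round(pq_to_nits(1.0)))  # 10000 -- the format's absolute ceiling
print(round(pq_to_nits(0.5)))  # roughly 92 nits
```

Note how nonlinear the curve is: half the signal range only reaches about 92 nits, because the encoding spends most of its precision on the dark and midtone levels where our eyes are most sensitive.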

What is Wide Color Gamut?

HDR TVs, like the Sony Bravia 9 above, can display a wider range of colors than a typical TV.

HDR is often bundled with another display feature called Wide Color Gamut (WCG), which enables a TV to produce an expanded range of colors. Though technically two separate things, a wide color gamut is almost always used when mastering a video in HDR, so WCG is often considered part of HDR tech.

When using WCG, HDR videos are encoded within a color space called Rec. 2020, though most HDR movies and TV shows are actually graded for a narrower standard called DCI-P3. DCI-P3 is the same range of colors used for modern digital theater projection, while Rec. 2020 can offer an even wider range than that.
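To get a feel for how these color spaces compare, you can measure the area each gamut's triangle covers on the CIE 1931 chromaticity diagram. Below is a rough sketch using each standard's published red, green, and blue primaries (a simplified comparison; real-world gamut coverage is often quoted in other color metrics, which give somewhat different percentages):

```python
# Compare color gamut sizes by the area of each gamut's triangle on the
# CIE 1931 xy chromaticity diagram. Values are the published (x, y)
# coordinates of each standard's red, green, and blue primaries.
GAMUTS = {
    "Rec. 709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":    [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(points):
    """Shoelace formula for the area of a triangle from three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec709 = triangle_area(GAMUTS["Rec. 709"])
for name, primaries in GAMUTS.items():
    ratio = triangle_area(primaries) / rec709
    print(f"{name}: {ratio:.2f}x the area of Rec. 709")
```

On this measure, DCI-P3 covers roughly a third more area than the SDR-era Rec. 709 gamut, and Rec. 2020 close to twice as much.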

So, with HDR and WCG, you can enjoy the full spectrum of colors you see in theaters at home.

How does HDR compare to SDR?

The above demo from the Spears and Munsil 4K Benchmark disc displays the same video in HDR (left) and SDR (right) simultaneously.

Before the first consumer HDR displays hit the market in 2015, TVs were built to adhere to a standard dynamic range (SDR) specification. SDR displays and content are designed with a max of only 100 nits in mind, and they're typically produced for a limited color gamut called Rec. 709. Compared to the DCI-P3 color space used on most HDR videos, Rec. 709 offers a much more restricted gamut of colors.

This means that SDR displays produce comparatively dim and low-contrast images with a narrower range of colors when compared to an HDR TV playing HDR programming. As a result, an SDR version of a movie or TV show will often look a bit flat and slightly muted compared to its HDR counterpart. 

SDR TVs in HD and 4K resolution are still manufactured today, and all HDR-compatible TVs can still display SDR signals accurately. While HDR mastering has become a popular choice for new on-demand streaming content and 4K Blu-ray movies, SDR is still the norm for cable, satellite, live TV streaming, and over-the-air broadcasts. 

How do I watch HDR movies and TV shows?

The Normal Dolby Vision preset (pictured above) looks good, but the Dolby Vision Dark preset crushes shadow detail when watching content from the TV’s built-in apps.

To watch HDR videos, you need an HDR-capable display and access to HDR-encoded content. Every element in your home entertainment chain also has to support HDR, so if you watch videos through a streaming stick, it needs to be HDR compatible, and if you connect your media player to your TV through an AV receiver or soundbar, those components need to support HDR passthrough. Likewise, all devices need to be paired together using premium- or ultra-high-speed HDMI cables. Check out our guide to the best HDMI cables for top picks.

HDR videos are available on all the best streaming services, including Disney Plus, Netflix, Hulu, and Amazon Prime Video. HDR is also used on most 4K Ultra HD Blu-ray discs. Some live sporting events are also shown in HDR through certain providers, but the vast majority of cable, satellite, and live TV streaming broadcasts are still presented in SDR.

What should I look for in an HDR TV?

Samsung's S95D is the brightest OLED we've tested, enabling excellent HDR performance.

All of the best 4K TVs include some level of HDR support, and some HDTVs even have HDR capabilities. However, performance can vary dramatically between cheaper models and more expensive displays.

If you want the best HDR performance, you'll want to buy either an OLED TV or an LED display with local dimming and expanded colors. LED TVs with those features are often branded as QLED, Neo QLED, or QNED. All of these display options offer the necessary contrast control, peak brightness, and color capabilities that are needed to really show off the benefits of HDR playback.

When it comes to brightness, you'll want to choose a display model that comes close to outputting 1,000 nits or higher to see the full benefits of HDR content. However, TVs with that level of performance tend to be a bit pricey, and you can still get worthwhile entry-level HDR performance out of cheaper models that max out in the 500 to 600 nits range. 

Some top HDR TVs available right now include the Samsung S90D, LG G4, TCL QM7, Sony Bravia 9, and Samsung S95D. Meanwhile, the Hisense U6N and Roku Plus Series are both great options on a budget. For more recommendations, check out our various TV buying guides.

Are there different HDR formats?

The video content on the TV pictured above is being played in the Dolby Vision HDR format.

There are four primary HDR content formats: HDR10, HDR10+, Dolby Vision, and HLG.

HDR10 is the most basic and common HDR format. It's supported on all HDR TVs and is used as the standard HDR format on all 4K Ultra HD Blu-ray discs and streaming apps with HDR content. In other words, it can be thought of as an HDR base layer that more advanced HDR formats can be added to.

HDR10 videos can be mastered for a peak of up to 10,000 nits, though most HDR10 content has been graded for 1,000 to 4,000 nits. HDR10 videos are encoded with information called "static metadata," which tells a TV what colors to show and how bright its images are supposed to look. Static metadata is limited, however, in that it can only describe the video as a whole rather than each individual scene.

When an HDR10 movie is played on an HDR TV that can't support the full range of brightness and color that its metadata calls for, the display must adapt on its own to scale highlights and color volume to land within its capabilities. This kind of adjustment is called "tone mapping," and different display manufacturers handle tone mapping differently.
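Although every manufacturer implements tone mapping its own way, the general shape is usually similar: follow the source faithfully up to a "knee" point, then roll off the highlights so they approach the display's peak instead of clipping. Here's a minimal sketch of that idea (a generic soft-knee curve, not any specific manufacturer's algorithm):

```python
def tone_map(nits: float, display_peak: float = 1000.0, knee: float = 0.75) -> float:
    """Map a mastered brightness value onto what a display can show.

    Values below the knee pass through unchanged; values above it are
    compressed so they approach (but never clip at) the display's peak.
    """
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    excess = nits - knee_nits
    headroom = display_peak - knee_nits
    # Roll-off: output rises toward display_peak asymptotically.
    return knee_nits + headroom * excess / (excess + headroom)

# Midtones at 500 nits pass through untouched, while a 4,000-nit mastered
# highlight lands just under a 1,000-nit display's peak instead of clipping.
print(tone_map(500.0))          # 500.0
print(round(tone_map(4000.0)))  # 982
```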

In practice, this can lead to issues with certain scenes in HDR10 videos appearing too blown out or too dark since a TV's tone mapping may not match what a content creator intended. And this is where Dolby Vision and HDR10+ come in.

Dolby Vision and HDR10+ are both "dynamic metadata" HDR formats. This means that HDR brightness and color information can be detailed on a scene-by-scene or even shot-by-shot basis. As a result, Dolby Vision and HDR10+ videos can provide more detailed tone mapping instructions to a TV so that the creators' original intent is more accurately preserved. These formats are supported on select TVs, discs, and streaming services.
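The practical difference is easiest to see with numbers. With static metadata, a TV must tone-map every scene against the single title-wide peak; with dynamic metadata, each scene is mapped against its own peak, so scenes that already fit within the display's range can pass through untouched. A simplified sketch (straight linear scaling for illustration; real tone mapping uses nonlinear roll-off curves):

```python
def scale_to_display(nits: float, source_peak: float, display_peak: float) -> float:
    """Naive linear tone mapping: shrink the range only when the
    assumed source peak exceeds what the display can show."""
    if source_peak <= display_peak:
        return nits  # content already fits; show it as mastered
    return nits * display_peak / source_peak

DISPLAY_PEAK = 1000.0  # a typical premium HDR TV
TITLE_PEAK = 4000.0    # title-wide peak from static metadata (e.g., MaxCLL)
SCENE_PEAK = 300.0     # this scene's own peak, as dynamic metadata carries it

# A 300-nit candlelit scene, tone-mapped two ways:
static = scale_to_display(300.0, TITLE_PEAK, DISPLAY_PEAK)   # 75.0: dimmed needlessly
dynamic = scale_to_display(300.0, SCENE_PEAK, DISPLAY_PEAK)  # 300.0: shown as mastered
print(static, dynamic)
```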

Finally, HLG is an HDR format used for TV broadcasts. This format does not use metadata at all, and it's backward compatible with SDR displays, so broadcasters can send an HLG signal to all customers and have it look correct on both SDR and HDR TVs. If an HDR TV broadcast were sent to an SDR TV using any of the other HDR formats we've discussed, it would display with inaccurate colors and contrast.
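HLG's backward compatibility comes from its hybrid transfer curve, defined in ITU-R BT.2100: the lower half of the signal range follows a conventional gamma-style square-root curve that SDR TVs already interpret correctly, while the upper half switches to a logarithmic curve that carries the extra highlight range for HDR TVs. A sketch of the HLG OETF:

```python
import math

# ITU-R BT.2100 HLG OETF constants
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light (0.0-1.0) to an HLG signal value.

    Below 1/12 of peak, the curve is a square root, close to what SDR
    gamma expects; above it, a log curve packs in the HDR highlights.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

# The two halves meet smoothly at signal level 0.5, and peak scene
# light maps to a signal value of ~1.0.
print(round(hlg_oetf(1 / 12), 3))  # 0.5
print(round(hlg_oetf(1.0), 3))     # 1.0
```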

Are Dolby Vision and HDR10+ better than HDR10?

Yes, Dolby Vision and HDR10+ can both deliver a more accurate high-dynamic-range image than the standard HDR10 format. However, their improvements are often subtle and are best appreciated on entry- and midlevel HDR TVs, which can benefit the most from their dynamic tone mapping instructions.

Popular HDR TV models typically include support for one or both of these dynamic metadata formats. But in most cases, we don't think it should be a dealbreaker if one is included and the other isn't.

Is Dolby Vision better than HDR10+?

Dolby Vision and HDR10+ offer the same primary benefits compared to HDR10, and neither format has a major technical advantage over the other. However, Dolby Vision has an edge when it comes to industry support. 

Six of the seven major TV brands in the US sell TVs with Dolby Vision capabilities, while five of those seven support HDR10+. Here's a chart detailing which TV brands sell models with Dolby Vision and HDR10+ support.

Brand       | Dolby Vision | HDR10+ | HDR10
Hisense     | ✓            | ✓      | ✓
LG          | ✓            |        | ✓
Panasonic   | ✓            | ✓      | ✓
Samsung     |              | ✓      | ✓
Sony        | ✓            |        | ✓
TCL         | ✓            | ✓      | ✓
Vizio       | ✓            | ✓      | ✓

When it comes to content, Dolby Vision movies and TV shows are also more prevalent than HDR10+ programs. More studios support Dolby Vision on Ultra HD Blu-ray discs, and more streaming services use it. Here's a breakdown of Dolby Vision and HDR10+ support among major streaming services.

Streaming service   | Dolby Vision | HDR10+ | HDR10
Amazon Prime Video  | ✓            | ✓      | ✓
Apple TV Plus       | ✓            | ✓      | ✓
Disney Plus         | ✓            |        | ✓
HBO Max             | ✓            |        | ✓
Hulu                | ✓            | ✓      | ✓
Paramount Plus      | ✓            | ✓      | ✓
Peacock             | ✓            |        | ✓

Do video games support HDR?

Select PS5 games, like Marvel's Spider-Man, support native HDR output.

HDR gaming is supported on PS4, PS4 Pro, PS5, Xbox One X/S, Xbox Series X, Xbox Series S, and compatible computers. However, only select games are designed with native HDR output.

Windows PCs, Xbox Series X, and Xbox Series S can support games using HDR10 and Dolby Vision formats, while the PS5 is currently limited to HDR10 only.

If you're using a PC, you'll need an operating system, monitor, and graphics card that all support HDR to display HDR games properly.

Read the original article on Business Insider
