What is HDR?
We explain the principles of HDR screens and why it’s set to take the world by storm
HDR stands for high dynamic range. In the most basic sense, it means richer colours, higher contrast between darks and lights, and an overall more lifelike reproduction of the real world on a screen.
It works on the premise that our eyes can detect brighter whites and darker blacks than a TV or screen normally provides, and can also interpret a wider range of colours than the standard Rec. 709 format covers.
What makes HDR, HDR?
HDR screens take into account a number of factors, including contrast and colour.
Contrast refers to the difference between light and dark: the brighter the lights and the darker the darks (and the greater the gap between the two), the more dynamic range.
TV brightness is measured in units called nits. To give you an idea of the level of brightness involved, one nit is roughly the light a single candle gives off (technically, one candela per square metre). An HDR display must meet minimum levels of brightness and darkness for it to be regarded as HDR.
The guidelines specify that, for a screen to be regarded as HDR, it must offer either more than 1,000 nits of peak brightness with a black level below 0.05 nits, or more than 540 nits of peak brightness with a black level below 0.0005 nits.
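As a rough sketch, the two tiers above can be expressed as a simple check. This is a hypothetical helper for illustration only; real certification involves far more than these two numbers:

```python
def qualifies_as_hdr(peak_nits: float, black_nits: float) -> bool:
    """Check a display against the two HDR brightness tiers described above.

    Tier 1 (typically LCD): more than 1,000 nits peak, black level below 0.05 nits.
    Tier 2 (typically OLED): more than 540 nits peak, black level below 0.0005 nits.
    """
    lcd_tier = peak_nits > 1000 and black_nits < 0.05
    oled_tier = peak_nits > 540 and black_nits < 0.0005
    return lcd_tier or oled_tier

# A bright LCD panel passes on the first tier
print(qualifies_as_hdr(1100, 0.04))   # True
# An OLED with modest brightness but near-perfect blacks passes on the second
print(qualifies_as_hdr(600, 0.0001))  # True
# A conventional SDR panel fails both
print(qualifies_as_hdr(350, 0.1))     # False
```

Notice that the second tier trades peak brightness for much deeper blacks, which is why OLED panels can qualify despite being dimmer than LCD rivals.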
Colour is another crucial element of the HDR standards. Colour precision is measured in bit depth, and for a screen to be regarded as HDR it must have a bit depth of 10, which means it can display just over a billion individual colours. A standard Blu-ray disc, by comparison, uses 8-bit colour, which works out to around 16.7 million colours. HDR displays must also meet colour-gamut requirements - specifically, coverage of at least 90% of the DCI-P3 colour gamut, with signals carried using the BT.2020 colour representation.
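The jump from 8-bit to 10-bit colour is easy to verify: with three colour channels (red, green and blue), the number of displayable colours is 2 raised to the power of three times the bit depth. A quick sketch (the helper name is our own, for illustration):

```python
def colours_for_bit_depth(bits_per_channel: int) -> int:
    """Total colours for an RGB display with the given bits per channel."""
    return 2 ** (3 * bits_per_channel)

print(f"{colours_for_bit_depth(8):,}")   # 16,777,216 - roughly 16.7 million (Blu-ray)
print(f"{colours_for_bit_depth(10):,}")  # 1,073,741,824 - just over a billion (HDR)
```

So two extra bits per channel multiplies the palette by 64, which is where the "billion colours" figure comes from.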
Content is key
Like 4K, HDR content needs to be recorded in that particular format, and the standard to look out for at the moment is HDR10, although Dolby has introduced its own variation called Dolby Vision. Samsung has also introduced HDR10+, making everything a little more complicated. Hybrid Log-Gamma (HLG) is yet another standard, while Advanced HDR is the newest option, just to confuse things even more.
What's important to note, though, is that these formats all chase the same goal - different content providers have simply opted for different technologies. For example, Amazon Prime Video uses Samsung's HDR10+, while Netflix supports both HDR10 and Dolby Vision. What you will need to ensure is that your screen supports the standard you're most likely to be using.
Some TVs, such as LG's HDR range, will support all standards, but others may only support a few of them.
Why is HDR better than 4K?
4K refers to the resolution of a screen in pixels, while HDR is about the quality and richness of the content being displayed. Although 4K theoretically means sharper detail (just think about the difference between standard definition and HD when HD TVs were first released years ago), when standard 4K content is put side by side with HDR content, HDR should reveal more detail, because the wider range of colour and contrast makes that detail far more visible.
Combine the two and 4K HDR is set to blow your mind.
Clare is the founder of Blue Cactus Digital, a digital marketing company that helps ethical and sustainability-focused businesses grow their customer base.
Prior to becoming a marketer, Clare was a journalist, working at a range of mobile device-focused outlets including Know Your Mobile before moving into freelance life.
As a freelance writer, she drew on her expertise in mobility to write features and guides for ITPro, as well as regularly writing news stories on a wide range of topics.