New technologies require us to become familiar with a long list of technical concepts, and buying an LED display is no exception. This type of product comes surrounded by terms that are not always easy to understand: 4K, 8K, HD, Dolby Vision, pixel pitch… We know how complex this can be. That’s why, just as we have already covered how to choose the right pixel pitch for an LED display, today we want to focus on another very important aspect: what are nits, and why do they matter so much?
Although image quality is essential in advertising, both for the professionals and advertisers who produce the message and for the people who receive it, not everyone knows what it takes to achieve it. There is no single answer, but there is no doubt that nits play a fundamental role. Do you want to know what they are? If so, we recommend you read on.
The term nit comes from the Latin verb nitere, meaning ‘to shine’. In imaging, a nit – also known as a candela per square meter (cd/m²) – is the unit of luminance adopted by the International System of Units (SI). As the name suggests, this unit combines two quantities, the candela and the square meter: the luminous intensity and the emitting area, respectively. As a measure of the light emitted from a given surface, it is commonly used to specify the brightness of a screen or monitor.
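The relationship just described can be sketched in a few lines of Python; the function name and sample values are ours, chosen purely for illustration:

```python
# Luminance in nits (cd/m^2) = luminous intensity (cd) / emitting area (m^2).
def luminance_nits(intensity_cd: float, area_m2: float) -> float:
    return intensity_cd / area_m2

# By definition, one candela spread over one square meter is one nit.
print(luminance_nits(1.0, 1.0))    # 1.0
# A panel emitting 500 cd over a 1 m^2 surface would be a 500-nit display.
print(luminance_nits(500.0, 1.0))  # 500.0
```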
But why is it called a candela? Because the luminous intensity of an average candle is taken as the reference. That amount of light, spread over one square meter, is one nit; consequently, the higher the number of nits, the brighter the screen surface.
Some LED display manufacturers specify the maximum brightness of each of their monitors in nits. This value reflects a display’s ability to keep image details perceptible, especially in brightly lit environments.
Typically, a cinema screen reaches about 50 nits, while an older home television may offer between 100 and 400 nits. Modern TVs can go much brighter: HDR televisions – HDR stands for high dynamic range, as we’ll see a little further down – can reach around 1,500 nits. In the coming years, we will probably see even higher light outputs. In fact, two years ago a well-known Japanese multinational presented a prototype screen that reached a maximum luminance of 10,000 nits, a figure far above today’s consumer values and the upper limit that current HDR technologies are designed for. In the case of LED signs and signboards, it is possible to reach up to 40,000 nits.
HDR video, meanwhile, is mastered to a maximum of 4,000 nits, the peak brightness of the Dolby Pulsar mastering monitor, a product used only at a professional level. For comparison with PC monitors, the DisplayHDR 1000 certification targets a peak of 1,000 nits. More generally, computer screens usually sit around 300 nits, a low-end mobile phone around 600 nits, and a high-end one between 700 and 900 nits.
Now that we’ve seen what nits are, if you’re familiar with LED display technology, the following question probably arises: how are they different from lumens? Both concepts relate to the light emitted by a display, which can create confusion. Let’s look at the differences below.
The lumen, from the Latin word for ‘light’, is the SI unit of luminous flux. One lumen is the luminous flux emitted within a solid angle of one steradian by a point source with a luminous intensity of one candela. In practice, lumens describe the amount of light a device emits to illuminate something else – such as a light bulb or a fluorescent lamp – while nits describe the brightness of an object the user looks at directly, such as a television or a mobile phone. So if we want to know how a light-emitting device will look in daylight, we need to know its number of nits.
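To make the lumen–nit distinction concrete, here is a small Python sketch. It assumes an idealized isotropic point source (one that radiates equally in all directions), so the numbers are illustrative rather than a property of any real lamp or screen:

```python
import math

# Luminous flux (lumens) of an isotropic point source:
# flux = intensity (cd) x solid angle; a full sphere is 4*pi steradians.
def lumens_isotropic(intensity_cd: float) -> float:
    return intensity_cd * 4 * math.pi

# Luminance (nits) depends instead on the size of the emitting surface.
def nits(intensity_cd: float, area_m2: float) -> float:
    return intensity_cd / area_m2

# A 1 cd source emits about 12.57 lm in total...
print(round(lumens_isotropic(1.0), 2))  # 12.57
# ...but its brightness in nits depends on how large the glowing area is.
print(nits(1.0, 0.01))  # 100.0
```

The point of the sketch: lumens total up light leaving a source in every direction, while nits normalize intensity by the area you are actually looking at.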
Having cleared up this question about lumens, let’s return to nits to expand on two concepts we have already mentioned: HDR and Dolby Vision. Put very briefly, these are two technologies that allow a display to show a more pronounced difference in brightness between the darkest and the brightest parts of an image. Since the human eye is very sensitive to variations in light intensity, we perceive these changes immediately.
The format is divided into several categories, which are specified below.
– HDR10. This is the static high dynamic range format: the same image settings apply throughout the entire video content.
– HDR10+. This is the dynamic format: it allows frame-by-frame calibration for higher-quality visual content.
– Dolby Vision or HDR Dolby Vision. This is the format that delivers the highest quality, partly thanks to human intervention during mastering, which is why it is considered an enhanced version of HDR10+. With 12-bit encoding, Dolby Vision delivers 4,096 tones per color channel and carries metadata associated with each image, ensuring the most accurate image reproduction possible. The system is designed to cover a light range from 0 to 10,000 nits.
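The “4,096 tones” figure follows directly from the bit depth: an n-bit encoding can represent 2^n levels per color channel. A quick Python check (the function name is ours, for illustration):

```python
# Number of distinct tones per color channel for a given bit depth.
def tones_per_channel(bits: int) -> int:
    return 2 ** bits

print(tones_per_channel(10))  # 1024  (10-bit HDR10 / HDR10+)
print(tones_per_channel(12))  # 4096  (12-bit Dolby Vision)
```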
At this point, only one question remains: how many nits does an LED display need for optimal image quality? To calculate this, you need to know whether it is an outdoor LED display or an indoor LED display, and at what distance, and from which location, the content will be viewed.
Do you want to know more about LED displays? In that case, do not hesitate to contact us by phone, calling (+34) 977 271 074; by e-mail, sending us a message to email@example.com, or by completing our form. We will be happy to assist you in an honest and professional way.