Gamma: The legacy light patch

Gamma, gamma space, linear space, gamma correction …

What is that confusing gamma?

The reason why gamma began to exist:

The pixels of a CRT monitor emit light with a non-linear response to the applied voltage.

The first displays of digital images were CRT monitors: an electron gun shoots the back of a screen, where phosphorescent pixels emit light when hit. More electrons fired means more light emitted. More voltage means more electrons fired. Higher color intensity in the image data means more voltage. A black pixel produces no voltage at all.
Guns are fun, but there is a problem: due to physical limitations, the beam intensity of an electron gun is not proportional to the applied voltage. It roughly follows a power law with an exponent of 2.2.

On the screen, the luminosity of the image appears wrong: the dark shades have become even darker. Play with the slider to see the difference between what is expected (left) and the result (right). A gradient is added to visualize the mid-tone shift.

The solution chosen was to not touch the costly hardware, but simply to compensate for the predicted luminosity loss by encoding more lightness into the dark shades of the image data. This is done by raising each pixel’s color value to the power 0.45, which is the reciprocal of 2.2 (1 / 2.2 ≈ 0.45).
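As a rough sketch (the exact exponent varies with hardware; 2.2 and 0.45 are the values used above, with pixel values normalized to the 0.0 to 1.0 range), the two power curves cancel each other out:

```python
# Minimal sketch of CRT-era gamma correction, assuming the 2.2 / 0.45
# exponents from above and pixel values normalized to the 0.0 - 1.0 range.

def encode_gamma(value, gamma=2.2):
    """Pre-compensate a pixel value before it is sent to the CRT."""
    return value ** (1.0 / gamma)   # power 0.45: brightens the dark shades

def crt_response(voltage, gamma=2.2):
    """Approximate light emitted by the CRT for a given normalized voltage."""
    return voltage ** gamma         # power 2.2: the gun's non-linear response

original = 0.2                      # a dark pixel in the image data
displayed = crt_response(encode_gamma(original))
print(round(displayed, 3))          # ~0.2: the two curves cancel out
```

Without the encoding step, that same 0.2 pixel would come out as 0.2 ** 2.2 ≈ 0.03, which is exactly the “dark shades even darker” problem.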

Now the image’s luminosity looks correct on the screen.

The exponents of both power functions are called “gamma”, and the process of lightness compensation is referred to as “gamma correction”.

The reason why gamma still exists:

The human eye is more sensitive to differences in luminosity within the dark shades.

Biological evolution keeps only what you need. Lighting a torch in broad daylight makes no difference to what you see, but it does at night. The darker it is, the more sensitivity to light we need. This is probably why the human eye has better sensitivity in the dark shades.

The relation between luminance and perceptual lightness approximately follows a power function:

Luminance is the physical intensity of light coming from a surface; it can be measured with instruments. Lightness is the intensity of that luminance as perceived by the human eye: a kind of “perceptual luminance”.

For a sheet of paper in daylight:
If the paper reflects 100% of the light, you will say its color is white. If the paper reflects 0% of the light, you will say its color is black.
But if the paper reflects 50% of the light, you will NOT say its color is middle grey; you will see something far brighter, close to white.
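To put a rough number on it (a simplification: real perceptual models like CIE lightness differ a bit, but the 2.2 exponent used in this article is a decent approximation of the eye’s response):

```python
# Approximate perceived lightness of a sheet of paper reflecting 50% of the light,
# assuming the simple power-law model with the 2.2 exponent used in this article.
luminance = 0.5
perceived = luminance ** (1 / 2.2)
print(round(perceived, 2))   # ~0.73: a light grey, much closer to white than to middle grey
```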

[Relieved Rick and Morty GIF]

We just went through an inter-dimensional portal into a world where gamma is still unknown in color management and CRT monitors don’t exist. The first display just invented here is an LCD monitor, and it can render luminosity data with great accuracy, without any correction: its response to color data is linear.
As in our original dimension, each color channel of each pixel of a digital image is encoded on 8 bits. This creates a range of 256 tones of light, many more than the eye is capable of discerning.

A digital camera takes a picture of this painting:

"gradient" artist unknow

And here is the result on the perfectly linear LCD monitor:

on the LCD monitor

The general appearance looks good, but something is wrong in the low tones. It looks like there is data loss.
The problem comes from the way the light values are encoded: the human eye discerns dark tones very well, and the encoding “depth” is not sufficient at these low luminance levels.
The painting shows a linear gradient, but since we perceive lightness in a non-linear way, the original luminance of the gradient is lower than what we see. The encoding lacks precision in the low tones, and most of the data is wasted on bright shades that can’t even be told apart.

original luminance

These artifacts have catastrophic effects on dark pictures:

A first solution would be to encode pixels with a higher bit depth, but computers at this point aren’t powerful enough. We have to stay with 8 bits.
So the best solution is to encode perceptual lightness, not luminance. This is done by raising each luminance value to the power 0.45, then writing it into the 8-bit depth.
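A quick back-of-the-envelope count (assuming a pure power law with exponent 0.45, a simplification of real standards like sRGB) shows how much this helps the dark tones:

```python
# Count how many of the 256 codes fall in the darkest 10% of the luminance range,
# comparing a linear encoding with a gamma (power 0.45) encoding.
# Assumes a pure power law; real standards such as sRGB differ slightly.

def codes_below(luminance_threshold, gamma):
    encoded = luminance_threshold ** gamma   # value actually written into 8 bits
    return int(encoded * 255) + 1            # codes from 0 up to that encoded value

print(codes_below(0.1, gamma=1.0))    # linear encoding: ~26 codes for the dark tones
print(codes_below(0.1, gamma=0.45))   # gamma encoding:  ~91 codes for the same tones
```

The gamma-encoded image spends roughly three and a half times more codes on the tones the eye discerns best, instead of wasting them on bright shades that all look alike.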

This is the same process as gamma correction for CRT monitors, and that is just a coincidence.
The LCD screen now only has to “gamma reverse” the image data before displaying it, otherwise the image would look too luminous. This is how standard LCD displays work in our present.

Today, gamma correction is one of the standardized characteristics used to read and write digital images, as part of a “color space” format.
The most widely used on the internet is the sRGB format. It uses an 8-bit depth, which saves memory space and bandwidth.
Formats with more bits exist, for higher-quality image production.
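For reference, the real sRGB transfer functions are close to the simple 2.2 / 0.45 power curves used in this article, but they add a small linear segment near black. A sketch of both directions:

```python
# sRGB transfer functions (IEC 61966-2-1), for channel values normalized to 0.0 - 1.0.
# They approximate an overall gamma of about 2.2, with a short linear segment near black.

def srgb_encode(linear):
    """Linear luminance -> sRGB-encoded value (what gets stored in the file)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded):
    """sRGB-encoded value -> linear luminance (what the display should emit)."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

print(round(srgb_encode(0.5), 3))                 # ~0.735
print(round(srgb_decode(srgb_encode(0.5)), 3))    # ~0.5: the round trip recovers the luminance
```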

This embedded Unity application can help visualize how the basics of gamma work:

(works better on a desktop computer)

In software, image manipulation must be done in linear space to remain physically correct. That means calculations must be performed on the physical luminance side of the image, not on the perceptual lightness side.
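A simple example of why this matters is averaging a black pixel and a white pixel (the 2.2 / 0.45 curves below are the simplified gamma used throughout this article, not the exact sRGB curves):

```python
# Averaging a black (0.0) and a white (1.0) pixel in gamma space vs. linear space,
# using the simplified 2.2 / 0.45 power curves from this article.

black, white = 0.0, 1.0

# Wrong: averaging the gamma-encoded values directly.
gamma_space_mix = (black + white) / 2       # 0.5 as stored in the file
luminance_wrong = gamma_space_mix ** 2.2    # ~0.22 once displayed: too dark

# Right: decode to linear, average the physical light, then re-encode.
luminance_right = (black ** 2.2 + white ** 2.2) / 2   # 0.5 of the physical light
encoded_right = luminance_right ** (1 / 2.2)          # ~0.73 as stored in the file

print(round(luminance_wrong, 2), round(luminance_right, 2))   # 0.22 0.5
```

Blending in gamma space produces a mix that physically emits only about 22% of the light, which is why mid-tones turn muddy when software blurs, blends, or resizes in gamma space.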

Gimp allows image manipulation in linear space by default, Photoshop and Krita default to gamma space, and Blender’s paint tools are locked to gamma space. You can track the status of this problem, which I reported to the Blender community, here.

For rendering images with Unity, there are two color space modes: gamma and linear.
- The gamma mode calculates lighting and shaders the old-school way, giving approximate results.
- The linear mode will “gamma reverse” textures, apply lighting, shader effects, and post effects in a fully linear pipeline, then re-apply gamma correction for the final render. This gives more realistic results at the cost of more calculation time.

The main reason why gamma will continue to exist:

Backward compatibility.

Most of the existing images on the internet are gamma corrected, and most existing displays are designed to account for that gamma correction. You can’t change one side without breaking the other. We have to keep using gamma, at least on the internet, for as long as humans see in a non-linear way.

The full science of recording and displaying light is complex. It involves biological processes, physical measurements, eye perception, human subjectivity, hardware and software, norms and standardization…

It’s a white rabbit hole, and it can quickly make us feel stupid if we try to go too deep down. Plus, there’s no light down there.
