The world around us is more or less colorful. At least that is how we living beings perceive it (some one way, others differently). But what do we actually see, and how do we see it? If we limit ourselves for a start to the physical properties of light: what we perceive with human eyes and call "visible light" is electromagnetic radiation with wavelengths of approximately 400 to 700 nm. This part of the spectrum is perceived by our eyes as light, and as the wavelength changes across this range, we perceive it as a continuous series of color shades.
Thus at the lower end of the visible spectrum we find violet light (approx. 400 nm), and at the other end red light (approx. 700 nm). In between lie all the other colors of the "rainbow". Due to our genetic make-up, our eyes cannot perceive other kinds of radiation, such as infrared (IR) or ultraviolet (UV). Some animals, however, do have receptors in their eyes for this radiation, which helps them, for example, when stalking prey or hiding from predators.
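To make the numbers a little more concrete, here is a minimal Python sketch (not from the article) that sorts a wavelength in nanometres into a rough color band; the band boundaries are approximate, commonly quoted values, not exact physical limits.

```python
def spectral_band(wavelength_nm: float) -> str:
    """Return a rough color name for a wavelength of visible light."""
    if wavelength_nm < 400 or wavelength_nm > 700:
        return "outside the visible range (UV below 400 nm, IR above 700 nm)"
    # Approximate upper boundaries of the familiar "rainbow" bands.
    bands = [
        (450, "violet"),
        (495, "blue"),
        (570, "green"),
        (590, "yellow"),
        (620, "orange"),
        (700, "red"),
    ]
    for upper, name in bands:
        if wavelength_nm <= upper:
            return name
    return "red"

print(spectral_band(410))  # violet, near the lower edge of the visible spectrum
print(spectral_band(690))  # red, near the upper edge
print(spectral_band(850))  # infrared: invisible to us, but not to some animals
```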
Do we really see what we see?
Of course, the sensation of a color is primarily a product of our brain: the retina sends electrical impulses to our "processor", i.e. the brain. In the retina we have cells that are sensitive to light, called cones and rods. Cones come in three types and are responsible for the perception of the colors red, green and blue (RGB); all intermediate shades are a combination of the intensities of these three, and a camera sensor works in a similar way. Rods, on the other hand, respond to brightness itself (light versus dark) and are not sensitive to color. They take over in low light, which is why we then see a much poorer palette of colors and everything looks rather gray. At dusk it is therefore hard to make out the colors of an object, and we cannot easily tell whether it is red or green.
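The idea that every intermediate shade is just a combination of red, green and blue intensities can be illustrated with a short Python sketch; the 0–255 scale and hex notation are assumptions taken from typical digital image formats, not something the article specifies.

```python
def mix_rgb(red: int, green: int, blue: int) -> str:
    """Combine three channel intensities into one hex color code."""
    return f"#{red:02x}{green:02x}{blue:02x}"

print(mix_rgb(255, 0, 0))      # pure red
print(mix_rgb(255, 255, 0))    # red + green together are perceived as yellow
print(mix_rgb(128, 128, 128))  # equal, dim intensities: a neutral gray,
                               # roughly what the rods alone would report
```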
Because the perception of light depends on our brain, which adapts remarkably well to different situations, it is difficult to build a standard on such subjective perception. For the purposes of reproduction, numerical standards based on the physical properties of light were therefore developed. Such a standard is built into every digital camera. It does not, however, match our subjective impression, which is why we are often surprised when the beautiful colors we saw in nature come out washed out in the captured image or, on the contrary, oversaturated.
We forget that the sensor is a machine which, with mathematical accuracy, records the wavelength of light ALWAYS THE SAME, while the brain adapts to the situation and changes the impression. Take, for example, a situation where we are surrounded by slightly reddish light (a classic incandescent bulb). Under this light a sheet of paper looks white to us, as it should, yet photographed with a camera it will look pink. The brain has simply found a NEW REFERENCE FOR WHITE, and that is how we perceive the paper: as white. So we see that we cannot entirely trust our eyes.
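What "finding a new reference for white" means numerically can be illustrated with a simple Python sketch of channel rescaling, a hypothetical, much-simplified form of white balance; the sample pixel values below are made up purely for illustration.

```python
def white_balance(pixel, white_reference):
    """Scale R, G, B so that the chosen reference pixel maps to neutral white."""
    return tuple(
        min(255, round(value * 255 / ref))
        for value, ref in zip(pixel, white_reference)
    )

# A sheet of paper under a reddish incandescent bulb, as the sensor records it
# (hypothetical pinkish reading):
paper_as_shot = (250, 215, 190)

# After rescaling, the paper is white again, much as our brain perceives it:
print(white_balance(paper_as_shot, paper_as_shot))    # (255, 255, 255)
print(white_balance((180, 150, 130), paper_as_shot))  # a mid-gray object, now neutral
```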