In Ian Bogost’s Atlantic piece “Your Phone Wasn’t Built for the Apocalypse,” he describes the difficulty of photographing the accurate color of the otherworldly orange skies blanketing the West Coast after devastating wildfires. This phenomenon is also a reminder that human eyes weren’t built to fully account for the apocalypse either, as our brains work very hard to correct for the different colors of light cast on our world.
Many West Coast residents struggled to capture an “accurate” photo of the intensity of the sky, as their phone cameras kept attempting to automatically correct the color to something more reflective of the usual hues expected in a human world. This happens because cameras correct for something called white balance: they remove “color casts,” the tint the light source lends to scenes and objects, so that objects that would appear white in person also appear white in the photo.
As Bogost describes, “Camera sensors are color-blind—they see only brightness, and engineers had to trick them into reproducing color using algorithms… Most cameras now adjust the white balance on their own, attempting to discern which objects in a photo ought to look white by compensating for an excess of warm or cool colors.” As frustrating as this aspect of digital cameras can be, it highlights a very interesting challenge that living beings also face: seeing under varying environmental conditions.
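Bogost doesn’t name the algorithms involved, but one classic heuristic that automatic white balance can be built on is the “gray world” assumption: on average, the colors in a scene should come out neutral gray, so any channel-wide excess of warm or cool color is treated as a cast and divided out. Here’s a minimal sketch of that idea in Python with NumPy (the function and array names are my own, for illustration):

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Scale each color channel so its mean matches the overall mean.

    Assumes `image` is an H x W x 3 array of RGB values in [0, 1].
    The gray-world heuristic guesses that the average color of a scene
    is neutral gray; any channel-wide deviation is treated as a color
    cast and divided out.
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)  # mean R, G, B
    gray = channel_means.mean()                        # target neutral level
    gains = gray / channel_means                       # per-channel correction
    return np.clip(image * gains, 0.0, 1.0)

# A uniformly orange-tinted "scene," like a wildfire sky: the heuristic
# removes the tint entirely, flattening it to neutral gray -- the very
# behavior that frustrated photographers of the orange sky.
orange_sky = np.ones((4, 4, 3)) * np.array([0.9, 0.5, 0.2])
balanced = gray_world_white_balance(orange_sky)
```

Because the example scene is nothing but cast, the “correction” erases exactly the color the photographer wanted to document, which is why phones fought their owners that day.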
Human vision has its own version of a "white balance" where (as we understand it so far) the brain will dial up or down how much of certain wavelengths of light it chooses to perceive, so that the color spectrum still makes sense even if, say, the light cast on a scene is tinted orange or yellow. This human ability to perceive an object's colors correctly regardless of the color of light cast on it is referred to as color constancy.
This effect can be seen in illusions created to show the persistence of the brain in attempting to perceive the full spectrum, like this edited photo of a strawberry tart. Would you believe that there are actually no red or reddish pixels in this image? Because the image appears to be tinted blue, our brain concludes that, by comparison, the grey pixels in the berries must be red.
The illusion gets even stronger when looking at the same image edited to simulate several other colors of light. Indeed, in each of these images, the pixels of the strawberry are still grey.
While an overzealous white-balance function is a liability in camera technology we’re using to accurately document the color of light after a wildfire, the equivalent ability in the human brain is incredibly useful. It ensures we can see color and differentiate objects regardless of the light cast on them at different times of day, under different weather, or beneath the orange sky of a natural disaster.
If color constancy intrigues you, there are several more examples you can test out for yourself by watching this fascinating talk by researcher Beau Lotto on what optical illusions have to teach us about how we see:
So while we find flaws like an overactive white balance in our cameras frustrating, examining how the same behavior shows up even in biological vision might help us appreciate the intentionality and care required to “objectively” observe our changing world.