Back to class: neuroscience of photography

You might not be an undergrad anymore, but you can still give learning the old college try. At the University of Richmond, L. Andrew Bell, Ph.D., teaches a first-year seminar called "Neuroscience of Photography." We asked for some insight.

***

“A picture is worth a thousand words” – what does this old saying tell us about our visual brains?

The visual brain takes the basic building blocks of a photo – color, lines and shapes – and translates them into rich experiences that evoke memories and emotions.

This experience is the result of millions of interconnected neurons communicating with each other. For example, neurons (photoreceptors) in the eye’s retina sense light, and neurons in the brain’s inferior temporal cortex recognize faces.

The visual brain synthesizes information not only about what is in a photograph but also about the feelings we have about its contents. It is the cognitive power of the visual brain that enables a photo to convey so much information, which underpins the adage that "a picture is worth a thousand words."

Is there a sequence or method to how our brains process an image?

The mental construction of an image depends on the coordination of two types of visual processing: “bottom-up” and “top-down” processing.

Bottom-up processing builds an image up from basic visual elements, such as lines and shapes. Top-down processing refers to applying higher-level “thinking,” as in attaching meaning to objects.
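
As a rough illustration of how the two streams might combine, here is a minimal Python sketch. The ambiguous-letter setup and the probabilities are invented for the example and are a caricature, not a model of the brain: bottom-up evidence alone cannot decide between “H” and “A,” but top-down expectation from the surrounding word settles it.

    # Bottom-up: how well the raw lines and shapes match each candidate letter.
    # Here the character is ambiguous, so the evidence is split evenly.
    bottom_up_likelihood = {"H": 0.5, "A": 0.5}

    # Top-down: expectations supplied by context. In "T_E" a reader expects "H";
    # in "C_T" a reader expects "A". These numbers are made up for the demo.
    top_down_prior = {
        "T_E": {"H": 0.9, "A": 0.1},
        "C_T": {"H": 0.1, "A": 0.9},
    }

    def interpret(context):
        """Combine evidence with expectation (posterior is likelihood times prior)."""
        scores = {
            letter: bottom_up_likelihood[letter] * top_down_prior[context][letter]
            for letter in bottom_up_likelihood
        }
        total = sum(scores.values())
        return {letter: score / total for letter, score in scores.items()}

    print(interpret("T_E"))  # "H" dominates, so the string reads as "THE"
    print(interpret("C_T"))  # "A" dominates, so the string reads as "CAT"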

Tell us about perception vs. reality in how we might interpret an image.

Something I always tell my students is that reality – the physical nature of something like a photograph – is uniquely experienced because everyone’s algorithm for perception is unique.

This unique perception is attributed to both physical and experiential differences. Physical differences can include the number of color-sensitive cones in the eye’s retina – deficiencies in these cones lead to colorblindness. Experiential differences can include, for example, whether you’ve seen the Grand Canyon or felt the warm embrace of your grandmother.

The act of perception is creating an internal representation of the external world. This internal representation, sometimes referred to as your “internal model,” is your cognitive fingerprint, and it’s uniquely yours.

Is there a misunderstanding about how our brains work visually?

In many textbooks, the visual system is illustrated as a serial circuit: It passes information downstream toward a terminal destination (perception) – similar to the way cameras work (light, then lens, then sensor).

In reality, the visual system is full of reciprocal connections among interconnected functional areas of the brain. This creates feedback loops for visual information to be refined and processed.

The bottom-up and top-down processing that I mentioned above do not work as if they were lined up like dominoes. Instead, the two forms of processing heavily influence each other.
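
One way to picture those loops is a toy estimation routine in Python. Everything here is an illustrative assumption rather than the brain’s actual algorithm: a top-down guess is repeatedly checked against noisy bottom-up input, and the mismatch (the prediction error) feeds back to refine the guess on the next pass.

    import random

    random.seed(0)

    true_brightness = 0.8    # the state of the outside world
    estimate = 0.2           # the initial top-down expectation
    learning_rate = 0.5      # how strongly each error updates the estimate

    for step in range(6):
        # Bottom-up: a noisy sensory sample of the world.
        sample = true_brightness + random.gauss(0, 0.05)
        # Feedback: compare expectation with evidence...
        prediction_error = sample - estimate
        # ...and let the mismatch refine the internal estimate.
        estimate += learning_rate * prediction_error
        print(f"pass {step + 1}: estimate = {estimate:.2f}")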

So how does a camera lens compare to the human eye?

The function of lenses – in cameras and our eyes – is essentially the same: to resolve an image on a photosensitive surface.

The focal length of a lens dictates the field of view. With cameras, the field of view can vary widely depending on the lens, but the eye has a fixed field of view.

For cameras, focal lengths range from those that capture a wide field (wide-angle lenses) to those that capture a narrow field (telephoto lenses). Swapping lenses used to be the exclusive domain of high-end interchangeable-lens cameras, but new smartphones have multiple cameras and lenses to give users different field-of-view options. For example, the new iPhone 11 Pro has three cameras with different focal-length lenses.
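
For a concrete sense of how focal length dictates field of view, the standard lens geometry can be worked out in a few lines of Python. The formula for a rectilinear lens is well established; the 36 mm “full-frame” sensor width below is an assumed reference point, chosen only so that familiar focal lengths can be compared.

    import math

    SENSOR_WIDTH_MM = 36.0   # assumed "full-frame" sensor width, for comparison only

    def horizontal_fov(focal_length_mm):
        """Horizontal angle of view, in degrees, for a rectilinear lens."""
        return math.degrees(2 * math.atan(SENSOR_WIDTH_MM / (2 * focal_length_mm)))

    # Short focal lengths (wide-angle) take in a wide slice of the scene;
    # long focal lengths (telephoto) take in a narrow one.
    for f in (16, 22, 35, 50, 85, 200):
        print(f"{f:>3} mm lens -> {horizontal_fov(f):5.1f} degrees")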

Typically, a photograph taken with an extreme wide-angle lens (commonly referred to as a “fisheye”) is distorted – objects on the periphery appear larger than they actually are. Our eye actually has a wide-angle lens, too – its focal length is about 22 mm – but our perception isn’t distorted the way the photo is. That’s due to evolutionary pressures on the way we process our visual field.
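
That peripheral stretching can be put in numbers with a short sketch. A straight-line (rectilinear) projection maps an angle theta off the lens axis to a position r = f * tan(theta) on the sensor, so equal slices of the scene occupy more of the image the farther they sit from the center. The specific angles below are arbitrary examples chosen for illustration.

    import math

    def radial_stretch(theta_deg):
        """Radial magnification (outward stretch) at theta degrees off-axis,
        relative to the image center."""
        theta = math.radians(theta_deg)
        return 1 / math.cos(theta) ** 2   # slope of tan(theta), normalized to the center

    for angle in (0, 15, 30, 45):
        print(f"{angle:>2} degrees off-axis: stretched ~{radial_stretch(angle):.2f}x vs. center")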

Our peripheral vision does not shape our perception the way the central 60 degrees of our visual field does. Instead, our peripheral vision is important for detecting motion – like an attacking lion.

A quick demonstration of this effect is to try to read something that is in your peripheral vision – impossible, right? This selective processing means that, effectively, our perceptual field of view is closer to a “normal” focal length of 50 mm.

Trillions of photos are shared electronically every year. What are potential ramifications?

When film was the predominant medium, the photograph was heralded as the “true” representation of reality. But that no longer holds in the current state of digital photography.

Today, photographs are easily manipulated with tools such as Photoshop and even new augmented-reality filters, such as those found in Instagram and Snapchat.

As a result of these perception-altering technologies, there can be a gap between the person who appears in your mirror and the person who appears in your photos. Because our internal model of reality is constantly being revised by experience, these augmented-reality technologies have the potential to greatly impact our self-identity and perception.

It is no surprise that plastic surgeons have seen an increase in patients wanting to change their appearance to match their Snapchat-filtered photos. Even if you aren’t taking selfies and using Facetune, I think it’s useful to understand the basics of photography and our visual system so that you can critically interpret reality.

L. Andrew Bell is a technology consultant in the University of Richmond’s Teaching and Scholarship Hub, where he consults with faculty on integrating digital tools into teaching and scholarship. He also teaches courses in neuroscience and data analysis in the psychology department.
