What you know about an object influences how you see it

  • The ability to perceive objects accurately allows people to interact effectively with their surrounding environment.
  • Processing visual information involves several areas of the brain, yet how this occurs is not fully understood.
  • Recent research from George Washington University has found that what a person knows about an object may determine which brain pathways are used to process it visually, shaping what they see.
  • Study authors suggest their findings could have important implications for medical displays, product design, and technologies including augmented reality.

How individuals perceive what they see, hear, taste, or smell is incredibly diverse.

For instance, when observing a cloud-filled sky, one person may see intricate shapes that resemble animals or objects, while another only sees clouds.

Research investigating why humans perceive visual input differently is limited, but scientists are gaining a deeper understanding of visual processing and how it relates to the way individuals perceive and act on visual stimuli.

Recently, researchers from George Washington University’s Attention and Cognition Laboratory uncovered clues about how the visual system processes an object and where in the brain this processing takes place.

Specifically, they found that an object’s purpose influences where visual processing occurs in the brain, and that knowledge of and experience with the object may affect how well it is perceived.

The findings suggest that what a person knows about an object directly influences perception.

Their research appears in the journal Psychological Science.

Investigating visual processing pathways

Visual object perception may involve several areas in the brain.

Study authors Dick Dubbelde, a recent Ph.D. graduate and adjunct professor at George Washington University, and Sarah Shomstein, Ph.D., professor of cognitive neuroscience in GWU’s Department of Psychological & Brain Sciences, told Medical News Today:

“Usually, when we talk about vision, especially for more complex processes like object recognition, we’re talking about the occipital lobe, the inferior temporal lobes, and parts of the parietal lobe.”

What’s more, previous research from 2016 suggests that the visual perception process may involve two separate but interacting pathways in the brain — the dorsal and ventral pathways.

The ventral pathway is believed to be responsible for identifying an object, while the dorsal pathway helps determine where it is or how to use it. Yet it is less clear whether an object’s behavioral relevance influences which pathway is used to process it.

The study authors hypothesized that a manipulable object, such as a tool, is processed through the dorsal pathway, which has higher temporal resolution, while a non-manipulable item, such as a potted plant, is processed through the ventral pathway, which has higher spatial resolution.

What you know changes what you see

To test their theory, the researchers conducted five experiments investigating spatial and temporal resolution across manipulable and non-manipulable objects in college-age adults.

The participants viewed images of easily manipulable objects, including a snow shovel, coffee mug, and screwdriver, and non-manipulable items such as a potted plant, water fountain, and fire hydrant.

The scientists used gap-detection and object-flicker-discrimination tasks, which tap spatial and temporal resolution respectively, to infer which processing pathways the participants were using while they observed the images.

After compiling the data, the researchers found that when participants recognized an object as a tool, they perceived it faster but in less detail. In contrast, when they identified an item as a non-tool, they perceived it more slowly but in greater detail.

However, when the scientists made the items less recognizable by turning them upside down, the differences in speed and detail vanished.

The results suggest that what an individual knows or understands about an object determines where and how quickly it is visually processed in the brain.

Implications of the research

For humans, rapidly determining if an object is a tool may be critical for survival.

Dubbelde and Prof. Shomstein explained:

“Tools are important to us as organisms. One of the most important things to us humans is how we can manipulate things with our hands, and so based on studies like this one, it seems that we process objects which often occur near our hands in a different way than objects which don’t often occur near our hands in order to best facilitate interacting with those objects.”

In addition, Dubbelde and Prof. Shomstein believe their research “has some real implications for how we display information in augmented reality displays.”

“There are some real applications for augmented reality in giving you real-time information as you need it, but as we start to incorporate this sort of technology into our lives, we have to keep in mind that different sorts of stimuli, like the difference that we’ve shown between tools and non-tools, can alter your perception in subtle ways,” the authors said.

“If you’re doing a high-risk task, like driving a car or even something like surgery, then something like the icon you choose to represent the site of the scalpel or position of the drone can slow down neural processing enough to cause a traffic accident or worse.”

— Dick Dubbelde and Prof. Sarah Shomstein, co-authors of the study

Other factors involved in visual perception: ‘The dress’ phenomenon

In 2015, differences in individual perception were brought to the forefront when a Twitter post questioning the color of a dress attracted intense attention and debate among viewers. The tweet showed an image of a blue and black dress with the caption, “my house is divided over this dress.”

What followed was a viral phenomenon.

“The dress” received more than 4.4 million tweets over the course of 2 days, with wildly different perspectives on color. According to research from 2015, out of 1,401 people surveyed, 57% described the dress as blue and black, 30% described it as white and gold, 11% as blue and brown, and 2% as something else.

Another study, from 2017, investigated “the dress” and suggested that the differences in color perception may be due to viewers’ assumptions about lighting conditions.

MNT asked Dubbelde and Prof. Shomstein if beliefs about environmental factors can also affect the perception of an object, to which they replied:

“Absolutely. This study touches on a concept in cognitive psychology called ‘affordances’ which are the things that you know you can do with an object.”

The study authors further explained:

“When you see some tool like a hammer, you are not just seeing the colors and values which make up that image, you are also starting to process how you can interact with that object. Things like lighting can affect affordances, most obviously in making things harder to recognize, but also it changes how you relate to the hammer.”

Assumptions about environmental factors are not the only influence. Dr. Julian C. Lagoy, a board-certified psychiatrist with Mindpath Health, told MNT:

“Our education and upbringing [have] a huge influence on how we perceive objects around us. For instance, an engineer will view the world differently compared to an artist. Our education, upbringing, and overall knowledge [have] a huge influence on how each human being perceives their environment.”

Emotions may also play a role in object perception. Prof. Shomstein and Dubbelde noted:

“There are known connections between these purely ‘visual’ regions and the parts of the brain which we tend to consider emotional, such as the amygdala. Most areas within the neural network of visual processing are interconnected, and the amygdalae play a role, although perhaps not a primary one, in object recognition.”
