Eye movement science is helping us learn about how we think

Your eyes may reveal more than you think. True Touch Lifestyle/Shutterstock

For most of human history, if you wanted to know what was going on behind someone’s eyes, you had to make your best guess. But since the 1960s, scientists have been studying how eye movements may help decode people’s thoughts. The ability to eavesdrop on the details of people’s daydreams and internal monologues is still science fiction, but research is helping us learn more about the connections between our eyes and our mental state.

Most recently, research in Germany showed eye tracking could help detect where someone is in their thinking process.

This kind of research is about more than general nosiness. Imagine you are a pilot attempting a tricky maneuver that takes your full concentration. Meanwhile, you’ve missed a flashing alarm that needs your attention. Technology is only helpful if it’s in sync with the way humans think and behave in the real world.

Being able to track thought processes could help avoid life-threatening disconnects between humans and computers. If psychology research on eye tracking were combined with AI, the results could revolutionize computer interfaces and be a game changer for people with learning disabilities.

Eye movement tracking goes back to the 1960s, when the first versions of the technology were developed by the pioneering scientist Alfred Yarbus. Back then, uncomfortable suction caps were attached to participants’ eyes, and reflected light traced their point of focus.

Yarbus found we are constantly shifting our gaze, focusing on different parts of the scene in front of us. With every eye movement, different parts of the scene come into sharp focus while others at the edge of our vision become blurry. We cannot take it all in at once.

The way we sample the scene is not random. In Yarbus’s famous 1967 study, he asked people to look at a painting.

This painting, “They Did Not Expect Him,” was used in Yarbus’s study. Ilya Repin/Wikimedia

He then asked participants questions such as how rich the people were and what the relationship between them was. Different patterns of eye movements emerged depending on the question asked.

Making progress

Since then, infrared cameras and computer programs have made eye tracking easier. In the last few years, research has shown eye tracking can reveal what stage someone is at in their thinking. In cognitive psychology experiments, people are often asked to find an object in an image, like a Where’s Wally puzzle.

People’s intentions influence how their eyes move. For instance, if they are looking for a red object, their eyes will first move to all the red objects in the scene. So, a person’s eye movements reveal the contents of their short-term memory.

The 2022 German study showed eye tracking can distinguish between two phases of thinking: ambient processing, which involves taking in information, and focal processing, which happens in the later stages of problem solving.

In ambient mode, the eyes move rapidly over large distances to form rough impressions of interesting targets; this mode is used for spatial orientation. In focal mode, we fixate on information for longer periods as we process it more deeply.
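The study itself did not publish an algorithm, but the distinction can be illustrated with a minimal sketch: gaze behaviour is labelled ambient or focal from fixation duration and saccade amplitude. The thresholds below are purely illustrative assumptions, not values from the research.

```python
# Illustrative sketch: labelling gaze behaviour as "ambient" or "focal".
# Assumes fixations (duration in ms, x/y in pixels) have already been
# extracted from raw eye-tracking data; thresholds are made up for the example.
from dataclasses import dataclass
from math import hypot

@dataclass
class Fixation:
    x: float            # horizontal gaze position (pixels)
    y: float            # vertical gaze position (pixels)
    duration_ms: float  # how long the eyes rested here

def classify_mode(prev: Fixation, curr: Fixation,
                  short_fixation_ms: float = 250,
                  long_saccade_px: float = 300) -> str:
    """Label the transition into `curr` as ambient or focal processing."""
    saccade_amplitude = hypot(curr.x - prev.x, curr.y - prev.y)
    # Ambient mode: brief fixations joined by large jumps across the scene.
    if curr.duration_ms < short_fixation_ms and saccade_amplitude > long_saccade_px:
        return "ambient"
    # Focal mode: longer dwell times with small, local eye movements.
    return "focal"

fixations = [
    Fixation(100, 120, 180), Fixation(900, 600, 210),  # quick jumps across the scene
    Fixation(880, 590, 450), Fixation(875, 600, 520),  # settling on one area
]
for prev, curr in zip(fixations, fixations[1:]):
    print(classify_mode(prev, curr))
```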

Before this study, these changes in gaze patterns had been studied in the context of changes in a visual stimulus. The German study was one of the first to find that our eyes switch between these patterns of movement in response to a thought process.

The test subjects were asked to assemble a Rubik’s cube according to a model. Although the visual stimulus did not change, participants’ eye movements showed they were in ambient mode while taking in information. The pattern of their eye movements then switched as they moved on to different parts of the task, such as selecting a puzzle piece.

Looking ahead

This research suggests technology designed to work alongside a human operator could use eye tracking to follow its user’s thought process. In my team’s recent work, we designed a system that presented many different displays in parallel on a computer screen.

Our program tracked people’s eye movements to trace which information they were looking at and to guide where they should look next, using AI to generate arrows and highlights on the screen. Applying AI methods to eye tracking data can also help show whether someone is tired, or detect learning disorders such as dyslexia.
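As a rough illustration of the idea (not our actual system, which used AI rather than fixed rules), the sketch below checks whether a user’s gaze has visited an important screen region recently and flags regions that should be highlighted. The region names and the five-second threshold are hypothetical.

```python
# Sketch of a gaze-aware display: if the user has not looked at an important
# screen region recently, flag it for highlighting. A rule-based stand-in for
# the AI component; regions and thresholds are invented for the example.
import time
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    x: float
    y: float
    width: float
    height: float
    last_looked_at: float = 0.0  # timestamp of the most recent gaze hit

    def contains(self, gx: float, gy: float) -> bool:
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)

def update_attention(regions, gaze_x, gaze_y, now):
    """Record which region the current gaze sample falls inside."""
    for region in regions:
        if region.contains(gaze_x, gaze_y):
            region.last_looked_at = now

def regions_needing_highlight(regions, now, neglect_seconds=5.0):
    """Return regions the user has not looked at for too long."""
    return [r for r in regions if now - r.last_looked_at > neglect_seconds]

# Example: an "alarm panel" the user has not glanced at recently.
regions = [Region("alarm panel", 0, 0, 200, 100),
           Region("main display", 300, 0, 600, 400)]
update_attention(regions, gaze_x=500, gaze_y=200, now=time.time())
for region in regions_needing_highlight(regions, time.time()):
    print(f"highlight {region.name}")  # a real system would draw an arrow here
```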

Eye movements may also hold clues about someone’s emotional state. For example, one study found that people in a low mood spend more time looking at negative words such as “failure.” A study analyzing the results of many experiments found that people with depression avoided looking at positive stimuli (such as happy faces), while people with anxiety fixated on signs of threat.

Tracking eye movements can also help people learn by showing where someone is stuck in a task. One study of cardiologists learning to read electrocardiograms used AI to analyze their eye movements and decide whether they needed more guidance.

In the future, AI may be able to combine eye tracking with other measures, such as heart rate or changes in brain activity, to get a more accurate estimate of what someone is thinking as they solve a problem. The question is: do we want computers to know what we are thinking?


This article is republished from The Conversation under a Creative Commons license. Read the original article.
