NEI-funded study explores how thoughts influence what the eyes see

A surprising study could point to new approaches for AI systems
April 21, 2025

When you see a bag of carrots at the grocery store, does your mind go to potatoes and parsnips or buffalo wings and celery? 

It depends, of course, on whether you’re making a hearty winter stew or getting ready to watch the Super Bowl. 

Most scientists agree that categorizing an object — like thinking of a carrot as either a root vegetable or a party snack — is the job of the prefrontal cortex, the brain region responsible for reasoning and other high-level functions that make us smart and social. In that account, the eyes and visual regions of the brain work like a security camera, collecting data and processing it in a standardized way before passing it off for analysis.

However, a new study led by biomedical engineer and neuroscientist Nuttida Rungratsameetaweemana, Ph.D., an assistant professor in the Department of Biomedical Engineering at Columbia University, shows that the brain’s visual regions play an active role in making sense of information. Crucially, the way they interpret that information depends on what the rest of the brain is working on.

If it’s Super Bowl Sunday, the visual system sees those carrots on a veggie tray before the prefrontal cortex knows they exist. 

Published April 11 in Nature Communications, the study provides some of the clearest evidence yet that early sensory systems play a role in decision-making, and that they adapt in real time. It also points to new approaches for designing AI systems that can adapt to new or unexpected situations.

To read more, go to Columbia Engineering.