Brain’s sensory flexibility may inspire adaptive, context-aware AI models for smarter decision-making.

Dynamic categorization rules alter representations in human visual cortex
New research led by biomedical engineer and neuroscientist Dr. Nuttida Rungratsameetaweemana, an assistant professor at Columbia University, reveals that the brain’s visual system, long thought to passively record incoming information, plays an active and adaptive role in categorizing and interpreting what we see based on the task at hand.

Published in Nature Communications, the study presents some of the most compelling evidence to date that the brain's early sensory systems are not just observers but key players in real-time decision-making.
“Our findings challenge the classic model of visual processing,” says Dr. Rungratsameetaweemana. “We show that even regions of the brain closest to the eyes are highly flexible and can reshape their activity depending on what you’re trying to do.”
The research team wanted to test just how adaptable the brain’s visual cortex is when facing constantly changing tasks. Using functional magnetic resonance imaging (fMRI), they recorded the brain activity of participants who were asked to sort abstract shapes into categories. The twist? The rules for categorizing those shapes kept changing.
The scientists used machine learning tools like multivariate classifiers to detect how patterns of brain activity shifted across tasks. What they found was striking: the primary and secondary visual cortices, which process raw visual input from the eyes, reorganized their neural activity depending on how participants were instructed to categorize the same shapes.
In short, the brain wasn’t just seeing the shapes—it was strategically interpreting them to meet a specific goal.
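To make the analysis concrete: a multivariate classifier looks at the whole pattern of activity across many voxels at once and asks whether that pattern carries task information. The sketch below is a minimal illustration of that idea on synthetic data; it is not the study's actual pipeline, and every variable and number in it is an assumption.

```python
# Minimal sketch of multivariate pattern analysis (MVPA): train a linear
# classifier to decode the active categorization rule from simulated
# "voxel" activity patterns. Synthetic data; not the study's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50

# Hypothetical setup: the same shapes are shown under two different
# categorization rules (0 or 1); the rule subtly shifts the mean
# activity pattern across voxels.
rule = rng.integers(0, 2, size=n_trials)
rule_effect = rng.normal(0, 1, size=n_voxels)            # rule-dependent shift
patterns = rng.normal(0, 1, size=(n_trials, n_voxels))   # baseline activity
patterns += 0.5 * np.outer(rule, rule_effect)            # add the shift

# If the classifier decodes the rule above chance, the activity patterns
# carry rule information, analogous to the paper's finding that early
# visual cortex reorganizes with the task.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, patterns, rule, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")
```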
Not only did the brain activity change with the categorization rules, but the clarity of those neural patterns predicted how well participants performed. When people struggled to categorize shapes that sat near the “gray area” between two categories, the visual cortex stepped up, showing more distinctive activation patterns that seemed to aid the decision-making process.
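One simple way to make "clarity of neural patterns" concrete is the distance of a trial's activity pattern from a classifier's decision boundary: the larger the distance, the more distinctive the pattern. The sketch below illustrates how such a margin could relate to trial-by-trial accuracy; both the data and the margin-to-behavior link are simulated assumptions for illustration, not the study's analysis.

```python
# Sketch: use distance from a linear classifier's decision boundary as a
# proxy for how "distinct" a trial's activity pattern is, then check
# whether more distinct patterns go with more correct behavior.
# Synthetic data; illustrative only.
import numpy as np
from scipy.stats import pointbiserialr
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels = 300, 40

category = rng.integers(0, 2, size=n_trials)
signal = rng.normal(0, 1, size=n_voxels)
patterns = rng.normal(0, 1, size=(n_trials, n_voxels))
patterns += np.outer(category, signal)

clf = LinearSVC(C=1.0).fit(patterns, category)
# Unsigned distance to the decision boundary, per trial:
margin = np.abs(clf.decision_function(patterns))

# Simulated behavior: by construction, trials with clearer patterns are
# more often answered correctly.
p_correct = 1 / (1 + np.exp(-(margin - margin.mean())))
correct = rng.random(n_trials) < p_correct

r, p = pointbiserialr(correct, margin)
print(f"margin vs. accuracy correlation: r={r:.2f}, p={p:.3f}")
```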
Implications for AI and Mental Health
The findings open exciting new doors, not just in neuroscience but also in the field of artificial intelligence. Today’s AI systems often falter when faced with new or unexpected rules. By contrast, humans are remarkably adept at flexibly interpreting information in new contexts. Understanding how the brain’s sensory systems contribute to this flexibility could inspire next-generation AI models capable of adaptive learning and context-aware decision-making.
The study also has implications for understanding neurocognitive disorders that affect flexible cognition.
Dr. Rungratsameetaweemana and her team aren’t stopping here. They’re now diving deeper—beyond fMRI—to explore what’s happening at the level of individual neurons and neural circuits. New experiments will use direct neural recordings to understand how individual cells in the visual cortex contribute to decision-making.
At the same time, they’re partnering with AI researchers to explore how insights from flexible visual coding can improve machine learning systems.
“Humans adapt quickly when the rules change,” says Rungratsameetaweemana. “That’s what we want to replicate in AI—systems that don’t just process data but truly understand context.”
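What a "context-aware" system could look like is an open design question. One minimal illustration, assumed here rather than taken from the paper, is a model whose early features are gated by a task-rule vector, so the same input is represented differently under different rules, loosely echoing what the study reports in visual cortex.

```python
# Sketch of a context-gated model: the same input features are modulated
# by a task-rule vector before a readout, loosely analogous to early
# sensory activity reshaping with the task. Illustrative only; not the
# paper's proposal.
import numpy as np

def context_gated_readout(x, rule_vec, w_gate, w_out):
    """Scale input features by a rule-dependent gate, then read out."""
    gate = 1 / (1 + np.exp(-(w_gate @ rule_vec)))  # per-feature gain in (0, 1)
    return w_out @ (gate * x)                      # gated linear readout

rng = np.random.default_rng(2)
n_features = 8
x = rng.normal(size=n_features)              # shared sensory input
rule_a = np.array([1.0, 0.0])                # one-hot task contexts
rule_b = np.array([0.0, 1.0])
w_gate = rng.normal(size=(n_features, 2))    # maps rule -> feature gains
w_out = rng.normal(size=n_features)

# The same stimulus yields different outputs under different rules.
print("rule A:", context_gated_readout(x, rule_a, w_gate, w_out))
print("rule B:", context_gated_readout(x, rule_b, w_gate, w_out))
```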
Rewriting the Brain’s Blueprint
The study redefines how we understand perception and decision-making, revealing the visual system as a dynamic, intelligent participant in cognition rather than a passive recorder. So next time you look at a bag of carrots, remember: your eyes might already be helping you decide what comes next, before your “thinking brain” even gets involved.
Reference:
- Dynamic categorization rules alter representations in human visual cortex, Nature Communications (https://www.nature.com/articles/s41467-025-58707-4)
Source: Medindia