Research Snapshot: Developing A Computational Model for Understanding Brain Activity
There are theories for everything. Newton's laws are a pillar of physics. Supply and demand is a pillar of economics. But what about neuroscience?
Right now, there aren’t any comprehensive laws in neuroscience. Researchers don’t have a standard to fall back on when interpreting activity in the brain. That’s due, in part, to the complexity of the brain and of human cognition, both of which are incredibly difficult to study. With his colleague Jason Yeatman, Ph.D., assistant professor at the University of Washington, Kay has set out to change that.
They’re creating a mathematical model of the visual cortex, the part of the brain that processes visual stimuli. The visual cortex connects the dots and gives meaning to what the eyes see.
Kay and Yeatman used functional magnetic resonance imaging (fMRI) to measure neural responses in the brain when study participants viewed different images and acted in response to those images. They were particularly interested in a part of the visual cortex involved with functions like reading and face recognition. This area requires both visual stimuli and internal cognition working together.
“Using fMRI experiments, we have developed the first comprehensive model of how this high-level area of the visual cortex works and how humans process visual input,” said Kay.
The data and proposed model were recently published in eLife.
The model has already shown promise in interpreting how the brain represents stimuli, how it makes decisions about those stimuli, and how neural responses change when a subject is asked to make such a decision.
“It’s really quite novel in its scope,” Kay said. “In our field, it’s rare for people to take this modeling approach and we hope others will join.”
Future research could use this model as a standard for analyzing data in neuroscience. It could also inform the development of models of other areas of the brain dealing with emotion, language, motor function, other senses and more.
Down the road, it could also help physicians diagnose visual deficits.
“Right now, we can identify that there is a problem: this child has a perceptual disorder. Let’s say they have trouble recognizing faces. With the right brain measurements, our model might allow someone to isolate which specific processes within the brain appear to be functioning abnormally,” Kay said.
That could lead to insight into how certain brain conditions work, and help researchers develop new treatments. Kay and Yeatman plan to continue collaborating on this research direction. They hope to build on the model to understand how children’s brains change as they learn to read, and why some children struggle to learn this skill.
The model is still in the very early stages of development, but Kay and his colleagues are optimistic about its potential. They hope it can be used by neuroscience researchers around the world – not just at the University – to better understand the brain.
“This project is a continual endeavor. We will keep developing better and better models and as we do, we’ll get better and better understanding of the human brain,” Kay said.