Statistics Professor Babak Shahbaba has been awarded a $1.7 million National Institutes of Health (NIH) grant that could have far-reaching implications for future efforts to address memory impairment. The research involves electrophysiological experiments in rats to study how a brain structure — the hippocampus — supports our ability to remember the daily events of our lives. The research should also lead to new methodologies for handling huge amounts of complex data. The five-year grant, “Scalable Bayesian Stochastic Process Models for Neural Data Analysis,” is a multidisciplinary collaboration among Shahbaba, fellow Statistics Professor Hernando Ombao, and Associate Professor of Neurobiology and Behavior Norbert Fortin.
Understanding Memory Mechanisms
Shahbaba has been working with Fortin for a couple of years now, and he finds the research fascinating because these unique experiments in rats are cleverly designed to tell us how the human brain works. “It’s different from most rodent memory studies I’ve seen, which, for practical reasons, measure spatial memory using different kinds of mazes or environments,” says Shahbaba. The problem with this approach is that human tasks often aren’t spatial. “We’re not moving around while reading a book or watching a movie,” he explains.
In Fortin’s experiments, the rats memorize a sequence of five different odors, and if the sequence is then presented out of order, the rat needs to withdraw its nose to receive a reward. Everything happens at the same location, so there’s no spatial aspect. “If we can understand the neural basis of encoding and retrieving these sequences of events,” says Shahbaba, “it has a much easier connection to what we do as human beings.” Specifically, using this nonspatial approach, the team hopes to determine whether spatial coding properties (considered fundamental to hippocampal memory function) extend to the nonspatial domain.
New Methodologies for Handling Big Data
Shahbaba and Ombao’s role in this work is twofold: define a statistical model that can explain the complex structure of the neural data and develop computational methods for fast statistical inference. Most existing statistical models can’t handle these kinds of big data problems. Shahbaba notes that “you have basically a data point each millisecond — whether a neuron was firing or not — and when you do that over time, with multiple electrodes, multiple trials, and multiple rats, the data accumulates very fast.”
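To make that scale concrete, here is a quick back-of-the-envelope sketch in Python. The counts below (rats, trials, electrodes, trial length) are illustrative assumptions for the sake of arithmetic, not figures from the study; they simply show how one binary reading per millisecond multiplies out.

```python
# Illustrative, assumed counts -- not figures from the study.
n_rats = 5
n_trials = 200          # recorded trials per rat
n_electrodes = 20       # recording channels per rat
trial_ms = 10_000       # a 10-second trial, one binary reading per millisecond

# One 0/1 value per cell: "did this channel record a spike in this millisecond?"
n_points = n_rats * n_trials * n_electrodes * trial_ms
print(f"{n_points:,} data points")  # 200,000,000 under these assumed counts
```

Even at these modest assumed counts, the recordings run to hundreds of millions of binary observations, which is why most off-the-shelf statistical models struggle.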
Shahbaba and Ombao will develop flexible multivariate Gaussian process models that explain how the neurons work together to keep track of the correct sequences of events, and they’ll develop efficient algorithms for training that model. Shahbaba says that while they’re motivated by the specific problem presented in Fortin’s work, they hope their results extend to other applications. “We’re developing methodology that should go beyond this specific application and have contributions to the field of statistics.”
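For readers curious what a Gaussian process model of spike activity can look like, the sketch below simulates a single neuron: a smooth latent function is drawn from a GP prior with a squared-exponential kernel, then squashed into a per-millisecond firing probability. This is a toy illustration under assumed parameters, not the team’s multivariate model or their inference algorithm.

```python
import numpy as np

def rbf_kernel(t, length_scale=50.0, variance=1.0):
    # Squared-exponential (RBF) covariance over time points t, in milliseconds.
    d = t[:, None] - t[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

# Time grid: one point per millisecond over a one-second window.
t = np.arange(1000.0)

# Draw a smooth latent function from the GP prior
# (the small jitter term keeps the covariance positive definite).
K = rbf_kernel(t) + 1e-6 * np.eye(len(t))
f = np.random.multivariate_normal(mean=np.zeros(len(t)), cov=K)

# Squash the latent function to a firing probability per millisecond bin,
# then simulate a binary spike train from it.
p = 1.0 / (1.0 + np.exp(-f))
spikes = np.random.binomial(1, p)
print(spikes[:50])
```

Fitting such a model to real multi-electrode recordings, rather than sampling from it as above, is where the computational bottleneck lies; that is the part Shahbaba and Ombao aim to speed up with efficient training algorithms.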
Furthermore, they are hoping that the hypotheses they generate can be tested in humans as an extension of this project. “We don’t have human data at this time, but we hope that our future work will provide insight into the cause of memory impairments.”
Moving Forward
For now, with the funding they’ll receive over the next five years, they plan to hire two research scientists — one in Fortin’s lab and the other in the Department of Statistics — so they can run more experiments and collect more data. They also plan to support a Ph.D. student who will be dedicated to the project and work on it as part of his or her Ph.D. thesis. The additional support will make it much easier for them to accomplish their objectives in a timely manner, and Shahbaba is eager to move forward. “I’m very excited. I truly believe in the quality of the work we’re doing and its potential.”
— Shani Murray