Posted on Sep 2, 2020

We just received the official award notice that our NIH/NEI grant R01 EY021833, "Stochastic Models of Visual Decision Making and Visual Search," has been renewed for four years at $1,583,958.

Project Summary: Support is requested to advance an innovative, productive collaboration aimed at linking mind, brain, and behavior using performance, neurophysiological, and electrophysiological measures from monkeys and humans performing visual search and visual decision-making tasks. The general goal is to derive the connections from spike trains in monkeys to behavior in humans using computational models that specify mental states mathematically, link them to brain states in particular neurons, and explain how the neural computations produce behavior.

Our Gated Accumulator Model (GAM) assumes a stochastic accumulation of evidence to threshold for alternative responses. Model assessment involves quantitatively testing alternative model architectures on their predictions of behavioral measures (response probabilities and distributions of correct and error response times) and neural measures, and of how these change with set size and target-distractor discriminability, in previously collected data from monkeys performing visual search. While our previously funded research aimed to understand the architecture of evidence accumulation in GAM and the relationship of model accumulators to the observed dynamics of movement-related neurons in FEF, our newly proposed research aims to understand computationally the nature of the evidence that drives that accumulation and its relationship to the measured dynamics of visually responsive neurons in FEF.

Aim 1 compares the quality of salience evidence in lateralized EEG signals and in neural discharges from visually responsive neurons in monkeys performing visual search, used as input evidence to a network of stochastic accumulators to predict behavior. Aim 2 addresses a major challenge to the neural accumulator framework by determining whether movement neuron dynamics in FEF actually ramp or step. Aim 3 evaluates alternative architectures for an abstract Visual Attention Model (VAM) of the evidence driving accumulation to jointly predict observed behavior and the measured dynamics of visually responsive neurons. Aim 4 extends VAM to more complex visual tasks involving filtering and selection. The result will be a broader and deeper understanding of the visual processes that select targets and control eye movements.

Computational models like VAM and GAM may be at the “just right” level of abstraction. They capture essential details of the computation in ways that explain neural activity and behavior in single participants, whether monkey or human. These models can be used to understand normal behavior as well as illness, disability, and disease; the best-fitting parameters can characterize individual differences in behavior and provide markers for brain measures. These models can also inform the study of neurological conditions that have a biophysical basis at the level of individual neurons and neural circuits, offering insight into what neurons and circuits compute and how they do it.
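For readers curious what "stochastic accumulation of evidence to threshold" looks like in practice, here is a minimal sketch, in Python, of a gated race of noisy accumulators. It is a toy illustration under assumed parameters (the gate, leak, threshold, noise, and drift values are all hypothetical), not the lab's actual GAM code, which is fit to behavioral and neural data.

```python
import numpy as np

rng = np.random.default_rng(0)

def gated_race(drifts, gate=0.2, threshold=1.0, leak=0.01,
               noise_sd=0.1, dt=0.001, max_t=2.0):
    """Race of gated stochastic accumulators: each unit integrates its
    input drift minus a gate (only input exceeding the gate passes),
    with leak and Gaussian noise; the first unit to reach threshold
    determines the response and the response time."""
    n = len(drifts)
    x = np.zeros(n)
    t = 0.0
    while t < max_t:
        # Gated, leaky, noisy Euler-Maruyama integration step.
        dx = (np.maximum(drifts - gate, 0.0) - leak * x) * dt \
             + noise_sd * np.sqrt(dt) * rng.standard_normal(n)
        x = np.maximum(x + dx, 0.0)    # activations stay non-negative
        t += dt
        if x.max() >= threshold:
            return int(x.argmax()), t  # (chosen alternative, RT in s)
    return None, max_t                 # no decision within the deadline

# Example: the target (index 0) supplies stronger salience evidence
# than a distractor, so it usually, but not always, wins the race.
choice, rt = gated_race(drifts=np.array([1.2, 0.8]))
print(choice, round(rt, 3))
```

Each simulated trial yields a choice and a response time, the same behavioral measures (response probabilities and RT distributions) against which the model architectures are tested.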
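Aim 2's ramp-versus-step question is harder than it may sound, because trial-averaged firing rates can hide single-trial dynamics. The hypothetical simulation below (toy rates and jump times, not the lab's analysis) shows why: averaging many discrete-step trials with variable step times yields a smooth ramp that closely resembles true ramping, so the question has to be settled at the single-trial level.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0.0, 0.4, 0.001)  # 400 ms of the trial, 1 ms bins

def ramp_trial(slope=25.0):
    # Firing rate climbs gradually on every single trial.
    return 10.0 + slope * t

def step_trial(rate_lo=10.0, rate_hi=20.0):
    # Firing rate jumps discretely at a random moment on each trial.
    jump = rng.uniform(0.1, 0.3)
    return np.where(t < jump, rate_lo, rate_hi)

# Trial-averaged rates: averaging many step trials with variable jump
# times smooths the discontinuity into a ramp, so the average alone
# cannot distinguish the two single-trial dynamics.
avg_ramp = np.mean([ramp_trial() for _ in range(500)], axis=0)
avg_step = np.mean([step_trial() for _ in range(500)], axis=0)
print(np.corrcoef(avg_ramp, avg_step)[0, 1])  # close to 1
```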