Neural Network Evidence for the Coupling of Presaccadic Visual Remapping to Predictive Eye Position Updating.
Abstract
As we look around a scene, we perceive it as continuous and stable even though each
saccadic eye movement changes the visual input to the retinas. How the brain achieves
this perceptual stabilization is unknown, but a major hypothesis is that it relies
on presaccadic remapping, a process in which neurons shift their visual sensitivity
to a new location in the scene just before each saccade. This hypothesis is difficult
to test in vivo because complete, selective inactivation of remapping is currently
intractable. We tested it in silico with a hierarchical, sheet-based neural network
model of the visual and oculomotor system. The model generated saccadic commands to
move a video camera abruptly. Visual input from the camera and internal copies of
the saccadic movement commands, or corollary discharge, converged at a map-level simulation
of the frontal eye field (FEF), a primate brain area known to receive such inputs.
FEF output was combined with eye position signals to yield a suitable coordinate frame
for guiding arm movements of a robot. Our operational definition of perceptual stability
was "useful stability," quantified as continuously accurate pointing to a visual object
despite camera saccades. During training, the emergence of useful stability was correlated
tightly with the emergence of presaccadic remapping in the FEF. Remapping depended
on corollary discharge but its timing was synchronized to the updating of eye position.
When coupled to predictive eye position signals, remapping served to stabilize the
target representation for continuously accurate pointing. Graded inactivations of
pathways in the model replicated, and helped to interpret, previous in vivo experiments.
The results support the hypothesis that visual stability requires presaccadic remapping,
provide explanations for the function and timing of remapping, and offer testable
hypotheses for in vivo studies. We conclude that remapping allows for seamless coordinate
frame transformations and quick actions despite visual afferent lags. With visual
remapping in place for behavior, it may be exploited for perceptual continuity.
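The abstract's core claim — that presaccadic remapping, when paired with a predictively updated eye position signal, keeps a target's action-relevant location constant across a saccade — can be illustrated with simple vector bookkeeping. The sketch below is not the paper's network model; all variable names and the toy coordinate scheme are illustrative assumptions.

```python
import numpy as np

def head_centered(target_retinal, eye_position):
    # A retinotopic location plus the current eye position gives a
    # head-centered (eye-position-invariant) location suitable for
    # guiding a pointing movement.
    return target_retinal + eye_position

eye = np.array([0.0, 0.0])          # gaze direction before the saccade (deg)
target_ret = np.array([10.0, 5.0])  # target location on the retina (deg)
saccade = np.array([8.0, -3.0])     # corollary discharge of the saccade command

before = head_centered(target_ret, eye)

# Presaccadic remapping: shift the visual representation by the saccade
# vector (known from corollary discharge) before the eye actually moves...
remapped_ret = target_ret - saccade
# ...and combine it with a predictively updated eye position signal.
predicted_eye = eye + saccade

after = head_centered(remapped_ret, predicted_eye)

# The head-centered estimate is unchanged across the saccade, so pointing
# stays continuously accurate -- the paper's "useful stability".
assert np.allclose(before, after)
print(before, after)
```

The point of the arithmetic is the coupling the paper emphasizes: remapping alone (shifting the retinal representation) or eye-position updating alone would each produce a transiently wrong head-centered estimate; only when the two shifts are synchronized do they cancel.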
Type
Journal article
Permalink
https://hdl.handle.net/10161/12072
Published Version (Please cite this version)
10.3389/fncom.2016.00052
Publication Info
Rao, HM; San Juan, J; Shen, FY; Villa, JE; Rafie, KS; & Sommer, MA (2016). Neural Network Evidence for the Coupling of Presaccadic Visual Remapping to Predictive Eye Position Updating. Front Comput Neurosci, 10, 52. doi:10.3389/fncom.2016.00052. Retrieved from https://hdl.handle.net/10161/12072.
To cite this article, please review and use the official citation provided by the journal.
Scholars@Duke
Marc A. Sommer
Professor of Biomedical Engineering
We study circuits for cognition. Using a combination of neurophysiology and biomedical
engineering, we focus on the interaction between brain areas during visual perception,
decision-making, and motor planning. Specific projects include the role of frontal
cortex in metacognition, the role of cerebellar-frontal circuits in action timing,
the neural basis of "good enough" decision-making (satisficing), and the neural mechanisms
of transcranial magnetic stimulation (TMS).
