Browsing by Subject "Space Perception"
Now showing 1 - 13 of 13
Item Open Access
Activation in mesolimbic and visuospatial neural circuits elicited by smoking cues: evidence from functional magnetic resonance imaging. (Am J Psychiatry, 2002-06) Due, Deborah L; Huettel, Scott A; Hall, Warren G; Rubin, David C
OBJECTIVE: The authors sought to increase understanding of the brain mechanisms involved in cigarette addiction by identifying neural substrates modulated by visual smoking cues in nicotine-deprived smokers. METHOD: Event-related functional magnetic resonance imaging (fMRI) was used to detect brain activation after exposure to smoking-related images in a group of nicotine-deprived smokers and a nonsmoking comparison group. Subjects viewed a pseudo-random sequence of smoking images, neutral nonsmoking images, and rare targets (photographs of animals). Subjects pressed a button whenever a rare target appeared. RESULTS: In smokers, the fMRI signal was greater after exposure to smoking-related images than after exposure to neutral images in mesolimbic dopamine reward circuits known to be activated by addictive drugs (right posterior amygdala, posterior hippocampus, ventral tegmental area, and medial thalamus) as well as in areas related to visuospatial attention (bilateral prefrontal and parietal cortex and right fusiform gyrus). In nonsmokers, no significant differences in fMRI signal following exposure to smoking-related and neutral images were detected. In most regions studied, both subject groups showed greater activation following presentation of rare target images than after exposure to neutral images. CONCLUSIONS: In nicotine-deprived smokers, both reward and attention circuits were activated by exposure to smoking-related images. Smoking cues are processed like rare targets in that they activate attentional regions.
These cues are also processed like addictive drugs in that they activate mesolimbic reward regions.

Item Open Access
Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus. (Journal of Neurophysiology, 2012-07) Lee, Jungah; Groh, Jennifer M
Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after the sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after the sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered, <1% were head centered, and only 13% exhibited a hybrid of both reference frames. For auditory-evoked activity observed within 20 ms of the saccade (N = 154), the proportion of eye-centered response patterns increased to 69%, whereas the hybrid and head-centered response patterns dropped to 30% and <1%, respectively. This pattern approached, although did not quite reach, that observed for saccade-related activity for visual targets: 89% were eye centered, 11% were hybrid, and <1% were head centered (N = 162).
The plainly eye-centered visual response patterns and predominantly eye-centered auditory motor response patterns lie in marked contrast to our previous study of the intraparietal cortex, where both visual and auditory sensory and motor-related activity used a predominantly hybrid reference frame (Mullette-Gillman et al. 2005, 2009). Our present findings indicate that auditory signals are ultimately translated into a reference frame roughly similar to that used for vision, but suggest that such signals might emerge only in motor areas responsible for directing gaze to visual and auditory stimuli.

Item Open Access
Chimpanzees and bonobos exhibit divergent spatial memory development. (Dev Sci, 2012-11) Rosati, Alexandra G; Hare, Brian
Spatial cognition and memory are critical cognitive skills underlying foraging behaviors for all primates. While the emergence of these skills has been the focus of much research on human children, little is known about ontogenetic patterns shaping spatial cognition in other species. Comparative developmental studies of nonhuman apes can illuminate which aspects of human spatial development are shared with other primates, versus which aspects are unique to our lineage. Here we present three studies examining spatial memory development in our closest living relatives, chimpanzees (Pan troglodytes) and bonobos (P. paniscus). We first compared memory in a naturalistic foraging task where apes had to recall the location of resources hidden in a large outdoor enclosure with a variety of landmarks (Studies 1 and 2). We then compared older apes using a matched memory choice paradigm (Study 3). We found that chimpanzees exhibited more accurate spatial memory than bonobos across contexts, supporting predictions from these species' different feeding ecologies.
Furthermore, chimpanzees - but not bonobos - showed developmental improvements in spatial memory, indicating that bonobos exhibit cognitive paedomorphism (delays in developmental timing) in their spatial abilities relative to chimpanzees. Together, these results indicate that the development of spatial memory may differ even between closely related species. Moreover, changes in the spatial domain can emerge during nonhuman ape ontogeny, much like some changes seen in human children.

Item Open Access
Cortical Brain Activity Reflecting Attentional Biasing Toward Reward-Predicting Cues Covaries with Economic Decision-Making Performance. (Cereb Cortex, 2016-01) San Martín, René; Appelbaum, Lawrence G; Huettel, Scott A; Woldorff, Marty G
Adaptive choice behavior depends critically on identifying and learning from outcome-predicting cues. We hypothesized that attention may be preferentially directed toward certain outcome-predicting cues. We studied this possibility by analyzing event-related potential (ERP) responses in humans during a probabilistic decision-making task. Participants viewed pairs of outcome-predicting visual cues and then chose to wager either a small (i.e., loss-minimizing) or large (i.e., gain-maximizing) amount of money. The cues were bilaterally presented, which allowed us to extract the relative neural responses to each cue by using a contralateral-versus-ipsilateral ERP contrast. We found an early lateralized ERP response, whose features matched the attention-shift-related N2pc component and whose amplitude scaled with the learned reward-predicting value of the cues as predicted by an attention-for-reward model. Consistently, we found a double dissociation involving the N2pc. Across participants, gain-maximization positively correlated with the N2pc amplitude to the most reliable gain-predicting cue, suggesting an attentional bias toward such cues.
Conversely, loss-minimization was negatively correlated with the N2pc amplitude to the most reliable loss-predicting cue, suggesting attentional avoidance of such stimuli. These results indicate that learned stimulus-reward associations can influence rapid attention allocation, and that differences in this process are associated with individual differences in economic decision-making performance.

Item Open Access
Efficient coding of spatial information in the primate retina. (The Journal of Neuroscience, 2012-11) Doi, Eizaburo; Gauthier, Jeffrey L; Field, Greg D; Shlens, Jonathon; Sher, Alexander; Greschner, Martin; Machado, Timothy A; Jepson, Lauren H; Mathieson, Keith; Gunning, Deborah E; Litke, Alan M; Paninski, Liam; Chichilnisky, EJ; Simoncelli, Eero P
Sensory neurons have been hypothesized to efficiently encode signals from the natural environment subject to resource constraints. The predictions of this efficient coding hypothesis regarding the spatial filtering properties of the visual system have been found consistent with human perception, but they have not been compared directly with neural responses. Here, we analyze the information that retinal ganglion cells transmit to the brain about the spatial information in natural images subject to three resource constraints: the number of retinal ganglion cells, their total response variances, and their total synaptic strengths. We derive a model that optimizes the transmitted information and compare it directly with measurements of complete functional connectivity between cone photoreceptors and the four major types of ganglion cells in the primate retina, obtained at single-cell resolution. We find that the ganglion cell population exhibited 80% efficiency in transmitting spatial information relative to the model. Both the retina and the model exhibited high redundancy (~30%) among ganglion cells of the same cell type.
A novel and unique prediction of efficient coding, the relationships between projection patterns of individual cones to all ganglion cells, was consistent with the observed projection patterns in the retina. These results indicate a high level of efficiency with near-optimal redundancy in visual signaling by the retina.

Item Open Access
Functional parcellation of attentional control regions of the brain. (J Cogn Neurosci, 2004-01) Woldorff, Marty G; Hazlett, Chad J; Fichtenholtz, Harlan M; Weissman, Daniel H; Dale, Anders M; Song, Allen W
Recently, a number of investigators have examined the neural loci of psychological processes enabling the control of visual spatial attention using cued-attention paradigms in combination with event-related functional magnetic resonance imaging. Findings from these studies have provided strong evidence for the involvement of a fronto-parietal network in attentional control. In the present study, we build upon this previous work to further investigate these attentional control systems. In particular, we employed additional controls for nonattentional sensory and interpretative aspects of cue processing to determine whether distinct regions in the fronto-parietal network are involved in different aspects of cue processing, such as cue-symbol interpretation and attentional orienting. In addition, we used shorter cue-target intervals that were closer to those used in the behavioral and event-related potential cueing literatures. Twenty participants performed a cued spatial attention task while brain activity was recorded with functional magnetic resonance imaging. We found functional specialization for different aspects of cue processing in the lateral and medial subregions of the frontal and parietal cortex. In particular, the medial subregions were more specific to the orienting of visual spatial attention, while the lateral subregions were associated with more general aspects of cue processing, such as cue-symbol interpretation.
Additional cue-related effects included differential activations in midline frontal regions and pretarget enhancements in the thalamus and early visual cortical areas.

Item Open Access
Graded Neuronal Modulations Related to Visual Spatial Attention. (The Journal of Neuroscience, 2016-05) Mayo, J Patrick; Maunsell, John HR
Studies of visual attention in monkeys typically measure neuronal activity when the stimulus event to be detected occurs at a cued location versus when it occurs at an uncued location. But this approach does not address how neuronal activity changes relative to conditions where attention is unconstrained by cueing. Human psychophysical studies have used neutral cueing conditions and found that neutrally cued behavioral performance is generally intermediate to that of cued and uncued conditions (Posner et al., 1978; Mangun and Hillyard, 1990; Montagna et al., 2009). To determine whether the neuronal correlates of visual attention during neutral cueing are similarly intermediate, we trained macaque monkeys to detect changes in stimulus orientation that were more likely to occur at one location (cued) than another (uncued), or were equally likely to occur at either stimulus location (neutral). Consistent with human studies, performance was best when the location was cued, intermediate when both locations were neutrally cued, and worst when the location was uncued. Neuronal modulations in visual area V4 were also graded as a function of cue validity and behavioral performance. By recording from both hemispheres simultaneously, we investigated the possibility of switching attention between stimulus locations during neutral cueing. The results failed to support a unitary "spotlight" of attention. Overall, our findings indicate that attention-related changes in V4 are graded to accommodate task demands.
SIGNIFICANCE STATEMENT: Studies of the neuronal correlates of attention in monkeys typically use visual cues to manipulate where attention is focused ("cued" vs "uncued"). Human psychophysical studies often also include neutrally cued trials to study how attention naturally varies between points of interest. But the neuronal correlates of this neutral condition are unclear. We measured behavioral performance and neuronal activity in cued, uncued, and neutrally cued blocks of trials. Behavioral performance and neuronal responses during neutral cueing were intermediate to those of the cued and uncued conditions. We found no signatures of a single mechanism of attention that switches between stimulus locations. Thus, attention-related changes in neuronal activity are largely hemisphere-specific and graded according to task demands.

Item Open Access
Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization. (PLoS One, 2013-01) Pages, Daniel S; Groh, Jennifer M
A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a 'guess and check' heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry.
We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6 degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.

Item Open Access
Sensory information and associative cues used in food detection by wild vervet monkeys. (Anim Cogn, 2014-05) Teichroeb, Julie A; Chapman, Colin A
Understanding animals' spatial perception is a critical step toward discerning their cognitive processes. The spatial sense is multimodal and based on both the external world and mental representations of that world. Navigation in each species depends upon its evolutionary history, physiology, and ecological niche. We carried out foraging experiments on wild vervet monkeys (Chlorocebus pygerythrus) at Lake Nabugabo, Uganda, to determine the types of cues used to detect food and whether associative cues could be used to find hidden food. Our first and second set of experiments differentiated between vervets' use of global spatial cues (including the arrangement of feeding platforms within the surrounding vegetation) and/or local layout cues (the position of platforms relative to one another), relative to the use of goal-object cues on each platform.
Our third experiment provided an associative cue to the presence of food with global spatial, local layout, and goal-object cues disguised. Vervets located food above chance levels when goal-object cues and associative cues were present, and visual signals were the predominant goal-object cues that they attended to. With similar sample sizes and methods as previous studies on New World monkeys, vervets were not able to locate food using only global spatial cues and local layout cues, unlike all five species of platyrrhines thus far tested. Relative to these platyrrhines, the spatial location of food may need to stay the same for a longer time period before vervets encode this information, and goal-object cues may be more salient for them in small-scale space.

Item Open Access
Spatial and temporal scales of neuronal correlation in visual area V4. (J Neurosci, 2013-03-20) Smith, Matthew A; Sommer, Marc A
The spiking activity of nearby cortical neurons is correlated on both short and long time scales. Understanding this shared variability in firing patterns is critical for appreciating the representation of sensory stimuli in ensembles of neurons, the coincident influences of neurons on common targets, and the functional implications of microcircuitry. Our knowledge about neuronal correlations, however, derives largely from experiments that used different recording methods, analysis techniques, and cortical regions. Here we studied the structure of neuronal correlation in area V4 of alert macaques using recording and analysis procedures designed to match those used previously in primary visual cortex (V1), the major input to V4. We found that the spatial and temporal properties of correlations in V4 were remarkably similar to those of V1, with two notable differences: correlated variability in V4 was approximately one-third the magnitude of that in V1 and synchrony in V4 was less temporally precise than in V1.
In both areas, spontaneous activity (measured during fixation while viewing a blank screen) was approximately twice as correlated as visual-evoked activity. The results provide a foundation for understanding how the structure of neuronal correlation differs among brain regions and stages in cortical processing and suggest that it is likely governed by features of neuronal circuits that are shared across the visual cortex.

Item Open Access
Spatial imagery preserves temporal order. (Memory, 1996-09) Watson, ME; Rubin, DC
Line drawings were presented in either a spatial or a nonspatial format. Subjects recalled each of four sets of 24 items in serial order. Amount recalled in the correct serial order and sequencing errors were scored. In Experiment 1 items appeared either in consecutive locations of a matrix or in one central location. Subjects who saw the items in different locations made fewer sequencing errors than those who saw each item in a central location, but serial recall levels for these two conditions did not differ. When items appeared in nonconsecutive locations in Experiment 2, the advantage of the spatial presentation on sequencing errors disappeared. Experiment 3 included conditions in which both the consecutive and nonconsecutive spatial formats were paired with retrieval cues that either did or did not indicate the sequence of locations in which the items had appeared. Spatial imagery aided sequencing when, and only when, the order of locations in which the stimuli appeared could be reconstructed at retrieval.

Item Open Access
The neural dynamics of stimulus and response conflict processing as a function of response complexity and task demands. (Neuropsychologia, 2016-04) Donohue, Sarah E; Appelbaum, Lawrence G; McKay, Cameron C; Woldorff, Marty G
Both stimulus and response conflict can disrupt behavior by slowing response times and decreasing accuracy.
Although several neural activations have been associated with conflict processing, it is unclear how specific any of these are to the type of stimulus conflict or the amount of response conflict. Here, we recorded electrical brain activity while manipulating the type of stimulus conflict in the task (spatial [Flanker] versus semantic [Stroop]) and the amount of response conflict (two versus four response choices). Behaviorally, responses were slower to incongruent versus congruent stimuli across all task and response types, along with overall slowing for higher response-mapping complexity. The earliest incongruency-related neural effect was a short-duration frontally-distributed negativity at ~200 ms that was only present in the Flanker spatial-conflict task. At longer latencies, the classic fronto-central incongruency-related negativity 'N(inc)' was observed for all conditions, but was larger and ~100 ms longer in duration with more response options. Further, the onset of the motor-related lateralized readiness potential (LRP) was earlier for the two vs. four response sets, indicating that smaller response sets enabled faster motor-response preparation. The late positive complex (LPC) was present in all conditions except the two-response Stroop task, suggesting this late conflict-related activity is not specifically related to task type or response-mapping complexity. Importantly, across tasks and conditions, the LRP onset occurred at or before the conflict-related N(inc), indicating that motor preparation is a rapid, automatic process that interacts with the conflict-detection processes after it has begun.
Together, these data highlight how different conflict-related processes operate in parallel and depend on both the cognitive demands of the task and the number of response options.

Item Open Access
The short and long of it: neural correlates of temporal-order memory for autobiographical events. (J Cogn Neurosci, 2008-07) St Jacques, Peggy; Rubin, David C; LaBar, Kevin S; Cabeza, Roberto
Previous functional neuroimaging studies of temporal-order memory have investigated memory for laboratory stimuli that are causally unrelated and poor in sensory detail. In contrast, the present functional magnetic resonance imaging (fMRI) study investigated temporal-order memory for autobiographical events that were causally interconnected and rich in sensory detail. Participants took photographs at many campus locations over a period of several hours, and the following day they were scanned while making temporal-order judgments to pairs of photographs from different locations. By manipulating the temporal lag between the two locations in each trial, we compared the neural correlates associated with reconstruction processes, which we hypothesized depended on recollection and contribute mainly to short lags, and distance processes, which we hypothesized to depend on familiarity and contribute mainly to longer lags. Consistent with our hypotheses, parametric fMRI analyses linked shorter lags to activations in regions previously associated with recollection (left prefrontal, parahippocampal, precuneus, and visual cortices), and longer lags with regions previously associated with familiarity (right prefrontal cortex). The hemispheric asymmetry in prefrontal cortex activity fits very well with evidence and theories regarding the contributions of the left versus right prefrontal cortex to memory (recollection vs. familiarity processes) and cognition (systematic vs. heuristic processes).
In sum, using a novel photo-paradigm, this study provided the first evidence regarding the neural correlates of temporal-order memory for autobiographical events.