Browsing by Subject "Head Movements"
Now showing 1 - 2 of 2
Item (Open Access): Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus. (Journal of Neurophysiology, 2012-07) Lee, Jungah; Groh, Jennifer M

Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after the sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after the sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered, <1% were head centered, and only 13% exhibited a hybrid of both reference frames. For auditory-evoked activity observed within 20 ms of the saccade (N = 154), the proportion of eye-centered response patterns increased to 69%, whereas the hybrid and head-centered response patterns dropped to 30% and <1%, respectively. This pattern approached, although did not quite reach, that observed for saccade-related activity for visual targets: 89% were eye centered, 11% were hybrid, and <1% were head centered (N = 162). The plainly eye-centered visual response patterns and predominantly eye-centered auditory motor response patterns lie in marked contrast to our previous study of the intraparietal cortex, where both visual and auditory sensory and motor-related activity used a predominantly hybrid reference frame (Mullette-Gillman et al. 2005, 2009). Our present findings indicate that auditory signals are ultimately translated into a reference frame roughly similar to that used for vision, but suggest that such signals might emerge only in motor areas responsible for directing gaze to visual and auditory stimuli.
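To make the head-centered versus eye-centered distinction concrete, the following is a minimal sketch (not the authors' analysis code) of the one-dimensional relationship the abstract relies on: a sound's head-centered azimuth is unaffected by where the eyes point, while its eye-centered azimuth shifts with the eye-in-head position. The function name and the 1-D simplification are illustrative assumptions.

```python
# Minimal 1-D illustration (an assumed simplification, not the paper's method):
# a sound's head-centered azimuth stays fixed as the eyes move, while its
# eye-centered azimuth shifts by the eye-in-head position. A neuron whose
# response depends only on (target - eye position) is "eye centered"; one
# that depends only on the target's head-centered location is "head
# centered"; mixtures of the two are "hybrid".

def eye_centered_azimuth(target_head_deg: float, eye_in_head_deg: float) -> float:
    """Convert a head-centered target azimuth to eye-centered coordinates."""
    return target_head_deg - eye_in_head_deg

# Example: a sound 12 deg right of the head, while the eyes look 8 deg right,
# lies 4 deg right of the line of sight.
if __name__ == "__main__":
    print(eye_centered_azimuth(12.0, 8.0))  # 4.0
```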
Item (Open Access): Locomotor head movements and semicircular canal morphology in primates. (Proc Natl Acad Sci U S A, 2012-10-30) Malinzak, Michael D; Kay, Richard F; Hullar, Timothy E

Animal locomotion causes head rotations, which are detected by the semicircular canals of the inner ear. Morphologic features of the canals influence rotational sensitivity, and so it is hypothesized that locomotion and canal morphology are functionally related. Most prior research has compared subjective assessments of animal "agility" with a single determinant of rotational sensitivity: the mean canal radius of curvature (R). In fact, the paired variables of R and body mass are correlated with agility and have been used to infer locomotion in extinct species. To refine models of canal functional morphology and to improve locomotor inferences for extinct species, we compare 3D vector measurements of head rotation during locomotion with 3D vector measures of canal sensitivity. Contrary to the predictions of conventional models that are based upon R, we find that axes of rapid head rotation are not aligned with axes of either high or low sensitivity. Instead, animals with fast head rotations have similar sensitivities in all directions, which they achieve by orienting the three canals of each ear orthogonally (i.e., along planes at 90° angles to one another). The extent to which the canal configuration approaches orthogonality is correlated with rotational head speed independent of body mass and phylogeny, whereas R is not.
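As a rough illustration of the orthogonality idea in the abstract above, here is a minimal sketch assuming three unit normal vectors for the canal planes of one ear and a simple score defined as the mean absolute deviation of the pairwise inter-plane angles from 90°. The function, variable names, and example vectors are illustrative assumptions, not the published 3D vector metric from the paper.

```python
import numpy as np

def deviation_from_orthogonality(normals: np.ndarray) -> float:
    """Mean absolute deviation (degrees) of pairwise canal-plane angles from 90 deg.

    `normals` is a 3x3 array whose rows are unit normal vectors of the
    anterior, posterior, and lateral canal planes of one ear.
    A perfectly orthogonal configuration scores 0.
    """
    deviations = []
    for i in range(3):
        for j in range(i + 1, 3):
            cos_angle = np.clip(np.dot(normals[i], normals[j]), -1.0, 1.0)
            angle = np.degrees(np.arccos(abs(cos_angle)))  # fold into [0, 90]
            deviations.append(90.0 - angle)
    return float(np.mean(deviations))

# Hypothetical example: nearly orthogonal canal planes score close to 0.
canals = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.1, 0.1, 0.99],
])
canals /= np.linalg.norm(canals, axis=1, keepdims=True)
print(deviation_from_orthogonality(canals))
```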