
The Auditory System and Human Sound-Localization Behavior

  • Book

  • April 2016
  • Elsevier Science and Technology
  • ID: 3627050

The Auditory System and Human Sound-Localization Behavior provides a comprehensive account of the full action-perception cycle underlying spatial hearing. It highlights distinctive properties of the auditory system, such as its organization in azimuth and elevation coordinates. Readers will appreciate that sound localization is inherently a neuro-computational process: it must operate on implicit and mutually independent acoustic cues. The localization problem, inferring which sound location gave rise to a particular acoustic input, cannot be solved uniquely, and therefore requires clever strategies to cope with everyday situations. The reader is guided through the full interdisciplinary repertoire of the natural sciences: not only neurobiology, but also physics and mathematics, as well as current theories on sensorimotor integration (e.g. Bayesian approaches to dealing with uncertain information) and neural encoding.
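To give a flavor of the Bayesian approach mentioned above, the following minimal sketch (an illustration, not taken from the book) shows how two uncertain Gaussian estimates of a sound's azimuth, such as a noisy acoustic cue and a prior expectation, can be fused by precision weighting into a maximum-a-posteriori estimate:

```python
# Hedged illustration: Bayesian fusion of two independent Gaussian estimates
# of sound azimuth (a noisy acoustic cue and a prior expectation).

def fuse_gaussian_cues(mu_cue, var_cue, mu_prior, var_prior):
    """Return the MAP mean and variance of the fused Gaussian estimate."""
    w_cue = (1.0 / var_cue) / (1.0 / var_cue + 1.0 / var_prior)  # precision weight
    mu_map = w_cue * mu_cue + (1.0 - w_cue) * mu_prior           # weighted mean
    var_map = 1.0 / (1.0 / var_cue + 1.0 / var_prior)            # reduced uncertainty
    return mu_map, var_map

# Example: an acoustic cue at 20 deg azimuth (variance 25 deg^2) combined with
# a straight-ahead prior (0 deg, variance 100 deg^2) yields an estimate pulled
# toward the prior, with lower variance than either source alone.
print(fuse_gaussian_cues(20.0, 25.0, 0.0, 100.0))  # -> (16.0, 20.0)
```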

Please note: this is an On Demand product; delivery may take up to 11 working days after payment has been received.

Table of Contents

1. Introduction
2. The nature of sound
3. Linear systems analysis
4. Nonlinear systems analysis
5. The cochlea
6. The auditory nerve
7. Cues for human sound localization
8. Assessing auditory spatial performance
9. The gaze orienting system
10. The midbrain colliculus
11. Coordinate transformations in the brain
12. Sound localization behavior and plasticity
13. Audiovisual integration
14. The Auditory System and Human Sound-Localization Behavior

Authors

John van Opstal, Professor of Neuroscience & Biophysics, Donders Institute for Brain, Cognition and Behavior, Radboud University, Nijmegen, Netherlands. Dr. Van Opstal is a professor of Biophysics who studies sound-localization behaviour in human and non-human primates, as well as in patients. He regards sound localization as an action-perception problem and probes the system with fast, saccadic eye-head gaze-control paradigms to study the earliest correlates of the underlying neurocomputational mechanisms.