Neuronal Circuits for Auditory Perception and Learning
The long-term goal of our research is to identify the neuronal circuits and neuronal codes that support hearing and auditory memory and learning in complex acoustic environments. Auditory perception is shaped by the interaction of sensory inputs with our experiences, emotions, and cognitive states. Decades of research have characterized how neuronal response properties to basic sounds, such as tones or whistles, are transformed in the auditory pathway of passively listening subjects. Much less well-understood is how the brain creates a perceptual representation of a complex auditory scene, i.e., one that is composed of a myriad of sounds, and how this representation is shaped by learning and experience. Over the last nine years, our laboratory has made transformative progress in the quantitative understanding of neuronal circuits supporting dynamic auditory perception, through a combination of behavioral, electrophysiological, optogenetic and computational approaches.
Cracking cortical and sub-cortical circuits for hearing
The fundamental quest of sensory neuroscience is to link a specific function in sensory processing to a specific neuronal circuit. One of the most striking aspects of neuronal morphology in the sensory cortex is the astonishing diversity of inhibitory interneurons. This diversity is thought to underlie the brain’s ability to process and respond to complex and varied everyday sensory environments. By exhibiting differential patterns in their connectivity to excitatory and other types of inhibitory cells within and across cortical layers, interneurons contribute to the formation of a variety of microcircuits supporting potentially complex sensory processing functions.
Do inhibitory neurons play a role in auditory perception and sound processing? First, we asked whether inhibitory neurons contribute causally to auditory perception. We optogenetically suppressed or activated inhibitory neurons while the mouse performed a frequency discrimination task (Aizenberg, 2015). We found that, as predicted, inhibitory neurons bi-directionally controlled frequency discrimination: increasing their activity improved frequency discrimination (think of this as instantly improving one’s sense of pitch), whereas decreasing their activity impaired it. Although the effects differed across animals, with some mice showing more improvement than others, we showed that this variability was explained by the specific parameters of optogenetic activation (Briguglio, 2018), further confirming that inhibitory neurons play a causal role in auditory perception. Interestingly, our earlier work on learning found that auditory learning can alter frequency discrimination bi-directionally. As such, we are now testing the hypothesis that inhibitory neurons play a key role in learning.
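The bi-directional effect described above can be pictured with a toy psychometric model: shifting the discrimination threshold down mimics improved discrimination under interneuron activation, and shifting it up mimics impairment under suppression. This is a hypothetical sketch; the logistic form and all parameter values are illustrative assumptions, not the fitting procedure used in the cited studies.

```python
import numpy as np

def psychometric(delta_f, threshold, slope=2.0, lapse=0.02):
    """Probability of reporting a frequency change of size delta_f (octaves).
    Logistic psychometric function; all parameters are illustrative."""
    p = 1.0 / (1.0 + np.exp(-slope * (delta_f - threshold) / threshold))
    return lapse + (1 - 2 * lapse) * p

delta = np.linspace(0.01, 1.0, 200)  # tested frequency differences (octaves)

# Hypothetical conditions: a lower threshold means finer discrimination
baseline = psychometric(delta, threshold=0.30)    # no manipulation
activated = psychometric(delta, threshold=0.20)   # interneurons activated
suppressed = psychometric(delta, threshold=0.45)  # interneurons suppressed
```

At any fixed frequency difference, the "activated" curve sits above baseline and the "suppressed" curve below it, which is one simple way to summarize a bi-directional change in perceptual acuity.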
What is the function of the many different inhibitory neuronal types? We focused on testing the difference between somatostatin-positive interneurons (SSTs), which target the distal dendrites of excitatory neurons, and parvalbumin-positive interneurons (PVs), which target the cell bodies of excitatory neurons. These neurons exhibit not just morphological, but also many physiological differences. We found that the SSTs, but not the PVs, contributed to auditory adaptation to simple (Natan, 2017) and complex (Natan, 2015) sound sequences. In other words, the SSTs, but not the PVs, selectively controlled the sensitivity of neurons to expected and unexpected sounds. Our current work investigates whether and how inhibitory neurons shape the dynamics of neuronal networks.
We are also interested in the function of not just cortical, but also sub-cortical inhibition, which we recently found to play a crucial role in auditory learning. We are testing the role of the thalamic reticular nucleus in auditory processing.
Neuronal code for complex auditory perception
Natural auditory scenes consist of complex, spectro-temporally diverse sounds. As we navigate our acoustic environment, we perform a number of functions in order to identify specific auditory objects and to understand their timing structure. Speech is an example of a complex acoustic object, which exhibits statistical regularities across the frequency spectrum and time. Whereas the auditory physiology field has acquired an excellent understanding of how individual neurons encode information about simple acoustic stimuli, our goal has been to push the currently available experimental tools and computational methods to understand how ensembles of neurons encode information about complex sounds and acoustic environments. We discovered the neuronal mechanism underlying the sensitivity of neurons in the primary auditory cortex to conspecific vocalizations (Carruthers, 2013). Extending the project to understand the transformation of vocalization representations across cortical areas, we found that neuronal populations increase invariance to basic acoustic transformations of sounds between primary and non-primary cortical areas (Carruthers, 2015). We also found that neurons in the auditory cortex exhibited adaptation to sounds with varying spectro-temporal statistical structure (Natan, 2017; Blackwell, 2016). Furthermore, we found that specific sub-cortical neuronal projections (connections to areas earlier in the sensory pathway) control sound representation (Blackwell, 2020).
Adaptation to sound statistics has long been hypothesized in neuroscience as a form of efficient coding. Whereas numerous previous studies found that neurons indeed adapt to the statistics of the environment, the connection between neuronal responses and behavior has proved more tenuous. We are currently testing the role of efficient coding in behavior by measuring how changing the statistics of background noise affects the ability of a subject to detect a target in noise, and how this perception is related to neuronal responses. We are also testing whether and how the brain adapts, neuronally and behaviorally, to complex sound patterns, such as musical melodies. You can listen to the stimuli we use with and without regularity, and to an excerpt from music by Philip Glass, who has exploited adaptation to higher-level statistics of sounds in his minimalist music.
Sound used to test adaptation to unexpected sounds.
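As a conceptual illustration of the efficient-coding idea, the sketch below implements a simple divisive gain-control model: a neuron's response is normalized by a running estimate of stimulus variance, so that response variability stays roughly constant when the stimulus contrast changes. The model, its time constant, and the stimulus parameters are illustrative assumptions, not the analysis used in our studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def adapt_gain(stimulus, tau=200.0):
    """Divisive gain control: each sample is scaled by a leaky running
    estimate of stimulus standard deviation, so the response variance
    settles to a similar level at any contrast. tau (in samples) sets
    the adaptation time constant; its value here is illustrative."""
    var_est = 1.0
    responses = np.empty_like(stimulus)
    for t, s in enumerate(stimulus):
        var_est += (s**2 - var_est) / tau    # leaky variance estimate
        responses[t] = s / np.sqrt(var_est)  # normalized response
    return responses

# Stimulus switches from low to high contrast halfway through
low = rng.normal(0, 0.5, 5000)
high = rng.normal(0, 2.0, 5000)
resp = adapt_gain(np.concatenate([low, high]))
```

After the transient at the contrast switch, the response variability in the two halves is nearly identical even though the raw stimulus contrast differs fourfold; this re-use of the neuron's limited dynamic range is the core intuition behind efficient coding.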
In the natural world, we encounter sounds not in isolation, but concurrently with visual and olfactory inputs. We have multiple projects to investigate the neuronal basis of auditory-visual and auditory-olfactory integration. Dr. Geffen’s Ph.D. work explored the mechanisms of visual (Geffen, 2007) and olfactory (Geffen, 2009) temporal processing. Because we can selectively inactivate specific pathways in the brain, we can test where and how signals are integrated across multiple sensory modalities.
Mechanisms for auditory learning and memory
When we carry out a conversation or listen to a musical piece, we are drawing on our auditory memory. How does our brain learn and remember sounds? Classical studies demonstrated that auditory learning drives plastic changes in sound-evoked responses in the auditory cortex. However, a causal relationship between these learning-driven changes in cortical activity and auditory perception had not been established. We discovered a new association between a basic form of associative learning (auditory fear conditioning) and auditory perceptual acuity (Aizenberg, 2013). We found that changes in sensory acuity depended on how specific the associative learning was to the sensory cue, and that this differential effect was strong: if learning was specific to the conditioned cue, sensory acuity improved, whereas if learning generalized over other, similar cues, sensory acuity was impaired. This bi-directionality introduced a new set of questions about how these changes are implemented and controlled by cortical circuits.
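The distinction between specific and generalized learning can be summarized with a toy generalization gradient: the conditioned response to probe tones falls off with distance from the conditioned frequency, and the width of that fall-off indexes how specific the learning was. This is a conceptual sketch; the Gaussian form and the width values are hypothetical, not the measures used in Aizenberg (2013).

```python
import numpy as np

def fear_generalization(probe_octaves, sigma):
    """Conditioned response to probe tones at a given distance (octaves)
    from the conditioned frequency, modeled as a Gaussian generalization
    gradient. sigma controls learning specificity; values are illustrative."""
    return np.exp(-probe_octaves**2 / (2 * sigma**2))

probes = np.linspace(0, 2, 100)  # distance from conditioned tone (octaves)

# Narrow gradient: cue-specific learning (associated with improved acuity)
specific = fear_generalization(probes, sigma=0.25)
# Broad gradient: generalized learning (associated with impaired acuity)
generalized = fear_generalization(probes, sigma=1.0)
```

In this picture, the half-width of the gradient provides a single number that separates the two learning regimes the paragraph describes.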
We are now studying how emotional learning is controlled by and affects plasticity in the auditory cortex using chronic two-photon imaging of neuronal responses over weeks (Wood, 2020). This project will solve a long-standing question of how neuronal plasticity contributes to learning.
Beyond the auditory cortex, we are interested in understanding the mechanisms for hearing under uncertainty (for example, when talking to someone in a crowded bar). We are testing the circuits and mechanisms the brain uses to resolve uncertainty. We are also studying the function of the cortico-striatal feedback loop in auditory learning, aiming to relate plasticity in the brain to the credit assignment problem in deep learning.