The auditory sense

The ear-brain connection – the key to the auditory sense


Hearing is the result of the ear-brain connection. The ear picks up sound waves and transmits the vibrations to the cochlea, where hair cells convert them into electrical signals carried to the brain by the auditory nerve. It is in the brain that sounds are processed, interpreted and stored.

To carry out these functions, the ear is divided into three parts. The external ear consists of the pinna (the visible part) and the auditory canal, which leads to the eardrum; its role is to capture, amplify and channel sounds towards the middle ear. When sound waves strike the eardrum, it begins to vibrate. These vibrations reach the middle ear, a chain of small articulated bones (the ossicles) that transmits them to a membrane called the oval window at the entrance to the inner ear. The inner ear contains the cochlea, a spiral-shaped structure of about 15,000 hair cells that transform the vibrations into electrical signals carried to the brain by the auditory nerve.

The ear is thus an extraordinarily complex and remarkable organ, capable of translating sound waves into nerve signals transmitted to the brain via the auditory nerve.

Maturation of the auditory system


Neuronal organisation in the brain follows the physiological maturation of the auditory nerve pathways. This maturation results from morphological changes in cell extensions, the formation of synaptic connections and the multiplication of dendrites, together with the biochemical activity of neurons at the transmitting and receiving ends, and the myelination of axons, which plays an important role in the speed at which nerve messages are transmitted.

A child's auditory nerve pathways are structured during the fetal period. By the seventieth day of embryonic life, the membranous labyrinth has taken its final shape, and by the sixth month of gestation the structures of the inner ear are functional, as shown by fetal hearing imaging studies [Dunn et al., 2015].

From the first hour of life, the cochlear-muscular reflexes observed are characteristic of the subcortical reflex loop. The maturation of the auditory pathways is at its peak in the first months of life. The density of nerve-cell connections in the auditory cortex is greatest between 6 and 24 months. During this period, the central auditory areas are organised into frequency and stereophonic ranges. Significant and persistent peripheral hearing loss during this time can cause irreversible impairment, which is why it is called the “critical period” [Knudsen, 2004].

The myelination involved in speech production proceeds at different rates depending on the cortical level: the prethalamic pathway completes its maturation before 12 months, whereas the postthalamic pathway does so around the fifth year of life [Loundon et al., 2018].

The fetus pricks up its ears



From the 20th week of intrauterine life, babies perceive and respond to sounds. By 25 weeks they are able to listen, as their auditory circuit is operational. The fetus, now capable of hearing, is immersed in a constant auditory environment dominated by low-frequency sounds. The rhythms of maternal body sounds and the maternal voice are therefore two essential elements to which the young organism is exposed. The late-gestation fetus can build a certain representation of spectral and melodic regularities when stimulated over long periods with complex sounds of long duration, whether carried by the maternal voice or coming from the environment [Granier-Deferre & Schaal, 2005].

By the 35th week, the baby's auditory abilities approach those of an adult [Lecanuet et al., 2000]. A memory trace of this sound background is laid down, as prosodic cues heard in utero remain in memory and are recognised for a few weeks after birth [Querleu et al., 1986].

The hearing/language interface


Hearing provides the baby with the necessary ingredients for language learning.

The newborn is able to distinguish the human voice from other noises. Between 1 and 5 months of age, intonational categorisation of sounds becomes possible: babies can recognise a syllable across different utterances and respond to changes in intonational contours. They are able to visually identify a correct auditory-articulatory association [Kuhl & Meltzoff, 1996]. Between 5 and 7 months, sensitivity to the syntactic and prosodic categories of the mother tongue develops [Imafuku et al., 2014]. At this stage, mouth movements are mapped onto vowel perception. Babies can detect prosodic cues from different languages and perceive contrasts that do not belong to their own language. Between 6 and 12 months of age, they gradually lose the ability to perceive phonemes that are not used in their mother tongue [Werker & Tees, 1984].

By the end of their first year, infants have acquired the sound characteristics of their language (rhythm, melody, the phonemes specific to their language and the rules for combining them into words). Their babbling bears the mark of the mother tongue. They are able to extract words from continuous speech and to associate the most frequent words with people, events and objects, progressively producing grammatical utterances by combining the different syntactic categories of their mother tongue [Le Normand et al.]. These early, essentially perceptual, acquisitions take place in successive stages, beginning with the lexical explosion at around 18 months and progressing to structured, intelligible speech and fully formed grammar by the age of 4.

These initial perceptual abilities, shared by all babies in the world, are rapidly reshaped by the language of the environment. In Western industrialised countries, children's language development correlates with the words adults address to them directly, not with speech merely overheard. Is this correlation universal? Research is currently underway, collecting speech from children in a wide range of cultures, to answer this question [Cristia et al., 2017].

From multi-sensory perception to multimodal communication


Children's perception of the world is multi-sensory and multimodal. The auditory-verbal function is confronted with information from the other senses. Children associate what they touch, smell, see and hear into a coherent, multimodal grasp of the world around them, which they then share, with expression and emotion, with those around them.

Multimodal approach of the child: syntactic feature – acting on the environment – throwing the ball

Multisensory perception: motor skills – body sensitivity – hearing – sight

Editor: Marie-Thérèse Le Normand