Rhythm pervades the universe. Most natural and cultural phenomena are governed by rhythm: the waves of the sea move rhythmically, we walk rhythmically, our hearts beat rhythmically, and music as well as dance is rhythmic. While the elements that establish rhythm in different natural and cultural phenomena may vary, humans are thought to perceive rhythm whenever elements alternate or recur regularly in time [1]. Rhythm is thus vital for a wide range of cognitive abilities, from time perception and the anticipation of future events to the perception and production of spoken language, music, and dance. Yet despite considerable research in various cognitive domains, rhythmic phenomena are difficult to compare directly, and the cognitive machinery necessary for rhythm perception – and the role it plays across cognitive domains – remain poorly understood.
This project investigated how rhythms in music and language interact by exploiting the fact that the perception of auditory rhythm is often spontaneously accompanied by synchronized rhythmic motor behavior. When we listen to music, we often tap a finger or a foot to the beat; when we speak, our hands and head move in synchrony with the prosody of our utterances. How does rhythmic synchronization between language and music – both highly modular cognitive abilities – occur? How does the mind resolve the differences between the basic elements carrying rhythm in language (consonants vs. vowels) and in music (beats)? How is the production of musical rhythm synchronized to the speech signal, and does it differ from the way the production of speech or song is synchronized to musical melodies?
This project therefore had the following Objectives:
1) To investigate rhythm perception by examining how rhythms interact across two highly modular cognitive abilities – language and music.
2) To exploit rhythm synchronization as a direct online measure for comparing rhythm in language and music.
3) To determine how the human mind resolves rhythmic conflicts between speech and music.
4) To explore how the predisposition for rhythm synchronization interacts with developmental factors by studying infants at different developmental stages.
To meet these objectives, the Fellow carried out 10 experiments, tested 123 adult participants and 107 infants, and analyzed the synchronized movements to rhythmic stimuli of 160 infants who had been tested in previous experiments on rhythm perception. The project developed a new way to measure the spontaneous rhythmic coordination of perception and action by looking at the human pupil. The experimental results show that human pupils synchronize spontaneously to rhythms in speech and music – i.e., they fluctuate in size in synchrony with the perceived rhythm. This synchronization is driven by abstract rhythmic representations created in the brain. The experiments showed that these synchronized pupillary changes are not sensitive to low-level acoustic differences between speech and music, suggesting that rhythm is perceived in the two domains through shared cognitive and neural resources. The results also show that rhythmic violations, and disparities between rhythms in different domains, are detected quickly and overcome immediately. This holds for adult participants as well as for young infants during the first year of life.
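To make this kind of measure concrete, below is a minimal sketch, in Python, of the frequency-tagging logic such a pupillometry measure implies: the amplitude of pupil-size fluctuations at the stimulus rhythm frequency is compared with neighboring frequencies. The sampling rate, rhythm frequency, and neighbor count used here are illustrative assumptions, not the parameters of the project's actual pipeline.

    import numpy as np

    def pupil_rhythm_snr(pupil, fs=60.0, rhythm_hz=2.0, n_neighbors=5):
        """Signal-to-noise ratio of pupil-size fluctuations at the stimulus
        rhythm frequency. Illustrative assumptions: a 60 Hz eye-tracker and
        a 2 Hz beat. An SNR well above 1 indicates that pupil size
        oscillates at the perceived rhythm."""
        pupil = np.asarray(pupil, dtype=float)
        pupil = pupil - pupil.mean()                   # remove the DC offset
        spectrum = np.abs(np.fft.rfft(pupil))          # amplitude spectrum
        freqs = np.fft.rfftfreq(len(pupil), d=1.0 / fs)
        target = np.argmin(np.abs(freqs - rhythm_hz))  # bin closest to the rhythm
        # Noise estimate: mean amplitude of neighboring bins on both sides,
        # skipping the bins immediately adjacent to the target.
        lo = max(target - n_neighbors - 1, 0)
        neighbors = np.r_[spectrum[lo:target - 1],
                          spectrum[target + 2:target + n_neighbors + 2]]
        return spectrum[target] / neighbors.mean()

    # Synthetic demo: a noisy 2 Hz oscillation in 30 s of simulated pupil data.
    t = np.arange(0, 30, 1 / 60.0)
    trace = 0.1 * np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)
    print(pupil_rhythm_snr(trace))  # well above 1 for a synchronized trace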
The project also investigated rhythm synchronization in infants of different ages, ranging from 5 to 10 months. These ages are interesting because they correspond to different developmental milestones. For example, 5-month-old infants produce babbling sounds, are not thought to know many words, and cannot yet fully control their hands and legs. By 7 months of age, infants have acquired considerable finger dexterity for handling objects, already know many common nouns, and have developed considerable rhythmic knowledge. Finally, at the end of the first year, infants begin to utter their first words, begin to walk, and are thought to master the rhythms of their native tongue. It is therefore notable that synchronized rhythmic behavior was observed already in the youngest age group. Developmental changes in rhythm synchronization could therefore only be detected in a graded manner, with stronger effects observed for older infants and adults. This suggests that synchronized rhythm perception is a fundamental part of our cognitive repertoire from the earliest stages of development.
It has long been suggested that rhythm in language and music may to some degree share perceptual as well as cognitive resources. However, because language and music have been difficult to compare directly, most of the evidence for shared cognitive resources has been either indirect or correlational. The present project went beyond this by seeking causal evidence for a link between rhythm processing in the two domains. For example, the results show that listeners – adults and infants – produce spontaneous and synchronized motor responses to rhythms in speech and music: their pupils oscillate at the rhythm frequency. These oscillations are maintained across boundaries between musical and speech rhythm, showing that rhythm synchronization is indifferent to the domain of the rhythms. Importantly, these oscillatory pupil-size changes are so precisely timed that the results parallel findings from electrophysiological studies (e.g., EEG), suggesting that pupillometry can serve as a measure of sensorimotor synchronization in the brain.
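To illustrate how such pupillary synchronization can be quantified in the same terms used in EEG entrainment studies, here is a hedged Python sketch that computes a phase-locking value (PLV) between a pupil trace and an ideal oscillator at the rhythm frequency. The band-pass settings and filter order are assumptions chosen for the example, not the project's actual analysis.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def pupil_plv(pupil, fs=60.0, rhythm_hz=2.0, bandwidth=0.5):
        """Phase-locking value between pupil-size fluctuations and the
        stimulus rhythm: 1 means perfect phase locking, 0 means no
        consistent phase relation. Parameter values are illustrative."""
        pupil = np.asarray(pupil, dtype=float) - np.mean(pupil)
        # Band-pass the trace around the rhythm frequency (assumed bandwidth).
        low, high = rhythm_hz - bandwidth, rhythm_hz + bandwidth
        b, a = butter(2, [low / (fs / 2), high / (fs / 2)], btype="band")
        narrow = filtfilt(b, a, pupil)
        pupil_phase = np.angle(hilbert(narrow))        # instantaneous phase
        t = np.arange(len(pupil)) / fs
        stim_phase = 2 * np.pi * rhythm_hz * t         # phase of the rhythm itself
        # PLV = length of the mean unit vector of the phase differences.
        return np.abs(np.mean(np.exp(1j * (pupil_phase - stim_phase))))

    # A trace that tracks the rhythm yields a PLV near 1; pure noise stays near 0.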
The results of this project show that rhythm perception in music and spoken language can emerge from the same neuro-cognitive resources. This is important because it helps explain how early exposure to music and regular musical training can enhance linguistic skills. It also suggests that, because different languages have different rhythms, the language we speak may selectively enhance or hinder our ability to perceive rhythm in music. The results also matter because if our ability to synchronize to rhythm is linked to our linguistic abilities – dyslexic children, for example, are known to be poorer at beat synchronization – then evidence for shared rhythmic processing between language and music may help uncover the roots of language- and music-related pathologies. The novel method developed for studying rhythm synchronization may therefore provide the basis for a valuable tool for detecting language, music, and general cognitive pathologies already in young infants, who cannot be instructed to complete linguistic, musical, or cognitive tasks.
More info: https://www.researchgate.net/profile/Alan_Langus.