  • Information from Lay-Language Summaries is Embargoed Until the Conclusion of the Scientific Presentation

    637—Auditory System: Perception, Cognition, and Action - Human Studies

    Tuesday, November 12, 2013, 1:00 pm - 5:00 pm

    637.18: Neural oscillations in auditory cortex and the phase tracking of music

    Location: Halls B-H

    K. DOELLING1, *D. POEPPEL1,2;
    1Psychology, New York Univ., New York, NY; 2NYUAD Inst., Abu Dhabi, United Arab Emirates

    Abstract Body: The presence and role of neural oscillations in auditory cortex during auditory perception have elicited controversy. A growing body of data supports the hypothesis that delta-theta oscillations track the envelope of an auditory input in order to parse the stimulus. Alternatively, it has been argued that oscillatory activity is an epiphenomenon arising from onset responses to segmented auditory stimuli. One prediction stemming from the oscillatory tracking model – but not from the onset response model – is that there must be a lower limit to the trackable frequency range, below which the mechanism breaks down. Here we test this prediction using musical stimuli of varying tempi. Specifically, we test the ability of delta and theta oscillations to track stimuli whose envelopes maintain modal frequencies at ~8, ~4 and ~0.5 Hz, using ecologically valid stimuli – classical music with carefully selected properties. While undergoing MEG recording, participants listened to three clips from each of three classical piano pieces (Bach, Beethoven, Brahms; played by the same performer), one piece for each of the above modal modulation rates. Participants had to detect a pitch distortion present in 20% of the trials. Inter-trial coherence (ITC) analysis showed phase consistency for each clip in the fast- and mid-tempo pieces at their respective modulation rates (8 Hz, 4 Hz). For the slow piece, phase consistency was demonstrated at levels similar to the other conditions in peaks between 2 and 8 Hz; however, ITC was considerably lower at 0.5 Hz, the modal modulation rate of this condition. While the mid- and fast-tempo conditions had modal modulation rates within the range of the delta-theta oscillation, the slow condition appears to fall below the lower limit. We hypothesize that the system compensates by tracking the envelope with oscillations at harmonic frequencies within the possible range of the system. Thus, our findings support a prediction generated by the oscillatory tracking model and align with the concept of an active role for delta-theta oscillations in auditory perception.
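The inter-trial coherence measure used above has a compact definition: at each frequency, take each trial's spectral phase as a unit vector and measure the length of the vectors' mean across trials; 1 means perfect phase alignment, values near 0 mean random phases. A minimal sketch of that computation on simulated data (not the study's MEG recordings; the 4 Hz phase-locked component, noise level, and trial count are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250            # sampling rate in Hz (illustrative)
dur = 4.0           # trial length in seconds -> 0.25 Hz frequency bins
t = np.arange(0, dur, 1 / fs)
n_trials = 50

# Simulated trials: a 4 Hz component whose phase is locked to trial
# onset (as a stimulus-tracking oscillation would be) plus noise
# whose phase is random from trial to trial.
trials = np.array([
    np.sin(2 * np.pi * 4 * t) + rng.normal(0.0, 1.0, t.size)
    for _ in range(n_trials)
])

# ITC: normalize each trial's spectrum to unit phase vectors, average
# across trials, and take the magnitude at each frequency bin.
spec = np.fft.rfft(trials, axis=1)
unit_phase = spec / np.abs(spec)
itc = np.abs(unit_phase.mean(axis=0))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

print(itc[freqs == 4.0][0])    # high: phase-locked component
print(itc[freqs == 11.0][0])   # low: noise-only frequency
```

With this definition, a peak in ITC at the stimulus modulation rate (8 Hz or 4 Hz for the fast and mid pieces) is exactly the phase-consistency signature reported in the abstract.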

    Lay Language Summary: The experiments we report demonstrate rhythmic neural activity that matches the rhythm of musical stimuli as they are played. One critical finding is that this rhythm matching breaks down during particularly slow stimuli, pointing to a lower limit for tracking using brain rhythms.
    Previous work in our lab shows that particular brain rhythms in auditory cortex align precisely with the rhythms of speech and track the syllabic rate. Extending this finding to music broadens our understanding of the role of such neural oscillations. Namely, this activity may be involved in discretization - the isolation of individual events (e.g. syllables or notes) from a seemingly continuous sequence of sounds (e.g. sentences or musical phrases). It has been suggested that the breakdown of this mechanism could underlie some language impairments, such as dyslexia.
    To investigate how neural oscillations help parse musical stimuli, we used magnetoencephalography (MEG), which detects the magnetic field changes generated around the human skull by neuronal networks as they fire in synchrony. We played clips from three piano pieces (one each by Beethoven, Brahms and Bach) to our participants and asked them to detect small pitch distortions to keep them engaged. While the Bach and Brahms selections were of fast and mid-range tempi - ~8 and ~5 notes per second (Hz), respectively - the Beethoven piece was relatively slow (~0.5 Hz).
    MEG recordings from 12 participants (non-musicians) show that during the fast and mid-range selections (i.e. the Bach and the Brahms) neural activity oscillates at the same rhythmic rates as the musical pieces (8 Hz and 5 Hz, respectively). Interestingly, the slow piece elicited rhythmic neural activity at 2 Hz, about 4 times the rhythmic rate of the stimulus (see figure). This harmonic tracking likely reflects a lower limit of the neural mechanism: in particularly slow (< 1-2 Hz) music or speech, the brain can apparently parse the stimulus accurately only at some multiple of the stimulus rate.

    Oscillations (neural or otherwise) are the result of two temporally coordinated opposing forces. With neural oscillations, these two forces are coordinated inhibitory and excitatory signals sent from other neurons. When the music is too slow, one of these forces must push the oscillation to the next cycle before the next note can occur, resulting in multiple cycles between notes: the oscillation tracks the stimulus at a multiple of the note rate.
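The "multiple cycles between notes" idea has a simple arithmetic consequence: an oscillation at an integer multiple of the note rate returns to exactly the same phase at every note onset, so it still looks phase-consistent when sampled at the notes. A toy check (the 0.5 Hz note rate and 2 Hz oscillation mirror the slow condition above; the 1.3 Hz contrast frequency is an arbitrary non-harmonic choice for comparison):

```python
import numpy as np

note_rate = 0.5                           # notes per second (slow condition)
onsets = np.arange(0, 20, 1 / note_rate)  # note onset times in seconds

def phase_consistency(freq_hz, times):
    """Length of the mean unit phase vector of a freq_hz oscillation
    sampled at the given times: 1 = identical phase at every sample."""
    phasors = np.exp(1j * 2 * np.pi * freq_hz * times)
    return np.abs(phasors.mean())

# 2 Hz completes exactly 4 cycles between notes, so every onset lands
# at the same phase: consistency is essentially 1.
print(phase_consistency(2.0, onsets))

# A non-harmonic frequency drifts relative to the note grid, so its
# phase at the onsets is scattered and the mean vector shrinks.
print(phase_consistency(1.3, onsets))
```

This is only the geometry of harmonic sampling, not a neural model, but it shows why tracking at 2 Hz is a viable compensation strategy for a 0.5 Hz stimulus.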
    This effect is similar to a behavioral practice of musicians called subdivision: the idea that long notes can only be accurately perceived and reproduced by internally counting them as shorter beats. It is possible that this oscillatory effect of discretization may also act as a neural index of subdivision.
    Understanding the processes and limitations of discretization in auditory perception will help elucidate the deficits that occur in disorders such as dyslexia and specific language impairment. The demonstration that speech and music perception share this oscillatory discretization process also points to potential benefits of music training for dyslexic children as a more engaging intervention.
