Showing 1 to 15 of 69 results
Peer reviewed
Prendergast, Garreth; Green, Gary G. R. – Brain and Language, 2012
Classical views of speech perception argue that the static and dynamic characteristics of spectral energy peaks (formants) are the acoustic features that underpin phoneme recognition. Here we use representations in which the amplitude modulations of sub-band filtered speech are described precisely in terms of co-sinusoidal pulses. These pulses are…
Descriptors: Auditory Perception, Acoustics, Comprehension, Artificial Speech
Peer reviewed
Partanen, Marita; Fitzpatrick, Kevin; Madler, Burkhard; Edgell, Dorothy; Bjornson, Bruce; Giaschi, Deborah E. – Brain and Language, 2012
The current study examined auditory processing deficits in dyslexia using a dichotic pitch stimulus and functional MRI. Cortical activation by the dichotic pitch task occurred in bilateral Heschl's gyri, right planum temporale, and right superior temporal sulcus. Adolescents with dyslexia, relative to age-matched controls, illustrated greater…
Descriptors: Dyslexia, Auditory Perception, Acoustics, Adolescents
Peer reviewed
Biau, Emmanuel; Soto-Faraco, Salvador – Brain and Language, 2013
Spontaneous beat gestures are an integral part of the paralinguistic context during face-to-face conversations. Here we investigated the time course of beat-speech integration in speech perception by measuring ERPs evoked by words pronounced with or without an accompanying beat gesture, while participants watched a spoken discourse. Words…
Descriptors: Auditory Perception, Interpersonal Communication, Diagnostic Tests, Brain Hemisphere Functions
Peer reviewed
Tierney, Adam T.; Kraus, Nina – Brain and Language, 2013
Reading-impaired children have difficulty tapping to a beat. Here we tested whether this relationship between reading ability and synchronized tapping holds in typically-developing adolescents. We also hypothesized that tapping relates to two other abilities. First, since auditory-motor synchronization requires monitoring of the relationship…
Descriptors: Executive Function, Auditory Perception, Reading Ability, Correlation
Peer reviewed
Yoncheva, Yuliya N.; Maurer, Urs; Zevin, Jason D.; McCandliss, Bruce D. – Brain and Language, 2013
ERP responses to spoken words are sensitive to both rhyming effects and effects of associated spelling patterns. Are such effects automatically elicited by spoken words or dependent on selectively attending to phonology? To address this question, ERP responses to spoken word pairs were investigated under two equally demanding listening tasks that…
Descriptors: Spelling, Attention, Phonology, Word Recognition
Peer reviewed
Murakami, Takenobu; Restle, Julia; Ziemann, Ulf – Brain and Language, 2012
A left-hemispheric cortico-cortical network involving areas of the temporoparietal junction (Tpj) and the posterior inferior frontal gyrus (pIFG) is thought to support sensorimotor integration of speech perception into articulatory motor activation, but how this network links with the lip area of the primary motor cortex (M1) during speech…
Descriptors: Brain Hemisphere Functions, Auditory Perception, Lateral Dominance, Sensory Integration
Peer reviewed
Osnes, Berge; Hugdahl, Kenneth; Hjelmervik, Helene; Specht, Karsten – Brain and Language, 2012
In studies on auditory speech perception, participants are often asked to perform active tasks, e.g. decide whether the perceived sound is a speech sound or not. However, information about the stimulus, inherent in such tasks, may induce expectations that cause altered activations not only in the auditory cortex, but also in frontal areas such as…
Descriptors: Music, Auditory Perception, Speech Communication, Brain
Peer reviewed
Hertrich, Ingo; Dietrich, Susanne; Ackermann, Hermann – Brain and Language, 2013
Blind people can learn to understand speech at ultra-high syllable rates (ca. 20 syllables/s), a capability associated with hemodynamic activation of the central-visual system. To further elucidate the neural mechanisms underlying this skill, magnetoencephalographic (MEG) measurements during listening to sentence utterances were cross-correlated…
Descriptors: Syllables, Oral Language, Blindness, Language Processing
Peer reviewed
Wagner, Monica; Shafer, Valerie L.; Martin, Brett; Steinschneider, Mitchell – Brain and Language, 2012
The effect of exposure to the contextual features of the /pt/ cluster was investigated in native-English and native-Polish listeners using behavioral and event-related potential (ERP) methodology. Both groups experience the /pt/ cluster in their languages, but only the Polish group experiences the cluster in the context of word onset examined in…
Descriptors: Evidence, Phonology, Polish, Phonemes
Peer reviewed
Kryuchkova, Tatiana; Tucker, Benjamin V.; Wurm, Lee H.; Baayen, R. Harald – Brain and Language, 2012
Visual emotionally charged stimuli have been shown to elicit early electrophysiological responses (e.g., Ihssen, Heim, & Keil, 2007; Schupp, Junghofer, Weike, & Hamm, 2003; Stolarova, Keil, & Moratti, 2006). We presented isolated words to listeners, and observed, using generalized additive modeling, oscillations in the upper part of the delta…
Descriptors: Evidence, Visual Perception, Language Processing, Auditory Perception
Peer reviewed
Scharinger, Mathias; Merickel, Jennifer; Riley, Joshua; Idsardi, William J. – Brain and Language, 2011
Speech sounds can be classified on the basis of their underlying articulators or on the basis of the acoustic characteristics resulting from particular articulatory positions. Research in speech perception suggests that distinctive features are based on both articulatory and acoustic information. In recent years, neuroelectric and neuromagnetic…
Descriptors: Investigations, Articulation (Speech), Auditory Perception, Acoustics
Peer reviewed
Brunelliere, Angele; Soto-Faraco, Salvador – Brain and Language, 2013
This study investigates the specificity of predictive coding in spoken word comprehension using event-related potentials (ERPs). We measured word-evoked ERPs in Catalan speakers listening to semantically constraining sentences produced in their native regional accent (Experiment 1) or in a non-native accent (Experiment 2). Semantically anomalous…
Descriptors: Semantics, Word Recognition, Auditory Perception, Sentences
Peer reviewed
Tomaschek, Fabian; Truckenbrodt, Hubert; Hertrich, Ingo – Brain and Language, 2013
Recent experiments showed that the perception of vowel length by German listeners exhibits the characteristics of categorical perception. The present study sought to find the neural activity reflecting categorical vowel length and the short-long boundary by examining the processing of non-contrastive durations and categorical length using MEG.…
Descriptors: Language Processing, Brain Hemisphere Functions, Auditory Perception, Syllables
Peer reviewed
Flinker, A.; Chang, E. F.; Barbaro, N. M.; Berger, M. S.; Knight, R. T. – Brain and Language, 2011
The human temporal lobe is well known to be critical for language comprehension. Previous physiological research has focused mainly on non-invasive neuroimaging and electrophysiological techniques with each approach requiring averaging across many trials and subjects. The results of these studies have implicated extended anatomical regions in…
Descriptors: Evidence, Stimuli, Phonemes, Auditory Perception
Peer reviewed
Vukovic, Mile; Sujic, Radmila; Petrovic-Lazic, Mirjana; Miller, Nick; Milutinovic, Dejan; Babac, Snezana; Vukovic, Irena – Brain and Language, 2012
Phonation is a fundamental feature of human communication. Control of phonation in the context of speech-language disturbances has traditionally been considered a characteristic of lesions to subcortical structures and pathways. Evidence suggests, however, that cortical lesions may also implicate phonation. We carried out acoustic and perceptual…
Descriptors: Evidence, Articulation (Speech), Aphasia, Neurological Impairments