Showing 1 to 15 of 177 results
Peer reviewed
Kompus, Kristiina; Specht, Karsten; Ersland, Lars; Juvodden, Hilde T.; van Wageningen, Heidi; Hugdahl, Kenneth; Westerhausen, Rene – Brain and Language, 2012
We report fMRI and behavioral data from 113 subjects on attention and cognitive control using a variant of the classic dichotic listening paradigm with pairwise presentations of consonant-vowel syllables. The syllable stimuli were presented in a block-design while subjects were in the MR scanner. The subjects were instructed to pay attention to…
Descriptors: Attention, Cognitive Processes, Listening, Syllables
Peer reviewed
Mashal, N.; Vishne, T.; Laor, N.; Titone, D. – Brain and Language, 2013
The neural basis involved in novel metaphor comprehension in schizophrenia is relatively unknown. Fourteen people with schizophrenia and fourteen controls were scanned while they silently read novel metaphors, conventional metaphors, literal expressions, and meaningless word-pairs. People with schizophrenia showed reduced comprehension of both…
Descriptors: Cognitive Processes, Figurative Language, Brain Hemisphere Functions, Schizophrenia
Peer reviewed
Humphreys, Gina F.; Newling, Katherine; Jennings, Caroline; Gennari, Silvia P. – Brain and Language, 2013
Understanding verbs typically activates posterior temporal regions and, in some circumstances, motion perception area V5. However, the nature and role of this activation remain unclear: does language alone indeed activate V5? And are posterior temporal representations modality-specific motion representations, or supra-modal motion-independent…
Descriptors: Semantics, Sentences, Motion, Imagery
Peer reviewed
Hessler, Dorte; Jonkers, Roel; Stowe, Laurie; Bastiaanse, Roelien – Brain and Language, 2013
In the current ERP study, an active oddball task was carried out, testing pure tones and auditory, visual and audiovisual syllables. For pure tones, an MMN, an N2b, and a P3 were found, confirming traditional findings. Auditory syllables evoked an N2 and a P3. We found that the amplitude of the P3 depended on the distance between standard and…
Descriptors: Auditory Stimuli, Audiovisual Aids, Phonemes, Brain Hemisphere Functions
Peer reviewed
Wang, Lin; Zhu, Zude; Bastiaansen, Marcel; Hagoort, Peter; Yang, Yufang – Brain and Language, 2013
Unlike common nouns, person names refer to unique entities and generally have a referring function. We used event-related potentials to investigate the time course of identifying the emotional meaning of nouns and names. The emotional valence of names and nouns was manipulated separately. The results show early N1 effects in response to emotional…
Descriptors: Cognitive Processes, Word Recognition, Nouns, Brain Hemisphere Functions
Peer reviewed
Adorni, Roberta; Manfredi, Mirella; Proverbio, Alice Mado – Brain and Language, 2013
The aim of the study was to investigate the effect of both word age of acquisition (AoA) and frequency of occurrence on the timing and topographical distribution of ERP components. The processing of early- versus late-acquired words was compared with that of high-frequency versus low-frequency words. Participants were asked to perform an…
Descriptors: Word Recognition, Language Processing, Cognitive Processes, Role
Peer reviewed
Twomey, Tae; Duncan, Keith J. Kawabata; Hogan, John S.; Morita, Kenji; Umeda, Kazumasa; Sakai, Katsuyuki; Devlin, Joseph T. – Brain and Language, 2013
In Japanese, the same word can be written in either morphographic Kanji or syllabographic Hiragana, and this provides a unique opportunity to disentangle a word's lexical frequency from the frequency of its visual form--an important distinction for understanding the neural information processing in regions engaged by reading. Behaviorally,…
Descriptors: Familiarity, Japanese, Written Language, Word Frequency
Peer reviewed
Biau, Emmanuel; Soto-Faraco, Salvador – Brain and Language, 2013
Spontaneous beat gestures are an integral part of the paralinguistic context during face-to-face conversations. Here we investigated the time course of beat-speech integration in speech perception by measuring ERPs evoked by words pronounced with or without an accompanying beat gesture, while participants watched a spoken discourse. Words…
Descriptors: Auditory Perception, Interpersonal Communication, Diagnostic Tests, Brain Hemisphere Functions
Peer reviewed
Fritsch, Nathalie; Kuchinke, Lars – Brain and Language, 2013
The present study examined how contextual learning, and in particular emotionality conditioning, impacts the neural processing of words, as a possible key factor in the acquisition of words' emotional connotation. Over five consecutive days, 21 participants learned associations between meaningless pseudowords and unpleasant or neutral pictures using an…
Descriptors: Context Effect, Emotional Response, Cognitive Processes, Word Recognition
Peer reviewed
Tierney, Adam T.; Kraus, Nina – Brain and Language, 2013
Reading-impaired children have difficulty tapping to a beat. Here we tested whether this relationship between reading ability and synchronized tapping holds in typically-developing adolescents. We also hypothesized that tapping relates to two other abilities. First, since auditory-motor synchronization requires monitoring of the relationship…
Descriptors: Executive Function, Auditory Perception, Reading Ability, Correlation
Peer reviewed
Yoncheva, Yuliya N.; Maurer, Urs; Zevin, Jason D.; McCandliss, Bruce D. – Brain and Language, 2013
ERP responses to spoken words are sensitive to both rhyming effects and effects of associated spelling patterns. Are such effects automatically elicited by spoken words or dependent on selectively attending to phonology? To address this question, ERP responses to spoken word pairs were investigated under two equally demanding listening tasks that…
Descriptors: Spelling, Attention, Phonology, Word Recognition
Peer reviewed
Fedorenko, Evelina; Nieto-Castanon, Alfonso; Kanwisher, Nancy – Brain and Language, 2012
For every claim in the neuroimaging literature about a particular brain region supporting syntactic processing, there exist other claims implicating the target region in different linguistic processes, and, in many cases, in non-linguistic cognitive processes (e.g., Blumstein, 2009). We argue that traditional group analysis methods in neuroimaging…
Descriptors: Social Cognition, Brain Hemisphere Functions, Specialization, Inferences
Peer reviewed
Romagno, Domenica; Rota, Giuseppina; Ricciardi, Emiliano; Pietrini, Pietro – Brain and Language, 2012
In this study we investigated whether the human brain distinguishes between telic events that necessarily entail a specified endpoint (e.g., "reaching"), and atelic events with no delimitation or final state (e.g., "chasing"). We used functional magnetic resonance imaging to explore the patterns of neural response associated with verbs denoting…
Descriptors: Evidence, Semantics, Neurology, Brain Hemisphere Functions
Peer reviewed
Tesan, Graciela; Johnson, Blake W.; Crain, Stephen – Brain and Language, 2012
The word "any" may appear in some sentences, but not in others. For example, "any" is permitted in sentences that contain the word "nobody", as in "Nobody ate any fruit". However, in a minimally different context "any" seems strikingly anomalous: *"Everybody ate any fruit". The aim of the present study was to investigate how the brain responds to…
Descriptors: Sentences, Brain Hemisphere Functions, Anatomy, Language Usage
Peer reviewed
Oi, Misato; Saito, Hirofumi; Li, Zongfeng; Zhao, Wenjun – Brain and Language, 2013
To examine the neural mechanism of co-speech gesture production, we measured brain activity of bilinguals during an animation-narration task using near-infrared spectroscopy. The task of the participants was to watch two stories via an animated cartoon, and then narrate the contents in their first language (L1) and second language (L2),…
Descriptors: Cognitive Processes, Bilingualism, Animation, Cartoons