
Released

Journal Article

The communicative influence of gesture and action during speech comprehension: Gestures have the upper hand [Abstract]

MPG Authors
/persons/resource/persons142

Ozyurek, Asli
Language in our Hands: Sign and Gesture, MPI for Psycholinguistics, Max Planck Society;
Center for Language Studies, Radboud University Nijmegen;
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;
Language in Action, MPI for Psycholinguistics, Max Planck Society;
Multimodal Language and Cognition, Radboud University Nijmegen, External Organizations;

/persons/resource/persons4512

Holler, Judith
Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society;
Communication in Social Interaction, Radboud University Nijmegen, External Organizations;

External Resources
No external resources have been deposited.
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)
No freely accessible full texts are available in PuRe.
Supplementary material (freely accessible)
No freely accessible supplementary materials are available.
Citation

Kelly, S., Healey, M., Ozyurek, A., & Holler, J. (2012). The communicative influence of gesture and action during speech comprehension: Gestures have the upper hand [Abstract]. Abstracts of the Acoustics 2012 Hong Kong conference published in The Journal of the Acoustical Society of America, 131, 3311. doi:10.1121/1.4708385.


Citation link: https://hdl.handle.net/11858/00-001M-0000-000F-A6C9-0
Abstract
Hand gestures combine with speech to form a single integrated system of meaning during language comprehension (Kelly et al., 2010). However, it is unknown whether gesture is uniquely integrated with speech or is processed like any other manual action. Thirty-one participants watched videos presenting speech with gestures or manual actions on objects. The relationship between speech and gesture/action was either complementary (e.g., “He found the answer,” while producing a calculating gesture vs. actually using a calculator) or incongruent (e.g., the same sentence paired with the incongruent gesture/action of stirring with a spoon). Participants watched the video (prime) and then responded to a written word (target) that was or was not spoken in the video prime (e.g., “found” or “cut”). ERPs were recorded to the primes (time-locked to the spoken verb, e.g., “found”) and to the written targets. For primes, there was a larger frontal N400 (indexing semantic processing) for incongruent vs. congruent items in the gesture, but not the action, condition. For targets, the P2 (indexing phonemic processing) was smaller for target words following congruent vs. incongruent gesture, but not action, primes. These findings suggest that hand gestures are integrated with speech in a privileged fashion compared to manual actions on objects.
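
To make the dependent measure concrete, the following minimal Python sketch shows how a frontal N400 congruency effect of the kind reported here could be quantified across participants. It is a hypothetical reconstruction, not the authors' analysis pipeline: the sampling rate, baseline length, 300-500 ms window, channel indices, and data are all illustrative assumptions.

import numpy as np
from scipy.stats import ttest_rel

SFREQ = 500                      # assumed sampling rate (Hz)
T0 = 100                         # assumed pre-stimulus baseline (samples)
WIN = slice(T0 + int(0.300 * SFREQ), T0 + int(0.500 * SFREQ))  # ~N400 window
FRONTAL = [0, 1, 2]              # placeholder indices for frontal electrodes

def n400_amplitude(epochs):
    # epochs: (n_trials, n_channels, n_times) voltage array for one subject
    # and condition; return the mean voltage over frontal channels in the
    # assumed 300-500 ms window, averaged across trials.
    return epochs[:, FRONTAL, WIN].mean()

# Random stand-in data: 31 subjects, 40 trials, 32 channels, 400 samples.
rng = np.random.default_rng(0)
congruent = [n400_amplitude(rng.normal(size=(40, 32, 400))) for _ in range(31)]
incongruent = [n400_amplitude(rng.normal(size=(40, 32, 400))) for _ in range(31)]

# Paired comparison across the 31 participants; a reliably more negative
# mean amplitude for incongruent items would correspond to the larger
# frontal N400 reported for the gesture (but not action) condition.
t, p = ttest_rel(incongruent, congruent)
print(f"t(30) = {t:.2f}, p = {p:.3f}")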