  The communicative influence of gesture and action during speech comprehension: Gestures have the upper hand [Abstract]

Kelly, S., Healey, M., Ozyurek, A., & Holler, J. (2012). The communicative influence of gesture and action during speech comprehension: Gestures have the upper hand [Abstract]. Abstracts of the Acoustics 2012 Hong Kong conference published in The Journal of the Acoustical Society of America, 131, 3311. doi:10.1121/1.4708385.


Creators:
Kelly, Spencer1, Author
Healey, Meghan2, Author
Ozyurek, Asli3, 4, 5, 6, Author
Holler, Judith5, Author
Affiliations:
1Colgate University, ou_persistent22
2National Institutes of Health, ou_persistent22
3Language in our Hands: Sign and Gesture, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_789545
4Center for Language Studies, Radboud University Nijmegen, ou_persistent22
5Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_792551
6Language in Action, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_55214

Content

Keywords: -
Abstract: Hand gestures combine with speech to form a single integrated system of meaning during language comprehension (Kelly et al., 2010). However, it is unknown whether gesture is uniquely integrated with speech or is processed like any other manual action. Thirty-one participants watched videos presenting speech with gestures or manual actions on objects. The relationship between the speech and gesture/action was either complementary (e.g., “He found the answer,” while producing a calculating gesture vs. actually using a calculator) or incongruent (e.g., the same sentence paired with the incongruent gesture/action of stirring with a spoon). Participants watched the video (prime) and then responded to a written word (target) that was or was not spoken in the video prime (e.g., “found” or “cut”). ERPs were taken to the primes (time-locked to the spoken verb, e.g., “found”) and the written targets. For primes, there was a larger frontal N400 (semantic processing) to incongruent vs. congruent items for the gesture, but not action, condition. For targets, the P2 (phonemic processing) was smaller for target words following congruent vs. incongruent gesture, but not action, primes. These findings suggest that hand gestures are integrated with speech in a privileged fashion compared to manual actions on objects.

Details

Language(s): eng - English
Date: 2012
Publication status: Published
Pages: -
Place, Publisher, Edition: -
Review method: Peer review
Identifiers: DOI: 10.1121/1.4708385
Degree: -


Source 1

Title: Abstracts of the Acoustics 2012 Hong Kong conference published in The Journal of the Acoustical Society of America
Source genre: Journal
Place, Publisher, Edition: Woodbury, NY : Acoustical Society of America through the American Institute of Physics
Pages: - Volume / Issue: 131 Article number: - Start / End page: 3311 Identifier: ISSN: 1520-9024
CoNE: https://pure.mpg.de/cone/journals/resource/991042754070048