The communicative influence of gesture and action during speech comprehension: Gestures have the upper hand [Abstract]

Kelly, S., Healey, M., Ozyurek, A., & Holler, J. (2012). The communicative influence of gesture and action during speech comprehension: Gestures have the upper hand [Abstract]. Abstracts of the Acoustics 2012 Hong Kong conference published in The Journal of the Acoustical Society of America, 131, 3311. doi:10.1121/1.4708385.

Creators:
Kelly, Spencer1, Author
Healey, Meghan2, Author
Ozyurek, Asli3, 4, 5, 6, Author           
Holler, Judith5, Author           
Affiliations:
1Colgate University, ou_persistent22              
2National Institutes of Health, ou_persistent22
3Language in our Hands: Sign and Gesture, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_789545              
4Center for language studies, Radboud University Nijmegen, ou_persistent22              
5Neurobiology of Language Department, MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_792551              
6Language in Action , MPI for Psycholinguistics, Max Planck Society, Nijmegen, NL, ou_55214              

Content

Free keywords: -
 Abstract: Hand gestures combine with speech to form a single integrated system of meaning during language comprehension (Kelly et al., 2010). However, it is unknown whether gesture is uniquely integrated with speech or is processed like any other manual action. Thirty-one participants watched videos presenting speech with gestures or manual actions on objects. The relationship between the speech and gesture/action was either complementary (e.g., “He found the answer,” while producing a calculating gesture vs. actually using a calculator) or incongruent (e.g., the same sentence paired with the incongruent gesture/action of stirring with a spoon). Participants watched the video (prime) and then responded to a written word (target) that was or was not spoken in the video prime (e.g., “found” or “cut”). ERPs were taken to the primes (time-locked to the spoken verb, e.g., “found”) and the written targets. For primes, there was a larger frontal N400 (semantic processing) to incongruent vs. congruent items for the gesture, but not action, condition. For targets, the P2 (phonemic processing) was smaller for target words following congruent vs. incongruent gesture, but not action, primes. These findings suggest that hand gestures are integrated with speech in a privileged fashion compared to manual actions on objects.

Details

Language(s): eng - English
 Dates: 2012
 Publication Status: Issued
 Pages: -
 Publishing info: -
 Table of Contents: -
 Rev. Type: Peer
 Identifiers: DOI: 10.1121/1.4708385
 Degree: -


Source 1

Title: Abstracts of the Acoustics 2012 Hong Kong conference published in The Journal of the Acoustical Society of America
Source Genre: Journal
Publ. Info: Woodbury, NY : Acoustical Society of America through the American Institute of Physics
Pages: -
Volume / Issue: 131
Sequence Number: -
Start / End Page: 3311
Identifier: ISSN: 1520-9024
CoNE: https://pure.mpg.de/cone/journals/resource/991042754070048