Dr Henning Holle H.Holle@hull.ac.uk
Reader in Psychology / Leader of Cognitive and Clinical Neuroscience group (https://www.hull.ac.uk/neuroscience)
Thomas C. Gunter
Shirley-Ann Rüschemeyer
Andreas Hennenlotter
Marco Iacoboni
In communicative situations, speech is often accompanied by gestures. For example, speakers tend to illustrate certain contents of speech by means of iconic gestures, which are hand movements that bear a formal relationship to the contents of speech. The meaning of an iconic gesture is determined both by its form and by the speech context in which it is performed. Thus, gesture and speech interact in comprehension. Using fMRI, the present study investigated which brain areas are involved in this interaction process. Participants watched videos in which sentences containing an ambiguous word (e.g. "She touched the mouse") were accompanied by either a meaningless grooming movement, a gesture supporting the more frequent dominant meaning (e.g. animal), or a gesture supporting the less frequent subordinate meaning (e.g. computer device). We hypothesized that brain areas involved in the interaction of gesture and speech would show greater activation to gesture-supported sentences than to sentences accompanied by a meaningless grooming movement. The main results are that, when contrasted with grooming, both types of gestures (dominant and subordinate) activated an array of brain regions consisting of the left posterior superior temporal sulcus (STS), the inferior parietal lobule bilaterally, and the ventral precentral sulcus bilaterally. Given the crucial role of the STS in audiovisual integration processes, this activation might reflect the interaction between the meaning of the gesture and the ambiguous sentence. The activations in inferior frontal and inferior parietal regions may reflect a mechanism of determining the goal of co-speech hand movements through an observation-execution matching process.
Holle, H., Gunter, T. C., Rüschemeyer, S.-A., Hennenlotter, A., & Iacoboni, M. (2008). Neural correlates of the processing of co-speech gestures. NeuroImage, 39(4), 2010-2024. https://doi.org/10.1016/j.neuroimage.2007.10.055
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Oct 26, 2007 |
| Online Publication Date | Nov 13, 2007 |
| Publication Date | Feb 15, 2008 |
| Deposit Date | Nov 13, 2014 |
| Journal | NeuroImage |
| Print ISSN | 1053-8119 |
| Publisher | Elsevier |
| Peer Reviewed | Peer Reviewed |
| Volume | 39 |
| Issue | 4 |
| Pages | 2010-2024 |
| DOI | https://doi.org/10.1016/j.neuroimage.2007.10.055 |
| Keywords | Cognitive Neuroscience; Neurology |
| Public URL | https://hull-repository.worktribe.com/output/464722 |
| Contract Date | Nov 13, 2014 |
The Many Challenges of Human Experimental Itch Research (2023), Book Chapter
No preconscious attentional bias towards itch in healthy individuals (2022), Journal Article
Can contagious itch be affected by positive and negative suggestions? (2022), Journal Article
Human but not robotic gaze facilitates action prediction (2022), Journal Article
Acute itch induces attentional avoidance of itch-related information (2022), Journal Article