Co-thought and Co-speech Gestures Are Generated by the Same Action Generation Process

Mingyuan Chu, Sotaro Kita

Research output: Contribution to journal › Article

11 Citations (Scopus)
48 Downloads (Pure)

Abstract

People spontaneously gesture when they speak (co-speech gestures) and when they solve problems silently (co-thought gestures). In this study, we first explored the relationship between these 2 types of gestures and found that individuals who produced co-thought gestures more frequently also produced co-speech gestures more frequently (Experiments 1 and 2). This suggests that the 2 types of gestures are generated from the same process. We then investigated whether both types of gestures can be generated from the representational use of the action generation process that also generates purposeful actions that have a direct physical impact on the world, such as manipulating an object or locomotion (the action generation hypothesis). To this end, we examined the effect of object affordances on the production of both types of gestures (Experiments 3 and 4). We found that individuals produced co-thought and co-speech gestures more often when the stimulus objects afforded action (objects with a smooth surface) than when they did not (objects with a spiky surface). These results support the action generation hypothesis for representational gestures. However, our findings are incompatible with the hypothesis that co-speech representational gestures are solely generated from the speech production process (the speech production hypothesis). (PsycINFO Database Record (c) 2015 APA, all rights reserved)
Original language: English
Pages (from-to): 257-270
Number of pages: 14
Journal: Journal of Experimental Psychology: Learning, Memory and Cognition
Volume: 42
Issue number: 2
Early online date: 3 Aug 2015
DOIs: 10.1037/xlm0000168
Publication status: Published - Feb 2016

Keywords

  • co-speech gesture
  • co-thought gesture
  • action generation
  • speech production
  • affordance

Cite this

Co-thought and Co-speech Gestures Are Generated by the Same Action Generation Process. / Chu, Mingyuan; Kita, Sotaro.

In: Journal of Experimental Psychology: Learning, Memory and Cognition, Vol. 42, No. 2, 02.2016, p. 257-270.

Research output: Contribution to journal › Article

@article{4b73e9cd30d14147ab9e9f389afd0b0d,
title = "Co-thought and Co-speech Gestures Are Generated by the Same Action Generation Process",
abstract = "People spontaneously gesture when they speak (co-speech gestures) and when they solve problems silently (co-thought gestures). In this study, we first explored the relationship between these 2 types of gestures and found that individuals who produced co-thought gestures more frequently also produced co-speech gestures more frequently (Experiments 1 and 2). This suggests that the 2 types of gestures are generated from the same process. We then investigated whether both types of gestures can be generated from the representational use of the action generation process that also generates purposeful actions that have a direct physical impact on the world, such as manipulating an object or locomotion (the action generation hypothesis). To this end, we examined the effect of object affordances on the production of both types of gestures (Experiments 3 and 4). We found that individuals produced co-thought and co-speech gestures more often when the stimulus objects afforded action (objects with a smooth surface) than when they did not (objects with a spiky surface). These results support the action generation hypothesis for representational gestures. However, our findings are incompatible with the hypothesis that co-speech representational gestures are solely generated from the speech production process (the speech production hypothesis). (PsycINFO Database Record (c) 2015 APA, all rights reserved)",
keywords = "co-speech gesture, co-thought gesture, action generation, speech production, affordance",
author = "Mingyuan Chu and Sotaro Kita",
note = "We thank Lucy Foulkes, Rachel Furness, Valentina Lee, and Zeshu Shao for their help with data collection; Paraskevi Argyriou for her help with reliability checks of gesture coding; and Agnieszka Konopka and Josje Praamstra for their help with proofreading this article.",
year = "2016",
month = feb,
doi = "10.1037/xlm0000168",
language = "English",
volume = "42",
pages = "257--270",
journal = "Journal of Experimental Psychology: Learning, Memory and Cognition",
issn = "0278-7393",
publisher = "American Psychological Association Inc.",
number = "2",

}

TY - JOUR

T1 - Co-thought and Co-speech Gestures Are Generated by the Same Action Generation Process

AU - Chu, Mingyuan

AU - Kita, Sotaro

N1 - We thank Lucy Foulkes, Rachel Furness, Valentina Lee, and Zeshu Shao for their help with data collection; Paraskevi Argyriou for her help with reliability checks of gesture coding; and Agnieszka Konopka and Josje Praamstra for their help with proofreading this article.

PY - 2016/2

Y1 - 2016/2

AB - People spontaneously gesture when they speak (co-speech gestures) and when they solve problems silently (co-thought gestures). In this study, we first explored the relationship between these 2 types of gestures and found that individuals who produced co-thought gestures more frequently also produced co-speech gestures more frequently (Experiments 1 and 2). This suggests that the 2 types of gestures are generated from the same process. We then investigated whether both types of gestures can be generated from the representational use of the action generation process that also generates purposeful actions that have a direct physical impact on the world, such as manipulating an object or locomotion (the action generation hypothesis). To this end, we examined the effect of object affordances on the production of both types of gestures (Experiments 3 and 4). We found that individuals produced co-thought and co-speech gestures more often when the stimulus objects afforded action (objects with a smooth surface) than when they did not (objects with a spiky surface). These results support the action generation hypothesis for representational gestures. However, our findings are incompatible with the hypothesis that co-speech representational gestures are solely generated from the speech production process (the speech production hypothesis). (PsycINFO Database Record (c) 2015 APA, all rights reserved)

KW - co-speech gesture

KW - co-thought gesture

KW - action generation

KW - speech production

KW - affordance

DO - 10.1037/xlm0000168

M3 - Article

VL - 42

SP - 257

EP - 270

JO - Journal of Experimental Psychology: Learning, Memory and Cognition

JF - Journal of Experimental Psychology: Learning, Memory and Cognition

SN - 0278-7393

IS - 2

ER -