Synchronization of Speech and Gesture

Evidence for Interaction in Action

Mingyuan Chu, Peter Hagoort

Research output: Contribution to journal › Article

10 Citations (Scopus)
8 Downloads (Pure)

Abstract


Language and action systems are highly interlinked. A critical piece of evidence is that speech and its accompanying gestures are tightly synchronized. Five experiments were conducted to test 2 hypotheses about the synchronization of speech and gesture. According to the interactive view, there is continuous information exchange between the gesture and speech systems, during both their planning and execution phases. According to the ballistic view, information exchange occurs only during the planning phases of gesture and speech, but the 2 systems become independent once their execution has been initiated. In all experiments, participants were required to point to and/or name a light that had just lit up. Virtual reality and motion tracking technologies were used to disrupt their gesture or speech execution. Participants delayed their speech onset when their gesture was disrupted. They did so even when their gesture was disrupted at its late phase and even when they received only the kinesthetic feedback of their gesture. Also, participants prolonged their gestures when their speech was disrupted. These findings support the interactive view and add new constraints on models of speech and gesture production. (PsycINFO Database Record (c) 2014 APA, all rights reserved)
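
To make the contrast between the two hypotheses concrete, the sketch below simulates a single point-and-name trial under each view. This is a minimal toy model in Python; the simulate helper and all timing parameters are illustrative assumptions, not the study's design or its measured latencies.

# Toy sketch of the two hypotheses from the abstract (illustrative only:
# every timing value here is invented, not taken from the experiments).

def simulate(view, gesture_disrupted=False,
             plan_ms=200, gesture_ms=600, speech_lag_ms=400,
             recovery_ms=300):
    """Return (speech_onset, gesture_end) in ms from trial start.

    view: "interactive" or "ballistic".
    gesture_disrupted: whether gesture execution is perturbed mid-flight
        (e.g., by displaced visual feedback of the pointing hand).
    """
    gesture_end = plan_ms + gesture_ms
    speech_onset = plan_ms + speech_lag_ms

    if gesture_disrupted:
        # The perturbed gesture needs extra time to land on the target.
        gesture_end += recovery_ms
        if view == "interactive":
            # Continuous cross-talk during execution: speech waits for
            # the slowed gesture, so its onset is delayed as well.
            speech_onset += recovery_ms
        # Under the ballistic view nothing changes for speech: once both
        # plans have been launched, the two systems run independently.

    return speech_onset, gesture_end


for view in ("interactive", "ballistic"):
    print(view,
          "undisrupted:", simulate(view),
          "disrupted:", simulate(view, gesture_disrupted=True))

Running this shows a delayed speech onset under disruption only in the interactive branch, which is the pattern the experiments report; the complementary finding (gestures prolonged when speech is disrupted) would follow from the same cross-talk operating in the opposite direction.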
Original language: English
Pages (from-to): 1726-1741
Number of pages: 16
Journal: Journal of Experimental Psychology: General
Volume: 143
Issue number: 4
DOI: 10.1037/a0036281
Publication status: Published - Aug 2014

Keywords

  • gesture-speech synchronization
  • interactive view
  • pointing gesture
  • virtual reality
  • forward model

Cite this

Synchronization of Speech and Gesture: Evidence for Interaction in Action. / Chu, Mingyuan; Hagoort, Peter.

In: Journal of Experimental Psychology: General, Vol. 143, No. 4, 08.2014, p. 1726-1741.

Research output: Contribution to journal › Article

@article{4753ffcc3c42442290ab87eb7e85cce0,
title = "Synchronization of Speech and Gesture: Evidence for Interaction in Action",
abstract = "Language and action systems are highly interlinked. A critical piece of evidence is that speech and its accompanying gestures are tightly synchronized. Five experiments were conducted to test 2 hypotheses about the synchronization of speech and gesture. According to the interactive view, there is continuous information exchange between the gesture and speech systems, during both their planning and execution phases. According to the ballistic view, information exchange occurs only during the planning phases of gesture and speech, but the 2 systems become independent once their execution has been initiated. In all experiments, participants were required to point to and/or name a light that had just lit up. Virtual reality and motion tracking technologies were used to disrupt their gesture or speech execution. Participants delayed their speech onset when their gesture was disrupted. They did so even when their gesture was disrupted at its late phase and even when they received only the kinesthetic feedback of their gesture. Also, participants prolonged their gestures when their speech was disrupted. These findings support the interactive view and add new constraints on models of speech and gesture production. (PsycINFO Database Record (c) 2014 APA, all rights reserved)",
keywords = "gesture-speech synchronization, interactive view, pointing gesture, virtual reality, forward model",
author = "Mingyuan Chu and Peter Hagoort",
year = "2014",
month = "8",
doi = "10.1037/a0036281",
language = "English",
volume = "143",
pages = "1726--1741",
journal = "Journal of Experimental Psychology: General",
issn = "0096-3445",
publisher = "American Psychological Association Inc.",
number = "4",

}

TY  - JOUR
T1  - Synchronization of Speech and Gesture
T2  - Evidence for Interaction in Action
AU  - Chu, Mingyuan
AU  - Hagoort, Peter
PY  - 2014/8
Y1  - 2014/8
N2  - Language and action systems are highly interlinked. A critical piece of evidence is that speech and its accompanying gestures are tightly synchronized. Five experiments were conducted to test 2 hypotheses about the synchronization of speech and gesture. According to the interactive view, there is continuous information exchange between the gesture and speech systems, during both their planning and execution phases. According to the ballistic view, information exchange occurs only during the planning phases of gesture and speech, but the 2 systems become independent once their execution has been initiated. In all experiments, participants were required to point to and/or name a light that had just lit up. Virtual reality and motion tracking technologies were used to disrupt their gesture or speech execution. Participants delayed their speech onset when their gesture was disrupted. They did so even when their gesture was disrupted at its late phase and even when they received only the kinesthetic feedback of their gesture. Also, participants prolonged their gestures when their speech was disrupted. These findings support the interactive view and add new constraints on models of speech and gesture production. (PsycINFO Database Record (c) 2014 APA, all rights reserved)
AB  - Language and action systems are highly interlinked. A critical piece of evidence is that speech and its accompanying gestures are tightly synchronized. Five experiments were conducted to test 2 hypotheses about the synchronization of speech and gesture. According to the interactive view, there is continuous information exchange between the gesture and speech systems, during both their planning and execution phases. According to the ballistic view, information exchange occurs only during the planning phases of gesture and speech, but the 2 systems become independent once their execution has been initiated. In all experiments, participants were required to point to and/or name a light that had just lit up. Virtual reality and motion tracking technologies were used to disrupt their gesture or speech execution. Participants delayed their speech onset when their gesture was disrupted. They did so even when their gesture was disrupted at its late phase and even when they received only the kinesthetic feedback of their gesture. Also, participants prolonged their gestures when their speech was disrupted. These findings support the interactive view and add new constraints on models of speech and gesture production. (PsycINFO Database Record (c) 2014 APA, all rights reserved)
KW  - gesture-speech synchronization
KW  - interactive view
KW  - pointing gesture
KW  - virtual reality
KW  - forward model
U2  - 10.1037/a0036281
DO  - 10.1037/a0036281
M3  - Article
VL  - 143
SP  - 1726
EP  - 1741
JO  - Journal of Experimental Psychology: General
JF  - Journal of Experimental Psychology: General
SN  - 0096-3445
IS  - 4
ER  -