Beat that Word

How Listeners Integrate Beat Gesture and Focus in Multimodal Speech Discourse

Diana Dimitrova, Mingyuan Chu, Lin Wang, Asli Özyürek, Peter Hagoort

Research output: Contribution to journal › Article


Abstract

Communication is facilitated when listeners allocate their attention to important information (focus) in the message, a process called “information structure.” Linguistic cues like the preceding context and pitch accent help listeners to identify focused information. In multimodal communication, relevant information can be emphasized by nonverbal cues like beat gestures, which represent rhythmic nonmeaningful hand movements. Recent studies have found that linguistic and nonverbal attention cues are integrated independently in single sentences. However, it is possible that these two cues interact when information is embedded in context, because context allows listeners to predict what information is important. In an ERP study, we tested this hypothesis and asked listeners to view videos capturing a dialogue. In the critical sentence, focused and nonfocused words were accompanied by beat gestures, grooming hand movements, or no gestures. ERP results showed that focused words are processed more attentively than nonfocused words as reflected in an N1 and P300 component. Hand movements also captured attention and elicited a P300 component. Importantly, beat gesture and focus interacted in a late time window of 600–900 msec relative to target word onset, giving rise to a late positivity when nonfocused words were accompanied by beat gestures. Our results show that listeners integrate beat gesture with the focus of the message and that integration costs arise when beat gesture falls on nonfocused information. This suggests that beat gestures fulfill a unique focusing function in multimodal discourse processing and that they have to be integrated with the information structure of the message.

Original language: English
Pages (from-to): 1255-1269
Number of pages: 15
Journal: Journal of Cognitive Neuroscience
Volume: 28
Issue number: 9
Early online date: 29 Jul 2016
DOIs: 10.1162/jocn_a_00963
Publication status: Published - Sep 2016

Cite this

Beat that Word: How Listeners Integrate Beat Gesture and Focus in Multimodal Speech Discourse. / Dimitrova, Diana; Chu, Mingyuan; Wang, Lin; Özyürek, Asli; Hagoort, Peter.

In: Journal of Cognitive Neuroscience, Vol. 28, No. 9, 09.2016, p. 1255-1269.

Research output: Contribution to journal › Article

Dimitrova, Diana; Chu, Mingyuan; Wang, Lin; Özyürek, Asli; Hagoort, Peter. / Beat that Word: How Listeners Integrate Beat Gesture and Focus in Multimodal Speech Discourse. In: Journal of Cognitive Neuroscience. 2016; Vol. 28, No. 9. pp. 1255-1269.
@article{faf0cb7c27984bb09f844d3ed7521c63,
title = "Beat that Word: How Listeners Integrate Beat Gesture and Focus in Multimodal Speech Discourse",
abstract = "Communication is facilitated when listeners allocate their attention to important information (focus) in the message, a process called “information structure.” Linguistic cues like the preceding context and pitch accent help listeners to identify focused information. In multimodal communication, relevant information can be emphasized by nonverbal cues like beat gestures, which represent rhythmic nonmeaningful hand movements. Recent studies have found that linguistic and nonverbal attention cues are integrated independently in single sentences. However, it is possible that these two cues interact when information is embedded in context, because context allows listeners to predict what information is important. In an ERP study, we tested this hypothesis and asked listeners to view videos capturing a dialogue. In the critical sentence, focused and nonfocused words were accompanied by beat gestures, grooming hand movements, or no gestures. ERP results showed that focused words are processed more attentively than nonfocused words as reflected in an N1 and P300 component. Hand movements also captured attention and elicited a P300 component. Importantly, beat gesture and focus interacted in a late time window of 600–900 msec relative to target word onset, giving rise to a late positivity when nonfocused words were accompanied by beat gestures. Our results show that listeners integrate beat gesture with the focus of the message and that integration costs arise when beat gesture falls on nonfocused information. This suggests that beat gestures fulfill a unique focusing function in multimodal discourse processing and that they have to be integrated with the information structure of the message.",
author = "Diana Dimitrova and Mingyuan Chu and Lin Wang and Asli {\"O}zy{\"u}rek and Peter Hagoort",
year = "2016",
month = sep,
doi = "10.1162/jocn_a_00963",
language = "English",
volume = "28",
pages = "1255--1269",
journal = "Journal of Cognitive Neuroscience",
issn = "0898-929X",
publisher = "MIT Press Journals",
number = "9",
}

TY - JOUR

T1 - Beat that Word

T2 - How Listeners Integrate Beat Gesture and Focus in Multimodal Speech Discourse

AU - Dimitrova, Diana

AU - Chu, Mingyuan

AU - Wang, Lin

AU - Özyürek, Asli

AU - Hagoort, Peter

PY - 2016/9

Y1 - 2016/9

N2 - Communication is facilitated when listeners allocate their attention to important information (focus) in the message, a process called “information structure.” Linguistic cues like the preceding context and pitch accent help listeners to identify focused information. In multimodal communication, relevant information can be emphasized by nonverbal cues like beat gestures, which represent rhythmic nonmeaningful hand movements. Recent studies have found that linguistic and nonverbal attention cues are integrated independently in single sentences. However, it is possible that these two cues interact when information is embedded in context, because context allows listeners to predict what information is important. In an ERP study, we tested this hypothesis and asked listeners to view videos capturing a dialogue. In the critical sentence, focused and nonfocused words were accompanied by beat gestures, grooming hand movements, or no gestures. ERP results showed that focused words are processed more attentively than nonfocused words as reflected in an N1 and P300 component. Hand movements also captured attention and elicited a P300 component. Importantly, beat gesture and focus interacted in a late time window of 600–900 msec relative to target word onset, giving rise to a late positivity when nonfocused words were accompanied by beat gestures. Our results show that listeners integrate beat gesture with the focus of the message and that integration costs arise when beat gesture falls on nonfocused information. This suggests that beat gestures fulfill a unique focusing function in multimodal discourse processing and that they have to be integrated with the information structure of the message.

AB - Communication is facilitated when listeners allocate their attention to important information (focus) in the message, a process called “information structure.” Linguistic cues like the preceding context and pitch accent help listeners to identify focused information. In multimodal communication, relevant information can be emphasized by nonverbal cues like beat gestures, which represent rhythmic nonmeaningful hand movements. Recent studies have found that linguistic and nonverbal attention cues are integrated independently in single sentences. However, it is possible that these two cues interact when information is embedded in context, because context allows listeners to predict what information is important. In an ERP study, we tested this hypothesis and asked listeners to view videos capturing a dialogue. In the critical sentence, focused and nonfocused words were accompanied by beat gestures, grooming hand movements, or no gestures. ERP results showed that focused words are processed more attentively than nonfocused words as reflected in an N1 and P300 component. Hand movements also captured attention and elicited a P300 component. Importantly, beat gesture and focus interacted in a late time window of 600–900 msec relative to target word onset, giving rise to a late positivity when nonfocused words were accompanied by beat gestures. Our results show that listeners integrate beat gesture with the focus of the message and that integration costs arise when beat gesture falls on nonfocused information. This suggests that beat gestures fulfill a unique focusing function in multimodal discourse processing and that they have to be integrated with the information structure of the message.

U2 - 10.1162/jocn_a_00963

DO - 10.1162/jocn_a_00963

M3 - Article

VL - 28

SP - 1255

EP - 1269

JO - Journal of Cognitive Neuroscience

JF - Journal of Cognitive Neuroscience

SN - 0898-929X

IS - 9

ER -