Prosodic encoding of information structure in Mandarin Chinese: Evidence from a picture description task

Yifei Bi, Lesya Y. Ganushchak, Agnieszka E. Konopka, Guiqin Ren, Xue Sui, Yiya Chen

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)

Abstract

This study investigates the extent to which prosodic cues are employed during online sentence production to distinguish three notions of information structure (informational focus, corrective focus, and givenness) at two sentential focus locations (i.e., the subject and object positions). Participants were asked to describe pictures. The information status of the subject and object characters was manipulated in the discourse preceding the presentation of each picture. Acoustic data (including duration, F0, and intensity) from 65 participants were analysed. Results showed consistent acoustic differences between corrective focus and givenness, confirming the findings from reading and semi-controlled production tasks. Contrary to previous studies, our results showed no durational or F0 differences between informational focus and givenness, and no differences in intensity range between informational focus and corrective focus. Moreover, our results showed that the sentential focus location of the target word also had an impact on the prosodic encoding of information structure in natural utterance production.

Original language: English
Pages (from-to): 726-730
Number of pages: 5
Journal: Proceedings of the International Conference on Speech Prosody
Volume: 2016
Issue number: January
Publication status: Published - 31 Dec 2016
Event: 8th Speech Prosody 2016 - Boston University, Boston, United States
Duration: 31 May 2016 - 3 Jun 2016
Conference number: 122553

Keywords

  • Information structure
  • Mandarin Chinese
  • Natural utterance production
  • Prosodic encoding
