A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification

Ruizhe Li*, Chenghua Lin, Matthew Collinson, Xiao Li, Guanyi Chen

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Recognising dialogue acts (DA) is important for many natural language processing tasks such as dialogue generation and intention recognition. In this paper, we propose a dual-attention hierarchical recurrent neural network for DA classification. Our model is partially inspired by the observation that conversational utterances are normally associated with both a DA and a topic, where the former captures the social act and the latter describes the subject matter. However, such a dependency between DAs and topics has not been utilised by most existing systems for DA classification. With a novel dual task-specific attention mechanism, our model is able to capture, for each utterance, information about both DAs and topics, as well as the interactions between them. Experimental results show that by modelling topic as an auxiliary task, our model can significantly improve DA classification, yielding better or comparable performance to the state-of-the-art method on three public datasets.
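
The abstract describes the model only at a high level. As a rough illustration, the following minimal PyTorch sketch shows one plausible reading of the architecture: a hierarchical recurrent encoder whose word-level states feed two task-specific attention heads, one producing a DA-oriented view of each utterance and one a topic-oriented view, with the two views combined for DA prediction while the topic view drives the auxiliary topic task. All layer sizes, label counts, and the exact fusion strategy are illustrative assumptions, not the paper's configuration.

import torch
import torch.nn as nn

class DualAttentionHRNN(nn.Module):
    """Hierarchical GRU encoder with two task-specific attention heads (illustrative sketch)."""

    def __init__(self, vocab_size, emb_dim=100, hid_dim=128, n_da=42, n_topic=66):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Word-level BiGRU: encodes the tokens of each utterance.
        self.utt_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Conversation-level GRU over the sequence of utterance vectors.
        self.conv_rnn = nn.GRU(2 * hid_dim, hid_dim, batch_first=True)
        # Task-specific attention scorers over an utterance's word states.
        self.da_attn = nn.Linear(2 * hid_dim, 1)
        self.topic_attn = nn.Linear(2 * hid_dim, 1)
        # The DA head also sees the topic-attended view: one simple way to let
        # the two tasks interact (an assumption, not the paper's exact design).
        self.da_out = nn.Linear(hid_dim + 2 * hid_dim, n_da)
        self.topic_out = nn.Linear(2 * hid_dim, n_topic)

    @staticmethod
    def _attend(scorer, states):
        # states: (n_utts, n_words, 2*hid) -> one attended vector per utterance.
        weights = torch.softmax(scorer(states), dim=1)
        return (weights * states).sum(dim=1)

    def forward(self, dialogue):
        # dialogue: (n_utts, max_words) word ids for a single conversation.
        word_states, _ = self.utt_rnn(self.embedding(dialogue))
        da_view = self._attend(self.da_attn, word_states)         # DA-specific utterance vectors
        topic_view = self._attend(self.topic_attn, word_states)   # topic-specific utterance vectors
        ctx, _ = self.conv_rnn(da_view.unsqueeze(0))               # conversation-level context
        ctx = ctx.squeeze(0)
        da_logits = self.da_out(torch.cat([ctx, topic_view], dim=-1))
        topic_logits = self.topic_out(topic_view)
        return da_logits, topic_logits                             # one prediction per utterance

# Toy usage: a 5-utterance conversation, joint loss with topic as the auxiliary task.
model = DualAttentionHRNN(vocab_size=10000)
dialogue = torch.randint(1, 10000, (5, 20))
da_logits, topic_logits = model(dialogue)
loss = nn.functional.cross_entropy(da_logits, torch.randint(0, 42, (5,))) \
     + 0.5 * nn.functional.cross_entropy(topic_logits, torch.randint(0, 66, (5,)))

The auxiliary-task weight (0.5 here) and the choice of feeding the DA-specific view into the conversation-level GRU are design guesses; consult the paper for the actual formulation.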
Original language: English
Title of host publication: 2019 SIGNLL Proceedings
Publisher: ACL Anthology
Publication status: Accepted/In press - 28 Aug 2019
Event: 2019 SIGNLL Conference on Computational Natural Language Learning (CoNLL), Hong Kong
Duration: 3 Nov 2019 - 4 Nov 2019

Conference

Conference: 2019 SIGNLL Conference on Computational Natural Language Learning (CoNLL)
Country: Hong Kong
Period: 3/11/19 - 4/11/19

Keywords

  • cs.CL

Cite this

Li, R., Lin, C., Collinson, M., Li, X., & Chen, G. (Accepted/In press). A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification. In 2019 SIGNLL Proceedings. ACL Anthology.

@inproceedings{e9e45269f3544157a4e7186a26ad808b,
title = "A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification",
abstract = "Recognising dialogue acts (DA) is important for many natural language processing tasks such as dialogue generation and intention recognition. In this paper, we propose a dual-attention hierarchical recurrent neural network for DA classification. Our model is partially inspired by the observation that conversational utterances are normally associated with both a DA and a topic, where the former captures the social act and the latter describes the subject matter. However, such a dependency between DAs and topics has not been utilised by most existing systems for DA classification. With a novel dual task-specific attention mechanism, our model is able, for utterances, to capture information about both DAs and topics, as well as information about the interactions between them. Experimental results show that by modelling topic as an auxiliary task, our model can significantly improve DA classification, yielding better or comparable performance to the state-of-the-art method on three public datasets.",
keywords = "cs.CL",
author = "Ruizhe Li and Chenghua Lin and Matthew Collinson and Xiao Li and Guanyi Chen",
note = "Acknowledgment This work is supported by the award made by the UK Engineering and Physical Sciences Research Council (Grant number: EP/P011829/1).",
year = "2019",
month = "8",
day = "28",
language = "English",
booktitle = "2019 SIGNLL Proceedings",
publisher = "ACL Anthology",

}

TY - GEN

T1 - A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification

AU - Li, Ruizhe

AU - Lin, Chenghua

AU - Collinson, Matthew

AU - Li, Xiao

AU - Chen, Guanyi

N1 - Acknowledgment: This work is supported by the award made by the UK Engineering and Physical Sciences Research Council (Grant number: EP/P011829/1).

PY - 2019/8/28

Y1 - 2019/8/28

N2 - Recognising dialogue acts (DA) is important for many natural language processing tasks such as dialogue generation and intention recognition. In this paper, we propose a dual-attention hierarchical recurrent neural network for DA classification. Our model is partially inspired by the observation that conversational utterances are normally associated with both a DA and a topic, where the former captures the social act and the latter describes the subject matter. However, such a dependency between DAs and topics has not been utilised by most existing systems for DA classification. With a novel dual task-specific attention mechanism, our model is able, for utterances, to capture information about both DAs and topics, as well as information about the interactions between them. Experimental results show that by modelling topic as an auxiliary task, our model can significantly improve DA classification, yielding better or comparable performance to the state-of-the-art method on three public datasets.

AB - Recognising dialogue acts (DA) is important for many natural language processing tasks such as dialogue generation and intention recognition. In this paper, we propose a dual-attention hierarchical recurrent neural network for DA classification. Our model is partially inspired by the observation that conversational utterances are normally associated with both a DA and a topic, where the former captures the social act and the latter describes the subject matter. However, such a dependency between DAs and topics has not been utilised by most existing systems for DA classification. With a novel dual task-specific attention mechanism, our model is able, for utterances, to capture information about both DAs and topics, as well as information about the interactions between them. Experimental results show that by modelling topic as an auxiliary task, our model can significantly improve DA classification, yielding better or comparable performance to the state-of-the-art method on three public datasets.

KW - cs.CL

UR - https://www.aclweb.org/anthology/

M3 - Conference contribution

BT - 2019 SIGNLL Proceedings

PB - ACL Anthology

ER -