Abstract
We propose DGST, a novel and simple Dual-Generator network architecture for text Style Transfer. Our model employs only two generators and relies on neither discriminators nor parallel corpora for training. Both quantitative and qualitative experiments on the Yelp and IMDb datasets show that our model achieves competitive performance compared with several strong baselines that have more complicated architecture designs.
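The dual-generator idea behind the abstract can be illustrated with a toy sketch: two mutually inverse style-transfer functions checked against each other through a cycle-style reconstruction signal, with no discriminator involved. The word-substitution "generators" and all names below are illustrative stand-ins, not the paper's neural models or training procedure.

```python
# Toy illustration of a dual-generator cycle for text style transfer.
# Two "generators" map between styles; transferring and then transferring
# back should reconstruct the input, which supplies a training signal
# without any discriminator. All mappings here are hypothetical examples.

POS2NEG = {"good": "bad", "great": "terrible", "love": "hate"}
NEG2POS = {v: k for k, v in POS2NEG.items()}

def g_pos_to_neg(tokens):
    """Stand-in for the positive-to-negative generator."""
    return [POS2NEG.get(t, t) for t in tokens]

def g_neg_to_pos(tokens):
    """Stand-in for the negative-to-positive generator."""
    return [NEG2POS.get(t, t) for t in tokens]

def cycle_loss(x, forward, backward):
    """Token-level mismatch between x and its round-trip backward(forward(x))."""
    recon = backward(forward(x))
    return sum(a != b for a, b in zip(x, recon))

x = "the food was good and i love it".split()
y = g_pos_to_neg(x)                               # style-transferred sentence
loss = cycle_loss(x, g_pos_to_neg, g_neg_to_pos)  # 0: perfect round trip
```

In the actual model the two generators are learned sequence transducers and the reconstruction signal is a differentiable loss; this sketch only shows why two generators can supervise each other without parallel data.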
| Original language | English |
| --- | --- |
| Title of host publication | EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 7131-7136 |
| Number of pages | 6 |
| ISBN (Electronic) | 9781952148606 |
| Publication status | Published - 2020 |
| Event | 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 - Virtual, Online. Duration: 16 Nov 2020 → 20 Nov 2020 |
Conference
| Conference | 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 |
| --- | --- |
| City | Virtual, Online |
| Period | 16/11/20 → 20/11/20 |
Bibliographical note
Funding Information: We would like to thank all the anonymous reviewers for their insightful comments. This work is supported by an award from the UK Engineering and Physical Sciences Research Council (Grant number: EP/P011829/1).
Publisher Copyright:
© 2020 Association for Computational Linguistics.