
Cycle-Consistent Adversarial Autoencoders for Unsupervised Text Style Transfer

Yufang Huang, Wentao Zhu, Deyi Xiong, Yiye Zhang, Changjian Hu, Feiyu Xu


Abstract
Unsupervised text style transfer is challenging due to the lack of parallel data and difficulties in content preservation. In this paper, we propose a novel neural approach to unsupervised text style transfer which we refer to as Cycle-consistent Adversarial autoEncoders (CAE) trained from non-parallel data. CAE consists of three essential components: (1) LSTM autoencoders that encode a text in one style into its latent representation and decode an encoded representation into its original text or a transferred representation into a style-transferred text, (2) adversarial style transfer networks that use an adversarially trained generator to transform a latent representation in one style into a representation in another style, and (3) a cycle-consistent constraint that enhances the capacity of the adversarial style transfer networks in content preservation. The entire CAE with these three components can be trained end-to-end. Extensive experiments and in-depth analyses on two widely-used public datasets consistently validate the effectiveness of the proposed CAE in both style transfer and content preservation against several strong baselines in terms of four automatic evaluation metrics and human evaluation.
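The cycle-consistent constraint in component (3) can be illustrated with a minimal numpy sketch. The linear maps `G_xy` and `G_yx` below are hypothetical stand-ins for the paper's adversarially trained transfer generators, and the vectors stand in for LSTM encoder states; the point is only to show the round-trip reconstruction objective, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # latent dimension (hypothetical; the paper uses LSTM encoder states)

# Hypothetical linear "generators" standing in for the adversarial style
# transfer networks: G_xy maps a style-X latent to a style-Y latent, and
# G_yx maps it back. Here G_yx is the exact inverse, so the cycle is perfect.
G_xy = np.eye(d) + 0.1 * rng.normal(size=(d, d))
G_yx = np.linalg.inv(G_xy)

def cycle_loss(z, fwd, bwd):
    """L2 cycle-consistency loss: penalize the round-trip reconstruction
    error bwd(fwd(z)) vs. the original latent z."""
    z_cycled = bwd @ (fwd @ z)
    return float(np.mean((z_cycled - z) ** 2))

z_x = rng.normal(size=d)  # latent code of a style-X sentence
loss = cycle_loss(z_x, G_xy, G_yx)
print(loss)  # near zero, since G_yx inverts G_xy exactly
```

In training, this loss term would be minimized jointly with the autoencoder reconstruction and adversarial losses, pushing the transfer networks to preserve content while changing style.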
Anthology ID:
2020.coling-main.201
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2213–2223
URL:
https://aclanthology.org/2020.coling-main.201
DOI:
10.18653/v1/2020.coling-main.201
Cite (ACL):
Yufang Huang, Wentao Zhu, Deyi Xiong, Yiye Zhang, Changjian Hu, and Feiyu Xu. 2020. Cycle-Consistent Adversarial Autoencoders for Unsupervised Text Style Transfer. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2213–2223, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Cycle-Consistent Adversarial Autoencoders for Unsupervised Text Style Transfer (Huang et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.201.pdf