Transferring multiple text styles using CycleGAN with supervised style latent space
Author
Abstract

Text style transfer is a relevant task, contributing to theoretical and practical advances in several areas, especially when working with non-parallel data. The goal of non-parallel style transfer is to change a specific dimension of a sentence (its style) while retaining its overall content. Previous work used adversarial learning to perform this task; although adversarial learning was not initially designed for textual data, it proved very effective. Most prior work, however, has focused on algorithms that transfer between binary styles, with limited generalization and limited applications. This work proposes a framework capable of handling multiple styles while improving content retention (measured by BLEU) after a transfer. The proposed framework combines supervised learning of latent style spaces with their separation within the architecture. The results suggest that the framework improves content retention in multi-style scenarios while maintaining transfer accuracy comparable to the state of the art.
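The abstract's two ingredients, cycle-consistent reconstruction (which preserves content) and supervision of a latent style code (which enables multiple styles), can be sketched as a toy combined objective. This is a minimal illustration under assumed names and weights (`lambda_cyc`, `lambda_sty` are hypothetical), not the paper's actual implementation:

```python
import math

def cycle_consistency_loss(original, reconstructed):
    # Mean absolute difference between a sentence representation and its
    # back-transferred reconstruction: content should survive the style
    # round trip (CycleGAN-style cycle loss).
    return sum(abs(a - b) for a, b in zip(original, reconstructed)) / len(original)

def style_supervision_loss(style_logits, target_style):
    # Cross-entropy of a style classifier over the latent style code.
    # Supervising this code is what lets one model address multiple styles
    # instead of a single binary pair.
    m = max(style_logits)
    exps = [math.exp(l - m) for l in style_logits]
    total = sum(exps)
    return -math.log(exps[target_style] / total)

def total_loss(original, reconstructed, style_logits, target_style,
               lambda_cyc=10.0, lambda_sty=1.0):
    # Weighted sum of the two terms; the weights here are illustrative.
    return (lambda_cyc * cycle_consistency_loss(original, reconstructed)
            + lambda_sty * style_supervision_loss(style_logits, target_style))

# Toy example: a perfect reconstruction with a confident, correct style
# prediction yields a small total loss.
loss = total_loss([0.2, 0.5, 0.1], [0.2, 0.5, 0.1],
                  style_logits=[5.0, 0.0, 0.0], target_style=0)
```

A real system would replace the lists with encoder outputs and add the adversarial discriminator losses; the point here is only how the cycle term and the supervised style term combine into one objective.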

Year of Publication
2022
Date Published
July
Publisher
IEEE
Conference Location
Padua, Italy
ISBN Number
978-1-72818-671-9
URL
https://ieeexplore.ieee.org/document/9892978/
DOI
10.1109/IJCNN55064.2022.9892978