Abstract
Syntactically controlled paraphrase generation can produce diverse paraphrases by exposing syntactic control, where semantic preservation and syntactic variation are the two key factors. Previous work mainly focuses on using fine-grained syntactic structures (e.g., a full parse tree) as the syntactic control. While these methods achieve excellent syntactic controllability, they often fail to preserve the semantics of the input sentence. The main reason is that it is difficult to retrieve syntactic structures that are perfectly compatible with the input sentence. In this paper, we explore coarse-grained syntactic structures to trade off semantic preservation against syntactic variation. Furthermore, to improve semantic preservation and syntactic controllability, we propose a Syntax Attention-Guided Paraphrase (SAGP) model that can correctly select syntactic information according to the current decoding state for surface realization. Experimental results show that SAGP outperforms the previous state-of-the-art method under the same setting. Additionally, we validate that using coarse-grained structures generates more semantically reasonable text without harming syntactic controllability.
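The abstract describes SAGP as selecting syntactic information via attention according to the current decoding state. As a rough illustration only (not the paper's implementation; the function names, vector dimensions, and dot-product scoring are assumptions), a minimal sketch of attention over coarse-grained syntax embeddings might look like:

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def syntax_attention(decoder_state, syntax_embeddings):
    """Score each coarse-grained syntax embedding against the current
    decoder state (dot product), then return the attention weights and
    the weighted context vector that guides surface realization."""
    scores = [sum(h * s for h, s in zip(decoder_state, emb))
              for emb in syntax_embeddings]
    weights = softmax(scores)
    dim = len(syntax_embeddings[0])
    context = [sum(w * emb[d] for w, emb in zip(weights, syntax_embeddings))
               for d in range(dim)]
    return weights, context

# Toy usage: the state aligns with the first syntax embedding,
# so most attention mass falls on it.
weights, context = syntax_attention([1.0, 0.0],
                                    [[1.0, 0.0], [0.0, 1.0]])
```

Here the softmax weights play the role of "selecting" which piece of syntactic information is most relevant at the current step; a real model would learn the scoring function rather than use a raw dot product.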
Notes
1. Obtained using the Stanford CoreNLP [14].
2. We used the paraphrase-distilroberta-base-v1, which is available at: https://public.ukp.informatik.tu-darmstadt.de/reimers/sentence-transformers/v0.2/.
References
Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. In: ICLR 2015 (2015)
Banerjee, S., Lavie, A.: METEOR: an automatic metric for MT evaluation with improved correlation with human judgments. In: ACL Workshop, Ann Arbor, Michigan (2005)
Bao, Y., et al.: Generating sentences from disentangled syntactic and semantic spaces. In: ACL (2019)
Bowman, S.R., Vilnis, L., Vinyals, O., Dai, A., Jozefowicz, R., Bengio, S.: Generating sentences from a continuous space. In: SIGNLL, Berlin, Germany (2016)
Chen, M., Tang, Q., Wiseman, S., Gimpel, K.: Controllable paraphrase generation with a syntactic exemplar. In: ACL, Florence, Italy (2019)
Dong, L., Mallinson, J., Reddy, S., Lapata, M.: Learning to paraphrase for question answering. In: EMNLP, Copenhagen, Denmark (2017)
Gupta, A., Agarwal, A., Singh, P., Rai, P.: A deep generative framework for paraphrase generation. In: AAAI (2018). https://aaai.org/ocs/index.php/AAAI/AAAI18/paper/view/16353
Iyyer, M., Wieting, J., Gimpel, K., Zettlemoyer, L.: Adversarial example generation with syntactically controlled paraphrase networks. In: NAACL (2018)
Kumar, A., Ahuja, K., Vadapalli, R., Talukdar, P.: Syntax-guided controlled generation of paraphrases. TACL 8, 329–345 (2020). https://www.aclweb.org/anthology/2020.tacl-1.22
Li, Z., Jiang, X., Shang, L., Li, H.: Paraphrase generation with deep reinforcement learning. In: EMNLP, Brussels, Belgium (2018)
Lin, C.Y.: ROUGE: a package for automatic evaluation of summaries. In: Text Summarization Branches Out, Barcelona, Spain (2004)
Liu, M., et al.: Exploring bilingual parallel corpora for syntactically controllable paraphrase generation. In: IJCAI-20 (2020)
Luong, T., Pham, H., Manning, C.D.: Effective approaches to attention-based neural machine translation. In: EMNLP, Lisbon, Portugal (2015)
Manning, C., Surdeanu, M., Bauer, J., Finkel, J., Bethard, S., McClosky, D.: The Stanford CoreNLP natural language processing toolkit. In: ACL, Baltimore, Maryland (2014)
Papineni, K., Roukos, S., Ward, T., Zhu, W.J.: BLEU: a method for automatic evaluation of machine translation. In: ACL (2002)
Prakash, A., et al.: Neural paraphrase generation with stacked residual LSTM networks. In: COLING, pp. 2923–2934. The COLING 2016 Organizing Committee, Osaka (2016). https://www.aclweb.org/anthology/C16-1275
Reimers, N., Gurevych, I.: Sentence-BERT: sentence embeddings using Siamese BERT-networks. In: EMNLP, Hong Kong, China (2019)
See, A., Liu, P.J., Manning, C.D.: Get to the point: summarization with pointer-generator networks. In: ACL, Vancouver, Canada (2017)
Wieting, J., Gimpel, K.: ParaNMT-50M: pushing the limits of paraphrastic sentence embeddings with millions of machine translations. In: ACL, Melbourne, Australia (2018)
Zhang, X., Yang, Y., Yuan, S., Shen, D., Carin, L.: Syntax-infused variational autoencoder for text generation. In: ACL, Florence, Italy (2019)
Zhao, S., Meng, R., He, D., Saptono, A., Parmanto, B.: Integrating transformer and paraphrase rules for sentence simplification. In: EMNLP, Brussels, Belgium (2018)
Zhou, Z., Sperber, M., Waibel, A.: Paraphrases as foreign languages in multilingual neural machine translation. In: ACL: Student Research Workshop (2019)
Acknowledgments
This work is supported by the National Science Foundation of China (Contract 61876198, 61976105, 61976016).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Yang, E. et al. (2021). Explore Coarse-Grained Structures for Syntactically Controllable Paraphrase Generation. In: Wang, L., Feng, Y., Hong, Y., He, R. (eds) Natural Language Processing and Chinese Computing. NLPCC 2021. Lecture Notes in Computer Science(), vol 13028. Springer, Cham. https://doi.org/10.1007/978-3-030-88480-2_29
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-88479-6
Online ISBN: 978-3-030-88480-2