TY - GEN
T1 - Paraphrase Generation: A Survey of the State of the Art
T2 - 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021
AU - Zhou, Jianing
AU - Bhat, Suma
N1 - Publisher Copyright:
© 2021 Association for Computational Linguistics
PY - 2021
Y1 - 2021
N2 - This paper focuses on paraphrase generation, a widely studied natural language generation task in NLP. With the development of neural models, paraphrase generation research has exhibited a gradual shift to neural methods in recent years. This has provided architectures for the contextualized representation of an input text and for generating fluent, diverse, and human-like paraphrases. This paper surveys various approaches to paraphrase generation with a main focus on neural methods.
AB - This paper focuses on paraphrase generation, a widely studied natural language generation task in NLP. With the development of neural models, paraphrase generation research has exhibited a gradual shift to neural methods in recent years. This has provided architectures for the contextualized representation of an input text and for generating fluent, diverse, and human-like paraphrases. This paper surveys various approaches to paraphrase generation with a main focus on neural methods.
UR - http://www.scopus.com/inward/record.url?scp=85127424060&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85127424060&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85127424060
T3 - EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
SP - 5075
EP - 5086
BT - EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
PB - Association for Computational Linguistics (ACL)
Y2 - 7 November 2021 through 11 November 2021
ER -