Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, Colin Raffel, Noam Shazeer, Adam Roberts, Katherine Lee, Sharan Narang, Michael Matena, Yanqi Zhou, Wei Li, Peter J. Liu, 2020, Journal of Machine Learning Research, Vol. 21 - Introduces the T5 model, a text-to-text transformer that performs a wide range of NLP tasks, including paraphrasing, by framing each task as text generation.
Hugging Face Transformers Library Documentation, Hugging Face, 2024 - Official documentation for the transformers library; essential for practical work with pre-trained models and pipelines for text generation and paraphrasing.
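To connect the two references, the sketch below shows how a T5-style checkpoint can be driven through the transformers text2text-generation pipeline to paraphrase a sentence. It is a minimal illustration, not code from either source: the "paraphrase:" prefix is an assumption (the stock t5-base checkpoint was not pre-trained on a paraphrasing prefix, so in practice a checkpoint fine-tuned on a paraphrase corpus would be substituted for the model name).

```python
from transformers import pipeline

# Load a text-to-text pipeline. "t5-base" is the public checkpoint from
# Raffel et al. (2020); for real paraphrasing quality, swap in a model
# fine-tuned on paraphrase data (this choice is illustrative).
paraphraser = pipeline("text2text-generation", model="t5-base")

# T5 frames every task as conditional text generation; the "paraphrase:"
# task prefix here is a hypothetical convention, not one of T5's
# original pre-training prefixes.
outputs = paraphraser(
    "paraphrase: The quick brown fox jumps over the lazy dog.",
    max_length=64,
    do_sample=True,  # sampling gives more varied rewrites than greedy decoding
    num_return_sequences=1,
)
print(outputs[0]["generated_text"])
```

The pipeline abstraction hides tokenization, model invocation, and decoding behind one call, which is why the documentation above is the natural companion to the T5 paper for implementation work.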