Papers of the day

mT5: A massively multilingual pre-trained text-to-text transformer

Comments

Adam Roberts: We are releasing mT5: A massively-multilingual version of T5 that supports over 💯 languages! mT5 was pre-trained on a multilingual version of C4 and achieves SoTA on many cross-lingual NLP tasks. 📜Pre-print: https://arxiv.org/abs/2010.11934 💾Code/models: http://goo.gle/mt5 https://t.co/LUJIFaMxF5

4 replies, 511 likes


Aran Komatsuzaki: mT5: A massively multilingual pre-trained text-to-text transformer. Attains SoTA on various cross-lingual NLP tasks with a multilingual T5 pre-trained on a Common Crawl-based dataset covering 101 languages. https://arxiv.org/abs/2010.11934 https://t.co/NzEFBjlByL

3 replies, 41 likes


Abhishek Thakur: Multilingual T5 model from Google. Paper: https://arxiv.org/pdf/2010.11934.pdf Code/models: https://github.com/google-research/multilingual-t5

0 replies, 39 likes


Hugging Face: - Official paper: https://arxiv.org/pdf/2010.11934.pdf - Official Results: https://github.com/google-research/multilingual-t5#results - First fine-tuned model by @mrm8488: https://huggingface.co/mrm8488/mT5-small-finetuned-tydiqa-for-xqa

0 replies, 24 likes
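
For anyone wanting to try the released checkpoints, here is a minimal sketch of loading mT5 through the Hugging Face transformers library. The checkpoint names ("google/mt5-small" and mrm8488's fine-tuned TyDiQA model) are taken from the links above; the input text and generation settings are illustrative assumptions, not instructions from the paper.

# Minimal sketch: load an mT5 checkpoint with Hugging Face transformers.
# Assumes `pip install transformers sentencepiece`; checkpoint names are
# taken from the links above and may change on the Hub.
from transformers import AutoTokenizer, MT5ForConditionalGeneration

model_name = "google/mt5-small"  # pre-trained multilingual checkpoint from the mT5 release
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# Note: mT5 is released pre-trained only, so its raw generations are not
# meaningful until the model is fine-tuned on a downstream task. A fine-tuned
# variant such as "mrm8488/mT5-small-finetuned-tydiqa-for-xqa" (linked above)
# can be loaded the same way.
inputs = tokenizer("mT5 covers over 100 languages.", return_tensors="pt")
outputs = model.generate(**inputs, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))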


arXiv CS-CL: mT5: A massively multilingual pre-trained text-to-text transformer http://arxiv.org/abs/2010.11934

0 replies, 15 likes


AK: mT5: A massively multilingual pre-trained text-to-text transformer pdf: https://arxiv.org/pdf/2010.11934.pdf abs: https://arxiv.org/abs/2010.11934 https://t.co/3ncB6aM0RN

0 replies, 7 likes


Dawn Anderson: Jeez. Another big breakthrough: 'mT5'. Google's T5 (trained on C4, the Colossal Clean Crawled Corpus, a massive dataset of web pages derived from Common Crawl) is now multilingual. Right on the heels of Facebook's new multilingual model. Ridiculous developments happening https://arxiv.org/pdf/2010.11934.pdf

1 reply, 6 likes


Content

Found on Oct 23, 2020 at https://arxiv.org/pdf/2010.11934.pdf

PDF content of a computer science paper: mT5: A massively multilingual pre-trained text-to-text transformer