mT5: A massively multilingual pre-trained text-to-text transformer
Abstract
mT5, a multilingual T5 model pre-trained on 101 languages, achieves state-of-the-art results on multilingual benchmarks; the paper also describes a technique to avoid accidental translation in zero-shot settings.
The recent "Text-to-Text Transfer Transformer" (T5) leveraged a unified text-to-text format and scale to attain state-of-the-art results on a wide variety of English-language NLP tasks. In this paper, we introduce mT5, a multilingual variant of T5 that was pre-trained on a new Common Crawl-based dataset covering 101 languages. We detail the design and modified training of mT5 and demonstrate its state-of-the-art performance on many multilingual benchmarks. We also describe a simple technique to prevent "accidental translation" in the zero-shot setting, where a generative model chooses to (partially) translate its prediction into the wrong language. All of the code and model checkpoints used in this work are publicly available.
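Since the abstract notes that the model checkpoints are publicly available, here is a minimal sketch of what the text-to-text format looks like in practice, assuming the Hugging Face `transformers` library and the public `google/mt5-small` checkpoint (this illustrates the interface, not a pipeline from the paper; the raw pre-trained model is trained only on unsupervised span corruption, so it should be fine-tuned before it will produce useful task output).

```python
# Minimal sketch: loading a released mT5 checkpoint and running it in the
# text-to-text format. Assumes `transformers` and `sentencepiece` are
# installed and that the `google/mt5-small` checkpoint is used.
from transformers import AutoTokenizer, MT5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

# Every task is cast as text in, text out: the input is a plain string
# (here a hypothetical summarization prompt) and the prediction is
# generated as a string, regardless of language.
inputs = tokenizer(
    "summarize: mT5 is a multilingual variant of T5 covering 101 languages.",
    return_tensors="pt",
)
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The same interface covers every downstream task: classification, question answering, and translation all reduce to generating a target string, which is what makes a single multilingual checkpoint reusable across benchmarks.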