
MultiFiT: Efficient Multi-lingual Language Model Fine-tuning


Oct 22 2019 Sebastian Ruder

Most of the world’s text is not in English. We are releasing MultiFiT to train and fine-tune language models efficiently in any language. Post: Paper: With @eisenjulian @PiotrCzapla Marcin Kardas @GuggerSylvain @jeremyphoward
12 replies, 1255 likes

Oct 29 2019 DataScienceNigeria

NLP to the next level with MultiFiT - novel methods for multilingual language model fine-tuning that outperform models trained on far more data. Kudos @seb_ruder @eisenjulian @PiotrCzapla @GuggerSylvain @jeremyphoward Paper: Code:
2 replies, 173 likes

Oct 23 2019 Suzana Ilić

Great effort! 👏👏👏
0 replies, 44 likes

Oct 23 2019 Jade Abbott

Very important work by some very wonderful people on fine-tuning models efficiently for any language! 🤩🤩🤩 Thank you for the work you all do in supporting Low-Resourced languages 🌍✨💪♥️
0 replies, 32 likes

Oct 23 2019 Dat Tran

Interestingly, they don’t build on BERT like many NLP models these days but instead use an efficient variant of the LSTM. Hence, it’s cheaper to pretrain and also results in a smaller model.
0 replies, 2 likes
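For context on that architectural point: the "efficient variant of LSTM" refers to the QRNN family, which replaces most of the LSTM's sequential matrix multiplies with parallelizable projections plus a cheap elementwise recurrence. Below is a minimal NumPy sketch of a single QRNN-style layer with filter width 1 and fo-pooling; the function name, shapes, and random weights are illustrative and not taken from the paper's code:

```python
import numpy as np

def qrnn_layer(x, Wz, Wf, Wo):
    """Sketch of a QRNN-style layer with fo-pooling, filter width 1.

    x:  (T, d_in)  input sequence
    Wz, Wf, Wo: (d_in, d_hid) candidate/forget/output projections
    Returns h: (T, d_hid) hidden states.
    """
    # All heavy matrix multiplies are done in parallel over time steps.
    z = np.tanh(x @ Wz)                 # candidate cell values
    f = 1.0 / (1.0 + np.exp(-(x @ Wf)))  # forget gates
    o = 1.0 / (1.0 + np.exp(-(x @ Wo)))  # output gates

    # The only sequential part is this cheap elementwise recurrence,
    # which is why pretraining is faster than with a full LSTM.
    T, d_hid = z.shape
    c = np.zeros(d_hid)
    h = np.empty_like(z)
    for t in range(T):
        c = f[t] * c + (1.0 - f[t]) * z[t]  # convex blend of old cell and candidate
        h[t] = o[t] * c
    return h
```

Because the cell state is always a convex combination of values in (-1, 1), the hidden states stay bounded, and the per-step work is elementwise only; the expensive projections are batched across the whole sequence.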

Oct 23 2019 Graciela Gonzalez-Hernandez, PhD

Could be useful for #SMM4H 2020 @UPennHLP
0 replies, 1 like

Nov 27 2019 Sebastian Ruder

@EdwardDixon3 Thanks for the pointer, Edward! Yes, I'm very much a fan of not going with the mainstream. Other recent approaches in this line: - @GaborMelis' Mogrifier LSTM (SOTA on PTB and WT-2): - Our MultiFiT (competitive with mBERT):
0 replies, 1 like

Oct 22 2019 Elias W. BA

😋😋😋 @MasakhaneMt @galsenai @baamtusarl
0 replies, 1 like

Oct 22 2019 Robert Munro

This is one of the most important technology releases of the year. Only 5% of daily conversations are in English but most AI only works in English or other privileged languages. Congrats to everyone involved!
1 reply, 1 like