Papers of the day

FlauBERT: Unsupervised Language Model Pre-training for French

Comments

Sebastian Ruder: Transfer learning is increasingly going multilingual with language-specific BERT models:
- 🇩🇪 German BERT https://deepset.ai/german-bert
- 🇫🇷 CamemBERT https://arxiv.org/abs/1911.03894, FlauBERT https://arxiv.org/abs/1912.05372
- 🇮🇹 AlBERTo http://ceur-ws.org/Vol-2481/paper57.pdf
- 🇳🇱 RobBERT https://arxiv.org/abs/2001.06286

19 replies, 613 likes


Jeremy Howard: TIL Google invented universal language model pre-training with BERT. https://t.co/ANyi8F2UP3

18 replies, 578 likes


laurent besacier: Our FlauBERT (French BERT) models have now been integrated into the official @huggingface library in the four configurations below! https://t.co/Z2mPoAP6EO

3 replies, 105 likes


Hang Le: Our FlauBERT is now natively supported by @huggingface's transformers library. Many thanks to @julien_c, @LysandreJik and the Hugging Face team for the active technical support! Paper (new version will be available soon): https://arxiv.org/abs/1912.05372 Code: https://github.com/getalp/Flaubert

0 replies, 94 likes
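The comments above note that FlauBERT ships natively in Hugging Face's transformers library in four configurations. A minimal sketch of loading one of them is below; the four model hub identifiers are assumed from the `flaubert` namespace on the hub, and the actual download is commented out since it fetches large weights.

```python
# Sketch: the four FlauBERT configurations on the Hugging Face model hub
# (identifiers assumed from the "flaubert" hub namespace).
FLAUBERT_MODELS = [
    "flaubert/flaubert_small_cased",
    "flaubert/flaubert_base_uncased",
    "flaubert/flaubert_base_cased",
    "flaubert/flaubert_large_cased",
]

# Loading a model and tokenizer (requires `pip install transformers`
# and downloads pretrained weights, so it is left commented here):
# from transformers import FlaubertModel, FlaubertTokenizer
# tokenizer = FlaubertTokenizer.from_pretrained(FLAUBERT_MODELS[2])
# model = FlaubertModel.from_pretrained(FLAUBERT_MODELS[2])
# outputs = model(**tokenizer("Bonjour le monde !", return_tensors="pt"))
```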


Hugging Face: You can now find most of them here: https://huggingface.co/models https://twitter.com/seb_ruder/status/1221851361811128321?s=20

0 replies, 53 likes


D. Khuê Lê-Huu: This reminds me of the FlauBERT paper: https://arxiv.org/abs/1912.05372 which gave proper credit to ULMFiT (by @jeremyphoward and @seb_ruder). https://t.co/fUYKuT8d9U

2 replies, 49 likes


laurent besacier: The (LREC) camera-ready paper on FlauBERT is now online: https://arxiv.org/abs/1912.05372 . Includes new results with FlauBERT_large. All models available on @huggingface transformers library. Benchmark NLP tasks (FLUE) provided on https://github.com/getalp/Flaubert

0 replies, 47 likes


laurent besacier: Here is FlauBERT: a French LM trained (on the #CNRS Jean Zay supercomputer) on a large and heterogeneous corpus. Along with it comes FLUE (an evaluation setup for French NLP). FlauBERT was successfully applied to complex tasks (NLI, WSD, Parsing). More on https://github.com/getalp/Flaubert

0 replies, 42 likes


Jay Alammar جهاد العمار: Somebody please make AraBERT happen!

3 replies, 13 likes


Hang Le: Our work on FlauBERT and FLUE (language models and an evaluation benchmark for French) was released today (Gustave Flaubert's 198th birthday). #Flaubert Paper: https://arxiv.org/abs/1912.05372 Code and models: https://github.com/getalp/Flaubert

1 reply, 11 likes


Dr Jochen L Leidner: French #NLP with the Transformer: From #BERT over CamemBERT to FlauBERT https://arxiv.org/pdf/1912.05372.pdf

0 replies, 8 likes


Machine Learning: FlauBERT: Unsupervised Language Model Pre-training for French. http://arxiv.org/abs/1912.05372

0 replies, 7 likes


Julien Velcin: Between CamemBERT (https://arxiv.org/pdf/1911.03894.pdf) and FlauBERT (https://arxiv.org/pdf/1912.05372.pdf), which one will win the race? Anyway thank you for working on French-oriented NLP resources, and well done for finding such interesting names! #nlp #deeplearning #bert

0 replies, 6 likes


Dominique Mariko: Comes with FLUE benchmark ! A GLUE for French!!

0 replies, 5 likes


Christopher: FlauBERT - Unsupervised Language Model Pre-training for French. The repo contains pre-trained large & small models, all the data used plus code for training & inference. It also contains FLUE, a GLUE like benchmark for French NLProc https://arxiv.org/abs/1912.05372 https://github.com/getalp/Flaubert

0 replies, 2 likes


Content

Found on Jan 27 2020 at https://arxiv.org/pdf/1912.05372.pdf
