Papers of the day

On the Cross-lingual Transferability of Monolingual Representations

Comments

Mikel Artetxe: Check out our new paper "On the Cross-lingual Transferability of Monolingual Representations" (w/ @seb_ruder & @DaniYogatama) We challenge common beliefs of why mBERT works by showing that a monolingual BERT can also be transferred to new languages https://arxiv.org/abs/1910.11856 https://t.co/IVmLK0EroS

3 replies, 312 likes


Dani Yogatama: Our amazing intern @artetxem shows we can transfer a monolingual model to other languages by just learning lexical embeddings. These results contradict previous theories of the basis of cross-lingual zero-shot ability and suggest deep monolingual models learn some generalizable abstractions.

0 replies, 114 likes
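The recipe described in the tweet above can be sketched in a few lines. This is a hypothetical NumPy toy, not the authors' code: a random frozen matrix stands in for the pretrained transformer body, and gradient steps update only the new target-language embedding matrix, so all task learning happens in the lexical embeddings.

```python
import numpy as np

# Hypothetical toy sketch (not the paper's actual code) of the recipe above:
# keep the pretrained "body" frozen and learn only a new lexical embedding
# matrix for the target language.

rng = np.random.default_rng(0)
vocab_size, dim, n_classes = 10, 4, 2

embeddings = rng.normal(size=(vocab_size, dim))  # new embeddings (trainable)
body_W = rng.normal(size=(dim, n_classes))       # stand-in pretrained body (frozen)

def forward(token_id):
    """Embedding lookup, then the frozen 'body', then a softmax."""
    logits = embeddings[token_id] @ body_W
    e = np.exp(logits - logits.max())
    return e / e.sum()

def train_step(token_id, label, lr=0.5):
    """Cross-entropy gradient step that updates ONLY the embedding row."""
    probs = forward(token_id)
    grad_logits = probs.copy()
    grad_logits[label] -= 1.0              # d(cross-entropy)/d(logits)
    grad_emb = body_W @ grad_logits        # gradient w.r.t. the embedding
    embeddings[token_id] -= lr * grad_emb  # body_W is deliberately untouched

body_before = body_W.copy()
p_before = forward(3)[1]
for _ in range(200):
    train_step(token_id=3, label=1)

assert np.allclose(body_W, body_before)  # the frozen body never changed
assert forward(3)[1] > p_before          # the new embedding alone fit the task
```

In the actual paper the frozen part is a full transformer and the trainable part is the token embedding matrix of the new language, but the division of labor is the same as in this toy.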


Sebastian Ruder: - 🌍XQuAD (ours): https://arxiv.org/abs/1910.11856 TyDiQA: https://00e9e64bac5aa95ee51e3b623d56e6b23c2eff7c2307e35f3f-apidata.googleusercontent.com/download/storage/v1/b/tydiqa/o/tydiqa.pdf?qk=AD5uMEufo0ZN3puKAmTIje1bV04fdQCCo0nAM4vKSO4Z2xkstGhkFOaB0MKLnwyjuQXwWQ7NzN_tFRyak32E84M2ej0uZVSeZ4gADazSAoOmzwSKmiDvhkYt9rqI1iHa0JfYqvyMXGfO6fUnxKpAfzeP9U8m5Cjx_636HVUtZF9OahMVBM5hQPJ644iPnINcfdllZFxJblutESZMeUYDTPa6RDHLs_xJxTRVilx78eMywErY90_6ND37pEcaXrD-UxTkvW6b1HB3C2MmpbPuzfZoeZuaKb9bW75oPZdE7WUEsOwyUFOU9Esl8IcyBIXVjD8V6RUQFzdUiaHOF7z1_OByH2Yt_Ww5KAzbKB6aSdIUOOlQBM29zZk4UfS124nAohZGzmlI7VTzCAjUMkgkTQkELNUDDGy_XJmWOBBFNo03dyn2viqqcDBOFwZA45XS7TH7W7yT_2MnWxRVG1HN6wBWTK6VU02goYZSaZms3kbfWDDFPl1XA-LrLtmHetHvHjL08GOXOr0xbZt5Noo6QPYhoH0Ep5sqjubDTLK_A1UrxKoE7xFr2CJPxhmXKfAPhG-dTIuKxlV18DxCO3Vts48xE2lom8ESylKpR8-lZvTMpQz84XNJeUlu0tP55JEwnq3OLApaSb1UHjAUN7_9iblyeQfyvxExjk3JyVeZ_QfHfu5m0YQ9UrIIglXKw967NLxCH6E1O6ZtrxHMcnrl9IGwQqC1oeDJFv9pIuKOip_FUDbxG6l_zwQrbXI4CrZzRfV-OtnmhBoU3S42uzMxxS0fol3Uh4sRxg MLQA: https://arxiv.org/abs/1910.07475 Are there any other recent ones that I missed?

2 replies, 26 likes


Alexis Conneau: @ashkamath20 @seb_ruder @NYUDataScience I'm not sure about monolingual transfer performing as well as joint learning though. Results from https://arxiv.org/pdf/1910.11856.pdf are around 70% on XNLI, which is more than 5% average accuracy below the state of the art.

1 replies, 9 likes


Jason M Pittman: [R] re-training only the embedding matrix for a new language is good enough for transfer learning - new paper from DeepMind https://arxiv.org/abs/1910.11856 #MachineLearning

0 replies, 2 likes


HotComputerScience: Most popular computer science paper of the day: "On the Cross-lingual Transferability of Monolingual Representations" https://hotcomputerscience.com/paper/on-the-cross-lingual-transferability-of-monolingual-representations https://twitter.com/artetxem/status/1188788015969255425

0 replies, 1 likes


Sean Welleck: @seb_ruder https://arxiv.org/pdf/1910.11856.pdf https://t.co/KkmPV5HdP2

0 replies, 1 likes


Gianluca Fiorelli: This document explains how BERT can become multilingual starting from only one language: https://arxiv.org/abs/1910.11856 (h/t: @natzir9 )

1 replies, 0 likes


Thomas Scialom: 2/ In recent work (see https://arxiv.org/abs/1910.11856), @artetxem et al. suggested that deep monolingual models learn some abstractions that generalize across languages. Inspired by this paper, we explored whether these abstractions could generalize across modalities as well.

1 replies, 0 likes


Content

Found on Oct 28 2019 at https://arxiv.org/pdf/1910.11856.pdf

PDF content of a computer science paper: On the Cross-lingual Transferability of Monolingual Representations