
ALBERT: A LITE BERT FOR SELF-SUPERVISED LEARNING OF LANGUAGE REPRESENTATIONS

Comments

Dec 11 2019 Hugging Face

So, we read ALBERT. @remilouf took some notes for you 👇 Paper: https://arxiv.org/abs/1909.11942 Also in 🤗 transformers: https://github.com/huggingface/transformers https://t.co/LLaNema3mc
6 replies, 429 likes
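
Since this tweet notes that ALBERT is available in 🤗 transformers, here is a minimal usage sketch. It assumes a recent `transformers` release and the `albert-base-v2` checkpoint name; exact output types vary across library versions, so treat it as illustrative rather than the library's canonical example.

```python
# Minimal sketch: load a pretrained ALBERT and encode one sentence.
# Assumes `pip install transformers torch` and the `albert-base-v2` checkpoint.
import torch
from transformers import AlbertTokenizer, AlbertModel

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
model = AlbertModel.from_pretrained("albert-base-v2")

inputs = tokenizer("ALBERT shares parameters across layers.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token representations: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```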


Jan 07 2020 Kirk Borne

Google Open-Sources ALBERT Natural Language Model: https://www.infoq.com/news/2020/01/google-albert-ai-nlp/ —————— #NLProc #NLU #NLG #AI #MachineLearning #DeepLearning #TensorFlow #Algorithms #BigData #DataScience —————— Research paper: https://arxiv.org/abs/1909.11942 https://t.co/9F7SzKEOOY
1 reply, 33 likes


Sep 28 2019 Miles Brundage

P.S. See also the ALBERT paper, which shows stronger results on these metrics for a similarly sized model, using a different approach (using parameters better to begin with vs. compressing a big model later): https://arxiv.org/abs/1909.11942
0 replies, 32 likes
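
To make the "similarly sized model" comparison concrete, the rough parameter counts below are approximate figures as reported in the ALBERT paper; treat them as ballpark numbers, not exact values.

```python
# Approximate parameter counts (in millions), as reported in the ALBERT paper.
# ALBERT gets its small footprint by design (parameter sharing + factorized
# embeddings), not by compressing an already-trained large model.
params_millions = {
    "BERT-base":    108,  # 12 layers, hidden size 768
    "BERT-large":   334,  # 24 layers, hidden size 1024
    "ALBERT-base":   12,  # 12 layers, hidden size 768, shared weights
    "ALBERT-large":  18,  # 24 layers, hidden size 1024, shared weights
}
for name, m in params_millions.items():
    print(f"{name:13s} ~{m}M parameters")
```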


Oct 31 2019 arXiv CS-CL

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations http://arxiv.org/abs/1909.11942
0 replies, 11 likes


Sep 28 2019 arXiv CS-CL

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations http://arxiv.org/abs/1909.11942
0 replies, 10 likes


Oct 16 2019 Leonid Boytsov

[thread] Another very interesting efficiency paper is the one that introduces ALBERT: https://arxiv.org/abs/1909.11942 1. Factorized embedding matrix. Very clever idea that permits untying embedding and hidden layer sizes. 2. Cross-layer parameter sharing (e.g., attention parameters)
1 reply, 9 likes
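
The two ideas in the thread above can be sketched in a few lines of PyTorch. This is an illustrative toy, not the reference ALBERT implementation (which also changes the pretraining objective); layer internals such as layer-norm placement and activations are simplified.

```python
# Toy sketch of ALBERT's two efficiency ideas, assuming PyTorch >= 1.9.
import torch
import torch.nn as nn

vocab_size, embed_dim, hidden_dim, num_layers = 30000, 128, 768, 12

# 1. Factorized embedding parameterization: a V x E lookup table followed by
#    an E x H projection, untying the embedding size E from the hidden size H.
#    Untied V x H table: 30000 * 768             = 23.0M parameters
#    Factorized:         30000 * 128 + 128 * 768 ≈  3.9M parameters
word_embeddings = nn.Embedding(vocab_size, embed_dim)
embed_to_hidden = nn.Linear(embed_dim, hidden_dim)

# 2. Cross-layer parameter sharing: one encoder layer's weights (attention and
#    feed-forward) reused at every depth instead of num_layers separate copies.
shared_layer = nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=12,
                                          batch_first=True)

def encode(token_ids: torch.Tensor) -> torch.Tensor:
    h = embed_to_hidden(word_embeddings(token_ids))
    for _ in range(num_layers):  # same weights applied at every layer
        h = shared_layer(h)
    return h

tokens = torch.randint(0, vocab_size, (2, 16))  # dummy (batch, seq_len) ids
print(encode(tokens).shape)                     # torch.Size([2, 16, 768])
```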


Dec 25 2019 BioDecoded

ALBERT: A Lite BERT for Self-Supervised Learning of Language Representations | Google AI Blog https://ai.googleblog.com/2019/12/albert-lite-bert-for-self-supervised.html https://arxiv.org/abs/1909.11942 #NLP #DeepLearning https://t.co/hWC0o3E4Ki
0 replies, 8 likes


Oct 11 2019 Husein Zolkepli

Hi everyone! Malaya released ALBERT-Base for Bahasa Malaysia / Manglish / Rojak / Bahasa Indonesia. Original paper for ALBERT: https://arxiv.org/abs/1909.11942 Read more about ALBERT-Bahasa and how to start using it here: https://github.com/huseinzol05/Malaya/tree/3.0/pretrained-model/albert
0 replies, 7 likes


Dec 12 2019 Santosh ML

This is one of the best ML summaries I have ever seen
0 replies, 7 likes


Sep 27 2019 arXiv CS-CL

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations http://arxiv.org/abs/1909.11942
0 replies, 5 likes


Oct 05 2019 AUEB NLP Group

Next AUEB NLP Group meeting, Tue Oct 8, 17:15-19:00, *IPLab* (http://nlp.cs.aueb.gr/contact.html): Discussion of RoBERTa (https://arxiv.org/abs/1907.11692) and ALBERT (https://arxiv.org/abs/1909.11942). Coordinator: Ilias Chalkidis @KiddoThe2B. Study the papers before the meeting. All welcome.
0 replies, 4 likes


Sep 27 2019 arXiv CS-CL

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations http://arxiv.org/abs/1909.11942
0 replies, 3 likes


Oct 13 2019 AUEB NLP Group

Next AUEB NLP Group meeting, Tue Oct 15, 17:15-19:00, *IPLab* (http://nlp.cs.aueb.gr/contact.html): Part II of discussion of RoBERTa (https://arxiv.org/abs/1907.11692) and ALBERT (https://arxiv.org/abs/1909.11942). Coordinator: Ilias Chalkidis. Study the papers before the meeting. All welcome.
0 replies, 2 likes


Feb 11 2020 arXiv CS-CL

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations http://arxiv.org/abs/1909.11942
0 replies, 2 likes


Oct 16 2019 cs.CL Papers

https://ift.tt/31jxpJI Pruning a BERT-based Question Answering Model. (arXiv:1910.06360v1 [cs.CL]) #NLProc
0 replies, 2 likes


Oct 31 2019 arXiv CS-CL

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations http://arxiv.org/abs/1909.11942
0 replies, 1 like


Oct 06 2019 Guenter Bartsch

Looks like both #PyTorch and #TensorFlow implementations of ALBERT https://arxiv.org/pdf/1909.11942.pdf have been open-sourced: https://github.com/lonePatient/albert_pytorch https://github.com/brightmart/albert_zh
0 replies, 1 like


Jan 08 2020 Christopher Ackerman

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations https://arxiv.org/pdf/1909.11942.pdf Google open-sourced A Lite Bert (ALBERT), a deep-learning natural language processing (NLP) model https://www.infoq.com/news/2020/01/google-albert-ai-nlp/ https://devopedia.org/bert-language-model https://t.co/vOsRGbSB9t
0 replies, 1 like


Jan 14 2020 arXiv CS-CL

ALBERT: A Lite BERT for Self-supervised Learning of Language Representations http://arxiv.org/abs/1909.11942
0 replies, 1 like

