Papers of the day

Transformers: State-of-the-art Natural Language Processing

Comments

Thomas Wolf: With 180+ papers mentioning šŸ¤— Transformers and its predecessors, it was high time to put out a real paper that people could cite. šŸ„³ šŸŽ‰ https://arxiv.org/abs/1910.03771 With @LysandreJik @SanhEstPasMoi @julien_c @ClementDelangue @moi_anthony @pierrci @remilouf @MorganFunto @jamieabrew https://t.co/oJT9lbLbyg

12 replies, 1147 likes


Sasha Rush: We wrote a longer version of the @huggingface šŸ¤— transformers paper (EMNLP demos). It goes through the library and model hub. A lot has happened in the last 9 months! Paper: https://arxiv.org/abs/1910.03771 Consider citing (not linking) in your next paper: https://github.com/huggingface/transformers#citation https://t.co/DmGpgycUzM

3 replies, 557 likes
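
The comment above mentions the library and the model hub; as a rough illustration of what that workflow looks like, here is a minimal sketch in Python (assuming a recent transformers release with PyTorch installed; the checkpoint name is just an illustrative public model, not one singled out by the paper):

    # Minimal usage sketch (assumes: pip install transformers torch, recent library version).
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    # Illustrative public checkpoint from the model hub; any hub identifier works the same way.
    checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

    # Tokenize one sentence and run a forward pass to get class logits.
    inputs = tokenizer("Transformers makes state-of-the-art NLP easy to use.", return_tensors="pt")
    logits = model(**inputs).logits
    print(logits.argmax(dim=-1))  # class index; 1 maps to "positive" for this sentiment checkpoint

The same from_pretrained pattern applies to any architecture hosted on the hub, which reflects the unified API the paper describes.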


Sebastian Raschka: @iamtrask @huggingface Transformers preprint :). Ok, it's more of a logo here, and not sure if it was published, but still ... ;) https://arxiv.org/abs/1910.03771 https://t.co/ZcSv4gxk1E

1 reply, 47 likes


arXiv CS-CL: Transformers: State-of-the-art Natural Language Processing http://arxiv.org/abs/1910.03771

0 replies, 23 likes


arXiv CS-CL: HuggingFace's Transformers: State-of-the-art Natural Language Processing http://arxiv.org/abs/1910.03771

0 replies, 14 likes


Caner Okan: Transformers: State-of-the-art Natural Language Processing. Direct to PDF: https://arxiv.org/pdf/1910.03771.pdf By @Cornell @Thom_Wolf @huggingface TY; @ceobillionaire

0 replies, 12 likes


Denis Alejandro: How do you like reading a paper with a huggingface in the title? https://arxiv.org/pdf/1910.03771.pdf #NLP #survey #transformer @huggingface https://t.co/YXafT5vaWN

0 replies, 8 likes


Julien Chaumond: https://twitter.com/Thom_Wolf/status/1182282216933597185

0 replies, 7 likes


Rajaswa Patil: Long awaited

0 replies, 7 likes


Joseph Sirosh: This is a great contribution to democratizing Transformer models.

0 replies, 6 likes


Amir Saffari: Excellent! Great paper!

0 replies, 5 likes


GudGud: A library for implementing Transformer models

0 replies, 4 likes


Sam Shleifer: go team!

0 replies, 4 likes


Evspƶke šŸ‘»: I have been waiting for a paper with an emoji in the title for years

2 replies, 3 likes


Sourav Mishra: The go-to reference for all recent Transformer-related research.

0 replies, 2 likes


Aleksander Molak: An updated version (Jul 14th, 2020) of the @huggingface šŸ¤— Transformers paper is out! āœØ

0 replies, 2 likes


Cristiano De Nobili: Really a good job! #DeepLearning #NLP #ArtificialIntelligence

0 replies, 1 like


arXiv in review: #NeurIPS2019 Transformers: State-of-the-art Natural Language Processing. (arXiv:1910.03771v1 [cs.CL]) http://arxiv.org/abs/1910.03771

0 replies, 1 like


Content

Found on Oct 10 2019 at https://arxiv.org/pdf/1910.03771.pdf

PDF content of a computer science paper: Transformers: State-of-the-art Natural Language Processing