
Energy and Policy Considerations for Deep Learning in NLP


May 17 2019 Emma Strubell

Are you interested in deep learning for NLP but also concerned about the CO2 footprint of training? You should be! Excited to share our work "Energy and Policy Considerations for Deep Learning in NLP" at @ACL2019_Italy! With @ananya__g and @andrewmccallum. Preprint coming soon. https://t.co/kIgZWcptRR
77 replies, 2551 likes


Oct 03 2019 Safiya Umoja Noble PhD

Training a single AI model can emit as much carbon as five cars in their lifetimes - via @techreview https://www.technologyreview.com/s/613630/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/
19 replies, 697 likes


Jan 12 2020 Eric Topol

Perhaps the least acknowledged downside of deep neural net #AI models: the carbon footprint But this key preprint is starting to get noticed https://arxiv.org/abs/1906.02243 by @strubell @andrewmccallum https://jamanetwork.com/journals/jama/fullarticle/2758612 @AndrewLBeam https://www.technologyreview.com/s/613630/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/ @techreview @_KarenHao https://t.co/j3dcGityEv
14 replies, 310 likes


Jan 12 2020 Isaac Kohane

Why we should admire @fastdotai & @jeremyphoward focus on computational efficiency (eg superconvergence) https://www.theverge.com/2018/5/7/17316010/fast-ai-speed-test-stanford-dawnbench-google-intel
2 replies, 161 likes


Nov 21 2019 Sylvain ❄️👨🏻‍🎓

Training a single machine-learning model can have over four times the footprint of a car over its entire lifetime 😶 https://arxiv.org/abs/1906.02243 https://t.co/kAdkhxITi7
14 replies, 119 likes


Jun 07 2019 MIT Technology Review

Training a single AI model can emit as much carbon as five cars in their lifetimes https://trib.al/LmEr6F7
1 replies, 96 likes


Jan 14 2020 Lior Pachter

We (w/@sinabooeshaghi @VeigaBeltrame) computed the carbon footprint of running Cell Ranger vs. kallisto bustools for scRNA-seq. Turns out for one dataset it's the difference between driving a car from LA to Mexico vs. driving a few blocks in Pasadena. https://twitter.com/EricTopol/status/1216187183402373122
1 replies, 62 likes


Jun 25 2019 Arthur Charpentier 🌻

"Training a single AI model can emit as much carbon as five cars in their lifetimes" https://www.technologyreview.com/s/613630/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/?utm_source=twitter&utm_medium=tr_social&utm_campaign=site_visitor.unpaid.engagement (I knew it was bad.... but that bad !?!) "Deep learning has a terrible carbon footprint" see https://arxiv.org/abs/1906.02243
2 replies, 38 likes


Jan 14 2020 Mark Robinson

Right, I guess we should add (low) carbon footprint as a criterion when benchmarking computational methods then ..
3 replies, 34 likes


Jun 25 2019 MIT Technology Review

Deep learning has a terrible carbon footprint. https://trib.al/7XL71bW
0 replies, 32 likes


Aug 07 2019 Shreya Shankar

https://arxiv.org/pdf/1906.02243.pdf https://t.co/sEyuYr55v2
2 replies, 29 likes


Jan 12 2020 Brandon Rohrer

Small is beautiful. Some machine learning tasks can only feasibly be done by enormous models. But the reflex to improve every model by making it bigger has hidden costs.
1 replies, 27 likes


Jan 12 2020 Pietro Michelucci

Remarkably, the fastest supercomputers can process information about as fast as a human brain. More remarkable, perhaps, is that a supercomputer consumes about 20 million watts, compared to the human brain, which consumes about 20. You have to hand it to nature for efficiency.
1 replies, 26 likes


Jun 12 2019 Jose Javier Garde

Energy and Policy Considerations for Deep Learning in #NLP by Emma Strubell, Ananya Ganesh, Andrew McCallum https://arxiv.org/abs/1906.02243 #deeplearning #ArtificialIntelligence #machinelearning #energy #climatechange #climate #policy #NeuralNetworks #algorithms https://t.co/j3KNLSOr20
0 replies, 25 likes


Jun 07 2019 Emma Strubell

@ACL2019_Italy @ananya__g @andrewmccallum preprint now available! https://arxiv.org/abs/1906.02243
3 replies, 23 likes


Jan 12 2020 Ryan Flynn

I think this gets much less attention than it should
0 replies, 17 likes


Jul 31 2019 Alice Coucke

Yes! This should be mandatory. Very interesting talk by @strubell at #acl2019nlp 🌱 👉https://arxiv.org/abs/1906.02243 https://t.co/OYpuluuglu
1 replies, 15 likes


Jun 13 2019 Xander Steenbrugge

"Energy and Policy Considerations for Deep Learning in NLP" They provide some very interesting statistics on the environmental impact of training large Deep Learning models with today's various Cloud Providers! Paper: https://arxiv.org/abs/1906.02243 Article: https://www.technologyreview.com/s/613630/training-a-single-ai-model-can-emit-as-much-carbon-as-five-cars-in-their-lifetimes/ https://t.co/ZsY1ClMa4W
0 replies, 14 likes


Jan 12 2020 Dr Mike Nitabach

But yeah sanctimoniously yelling at people for flying on airplanes is totally awesome. https://twitter.com/EricTopol/status/1216187183402373122
2 replies, 12 likes


Nov 17 2019 Joss Moorkens

@itia_ireland According to this paper, the emissions cost of training one system is equivalent to the lifetime emissions of 6 cars, incl. fuel. https://arxiv.org/abs/1906.02243
1 replies, 12 likes


Feb 10 2020 Anna Rogers ✈️ #AAAI2020

@strubell at @RealAAAI #AAAI2020 Energy and Policy Considerations for Deep Learning in NLP Paper: https://arxiv.org/pdf/1906.02243.pdf In case anybody needs a reminder, DL research is being environmentally irresponsible, with Google's Meena as the latest offender. @andrewmccallum
1 replies, 10 likes


Sep 18 2019 tante

(btw. here are CO2 estimates for training neural nets https://arxiv.org/pdf/1906.02243.pdf )
1 replies, 8 likes


Jul 30 2019 Denis Newman-Griffis

Fantastic turnout for @strubell's paper on energy consumption in NLP research @ACL2019_Italy (this is only half the room) Paper at https://arxiv.org/abs/1906.02243 #ACL2019 https://t.co/oUNsvl0wio
1 replies, 8 likes


Dec 10 2019 Global Pulse

#Bigdata and new technologies like machine learning and neural networks can be used responsibly for #climateaction. Thanks @UNFCCC for having us join your event during #COP25 in Madrid. Now is the #TimeForAction . https://t.co/EVhN0vLZBC
0 replies, 8 likes


Jun 09 2019 Julie Grollier

"It is estimated that we must cut carbon emissions by half over the next decade, and based on the estimated CO2 emissions listed in Table1, model training and development likely make up a substantial portion of the greenhouse gas emissions" https://arxiv.org/abs/1906.02243 https://t.co/NpNqadgPL6
0 replies, 8 likes


Jan 12 2020 Peter Bloem

While I don't disagree with researching more efficient models, note that this model also _costs_ as much as five cars to train. It's very rare to train such a big model, and it yields an artifact that is reused globally for years.
1 replies, 7 likes


Jun 07 2019 arXiv CS-CL

Energy and Policy Considerations for Deep Learning in NLP http://arxiv.org/abs/1906.02243
0 replies, 7 likes


Nov 21 2019 Steven Lowette

Food for thought. There's no free lunch.
0 replies, 6 likes


Jan 07 2020 Padraig Cunningham

Here is an estimate of the carbon footprint of Deep Learning; training a DL model could produce as much CO2 as 5 cars would in a lifetime - if all the model search / parameter tuning work is taken into account. https://arxiv.org/abs/1906.02243 (Paper presented at ACL2019)
0 replies, 5 likes


Dec 15 2019 Jordi Mas

The ecological cost of training machine learning models (in this case NLP BERT models): "We also see that models emit substantial carbon emissions; training BERT on GPU is roughly equivalent to a trans-American flight." (https://arxiv.org/pdf/1906.02243.pdf)
0 replies, 5 likes


Sep 20 2019 Jenny Brennan

Some reading: 🔹Energy and Policy Considerations for Deep Learning in NLP: https://arxiv.org/pdf/1906.02243.pdf 🔹The State of Data Center Energy Use in 2018 @coedethics: https://docs.google.com/document/d/1eCCb3rgqtQxcRwLdTr0P_hCK_drIZrm1Dpb4dlPeG6M/ 🔹Anatomy of an AI system @AINowInstitute: https://anatomyof.ai/ 3/?
1 replies, 5 likes


Jun 09 2019 Anish Mohammed

Energy and Policy Considerations for Deep Learning in NLP < wondering if this was by accident or design, yet another moat for incumbents against challengers #DeepLearning https://arxiv.org/abs/1906.02243
0 replies, 5 likes


Jan 12 2020 Fredros Okumu, PhD

Deep Learning models are costly to train and develop, both financially, due to the cost of hardware and electricity or cloud compute time, and environmentally, due to the carbon footprint required to fuel modern tensor processing hardware. #deeplearning https://arxiv.org/abs/1906.02243
1 replies, 5 likes


Nov 21 2019 Cait Lamberton

This is interesting. Do we gain enough to make the ecological damage worthwhile? Or is there something missing in this calculation? (I need to read the article to find out what kind of machine-learning model we're talking about; my bet is it's not a logistic regression. :-)
1 replies, 5 likes


Jan 12 2020 Jumanne Mtambalike

But is it a necessary cost? Or do we have an alternative?
0 replies, 5 likes


Jul 19 2019 Paul Bradshaw

If you're a data journalist exploring #AI/#ML/#NLP, prepare to feel guilty... https://arxiv.org/abs/1906.02243
0 replies, 4 likes


Oct 01 2019 Jean Senellart

@lorenlugosch @SanhEstPasMoi Yes it does. I used the CO₂e calculation model from https://arxiv.org/pdf/1906.02243.pdf. This estimate is based on energy production in USA with 36% non-carbon energy. For China, where the training has maybe (?) been run by #ShannonAI, the figure would be a bit higher.
0 replies, 4 likes
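The point above about regional power mixes follows directly from the paper's estimation method: the lbs-CO2e-per-kWh emission factor is the only grid-dependent term, so the same training job on a different grid just rescales the estimate linearly. A minimal sketch (the 1.4 lbs/kWh figure below is an illustrative round number for a more carbon-intensive grid, not an official statistic):

```python
# The paper converts energy to emissions with a fixed grid emission
# factor (0.954 lbs CO2e per kWh for the average U.S. power mix), so
# changing grids is a simple linear rescaling of the estimate.

US_AVG = 0.954  # lbs CO2e per kWh, the U.S.-average factor used in the paper

def rescale_co2e(co2e_lbs_us, local_intensity, us_intensity=US_AVG):
    """Rescale a US-average CO2e estimate to another grid's emission factor."""
    return co2e_lbs_us * local_intensity / us_intensity

# Hypothetical: a job estimated at 1000 lbs CO2e on the US mix,
# re-run on a grid with an assumed intensity of 1.4 lbs/kWh:
print(round(rescale_co2e(1000, 1.4), 1))  # → 1467.5
```

The same rescaling works in the other direction for low-carbon grids, which is why the choice of cloud region materially changes the footprint of an otherwise identical training run.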


Feb 16 2020 Anna Rogers

The #AcademicTwitter #GreenAI panel from other threads and papers: * @nlpnoah @royschwartz02 @JesseDodge @etzioni https://arxiv.org/abs/1907.10597 * @strubell @andrewmccallum https://arxiv.org/abs/1906.02243 * @alex_lacoste_ @vict0rsch https://arxiv.org/abs/1910.09700 * @bkbrd
1 replies, 3 likes


Jun 07 2019 Russell Neches

626,155 pounds of CO2 to train one model? Ouch. Well, I guess now you know why huge tech companies are building custom chips for machine learning. https://arxiv.org/abs/1906.02243
0 replies, 3 likes


Jun 08 2019 arXiv CS-CL

Energy and Policy Considerations for Deep Learning in NLP http://arxiv.org/abs/1906.02243
0 replies, 3 likes


Dec 14 2019 C. Gómez-Rodríguez

@yoavgo @natschluter @afalenska and @strubell et al.'s "Energy and Policy Considerations for Deep Learning in NLP", https://arxiv.org/abs/1906.02243
1 replies, 3 likes


Jan 12 2020 Sanjay Kamath | ಸಂಜಯ್ ಕಾಮತ್

Good paper but not every AI model is a transformer (big) model with neural architecture search. Hope the media doesn't put this out of context.
0 replies, 3 likes


Dec 17 2019 Wolfgang Schröder

#AI #Sustainability #GreenAI #ecology Research published earlier this year found that the training of a neural network can create a carbon dioxide footprint of 284 tonnes - around five times the lifetime emissions of a typical car. https://arxiv.org/abs/1906.02243
0 replies, 3 likes


Oct 06 2019 Erik Hamburger

Can #MachineLearning and #Sustainability go together? This study shows how energy intensive all this #ML and #AI is. https://arxiv.org/pdf/1906.02243.pdf
0 replies, 2 likes


Jan 12 2020 J. Chris Pires

cc #PAGXXVIII #PAG2020 #AI Machine Learning
0 replies, 2 likes


Aug 15 2019 Beril Sirmacek 🦋

From now on, the #artificialintelligence frameworks will not be judged by their #performance but by their #energy labels. #carbonfootprint #climatechange #co2 https://arxiv.org/abs/1906.02243 https://t.co/NYN5xqJnMq
1 replies, 2 likes


Nov 21 2019 Patrick Burr

#AI comes at an environmental cost. The largest bearer of its externalities is our planet. I can't vouch for the accuracy of these findings, but I have seen similar trends quite a few times now. Running loads of CPUs directly increases CO2 emissions and resource consumption.
1 replies, 2 likes


Jun 14 2019 Dr William Marshall

Paper: Energy and Policy Considerations for Deep Learning in NLP (Natural Language Processing). Training an #AI model uses a lot of electrical power and leaves a vast carbon footprint. #climatechange https://arxiv.org/pdf/1906.02243.pdf
0 replies, 2 likes


Jun 18 2019 Dave Costenaro

Interesting paper: "Energy and Policy Considerations for Deep Learning in NLP." Training 1 big NN model has the carbon footprint of 5 cars over their lifetimes. (https://arxiv.org/abs/1906.02243). Compute is cheap...but not free, so please give efficient code some thought!
0 replies, 2 likes


Jun 14 2019 Manyvip

A Deep Learning Process Can Emit 284,000 Kilograms of Carbon Dioxide (CO2). Download PDF. https://arxiv.org/abs/1906.02243 https://t.co/OGOT6UXFB4
0 replies, 2 likes


Aug 22 2019 Libby Hemphill, PhD

@danieljkelley Just starting the read, but the environmental impact of our models is definitely something we talk about on my team. @strubell had a great paper at #ACL2019 about neural models and their costs: https://arxiv.org/abs/1906.02243
0 replies, 1 likes


Jul 18 2019 Alasdair Allan

@swardley I do have some issues with the broader applicability of their analysis, but here's the link to the Strubell, Ganesh & McCallum (2019) paper with the life-cycle analysis of #MachineLearning training I talked about at the start of my #OSCON talk, https://arxiv.org/abs/1906.02243.
0 replies, 1 likes


Jul 24 2019 GetzlerChem

@Chemjobber See also the surprising and staggering cost of deep learning. (preprint, so caveat emptor, etc) https://arxiv.org/abs/1906.02243 https://t.co/ZxC7snogK9
0 replies, 1 likes


Aug 15 2019 Tim Heiler

The average deep learning model using fossil fuel releases around 78,000 pounds of carbon. That’s more than half of a car's output from assembly line to landfill. According to this paper: https://arxiv.org/abs/1906.02243 #AI #machinelearning #ClimateChange #ClimateEmergency https://t.co/27GQVlH1wK
0 replies, 1 likes


Jan 14 2020 Juan A. Botía

Next time you have some desire to do a blind search for your deep ANN model's best parameters, think twice!
0 replies, 1 likes


Jul 30 2019 Swadhin | স্বাধীন

@krismicinski A recent related paper focusing on NLP models and energy: https://arxiv.org/abs/1906.02243 . Emma discusses this in detail on the TWIMLAI podcast this week.
0 replies, 1 likes


Jan 12 2020 Richard Rathe, MD

Reminds me of the energy needed to support #cryptocurrencies and so-called "mining". Enough electricity to keep whole cities going for months/years!!
0 replies, 1 likes


Jun 11 2019 DOSE Engineering

Data centers and cloud computing providers need to up their use of renewable energy in order to meet the high energy demands of CPU/GPU/TPU by artificial intelligence/deep learning. #greenenergy #cloudcomputing #datacenters #AI https://arxiv.org/pdf/1906.02243.pdf https://t.co/8SssmMf7Hl
0 replies, 1 likes


Jun 08 2019 Jordan Foley

Really interesting work on the potential environmental implications of machine learning and AI. I've seen lots of important conversations about the ethical dimensions of these technologies, but few that center questions like these. https://arxiv.org/abs/1906.02243
0 replies, 1 likes


Jan 14 2020 Eduardo Eyras

Given the formula CO2e = 0.954 p_t, pretty much all current data science will end up burning the planet https://arxiv.org/abs/1906.02243
0 replies, 1 likes
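The formula quoted above is the paper's conversion step: total energy drawn over training, p_t (in kWh, including a 1.58 datacenter power-overhead factor), is multiplied by 0.954 lbs CO2e per kWh for the average U.S. power mix. A minimal sketch of that estimate (the hardware wattages and job duration below are illustrative assumptions, not measurements from the paper):

```python
# Sketch of the paper's estimate: CO2e = 0.954 * p_t, where p_t is the
# total energy drawn during training in kWh, scaled by PUE (power usage
# effectiveness) to account for datacenter overhead such as cooling.

PUE = 1.58           # average datacenter overhead factor used in the paper
CO2_PER_KWH = 0.954  # lbs CO2e per kWh, average U.S. power mix

def total_power_kwh(hours, cpu_w, dram_w, gpu_w, n_gpus, pue=PUE):
    """Total energy drawn over a training run, in kWh (the paper's p_t)."""
    return pue * hours * (cpu_w + dram_w + n_gpus * gpu_w) / 1000.0

def co2e_lbs(p_t_kwh):
    """Estimated emissions in lbs CO2e via the 0.954 * p_t formula."""
    return CO2_PER_KWH * p_t_kwh

# Hypothetical 8-GPU job running for 120 hours:
p_t = total_power_kwh(hours=120, cpu_w=100, dram_w=50, gpu_w=250, n_gpus=8)
print(round(co2e_lbs(p_t), 1))  # → 388.9
```

The striking headline numbers in the thread come from plugging in very long multi-GPU runs (and, for the largest figure, thousands of trials of architecture search) rather than from a single training job like the one sketched here.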


Jun 10 2019 Philippe Durance

Recent progress in training #neuralnetworks depends on the availability of exceptionally large computational resources that necessitate similarly substantial #energy consumption https://arxiv.org/abs/1906.02243 @Cornell
0 replies, 1 likes


Jun 07 2019 Andrés Murcia

The carbon footprint of Deep Learning - "Runs on energy-intensive computer chips, can emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car." - https://arxiv.org/abs/1906.02243 https://t.co/mR3lVt6tNp
1 replies, 1 likes


Jun 07 2019 Nate Jue

Colleague sent this article to me and it's making me think more and more about the environmental impacts of my computational work and associated ethical decisions. How many of us computational biologists even have this on our radar? I sure haven't.😑 https://arxiv.org/abs/1906.02243
0 replies, 1 likes


Dec 10 2019 Miguel Luengo-Oroz

Some of the refs I shared: review on AI applications for climate https://arxiv.org/pdf/1906.05433.pdf ; how AI models grow exponentially https://openai.com/blog/ai-and-compute/ ; CO2 footprint of an AI system https://arxiv.org/pdf/1906.02243.pdf ; sustainability as a principle of AI development https://rdcu.be/bUYS1
0 replies, 1 likes


Jun 07 2019 Ken Figueredo

#Green credentials and #MachineLearning https://arxiv.org/pdf/1906.02243.pdf https://t.co/3zJ3FxxZxi
0 replies, 1 likes


Jun 16 2019 World Ethical Data Forum

Here's the UMass paper, if the numbers interest you: https://arxiv.org/pdf/1906.02243.pdf Thanks to @InfoMgmtExec for pointing out the original post wasn't clear enough.
0 replies, 1 likes


Jun 10 2019 Charles Starrett

We know computation like this, not to mention #blockchain (*shudder*), is worsening our #climate crisis. Why aren't we pushing harder for data centers to have their own solar/wind farms? — Energy and Policy Considerations for Deep Learning in NLP https://arxiv.org/abs/1906.02243
0 replies, 1 likes


Jun 28 2019 Lancelot PECQUET

#environment - Training a single #AI model can emit as much #carbon as five cars in their lifetimes https://arxiv.org/pdf/1906.02243.pdf https://t.co/64HGDETeML
1 replies, 0 likes


Nov 04 2019 Emily Hopkins

@emiliagogu @IEEEorg 5. Diversity, non-discrimination, fairness accessibility, bias, competing interests & objectives #ismir2019 6. Societal and environmental well-being sustainability and benefit to future generations energy use of deep learning: https://arxiv.org/abs/1906.02243
1 replies, 0 likes


Jul 05 2019 Zachary Lipton

@VishnuBoddeti So far, I have yet to be convinced of neural architecture search as a research direction but my degree of certainty is not high. To date, NAS requires 1000s× more resources w/o qualitatively stronger results. See paper by @strubell on environmental impact—https://arxiv.org/abs/1906.02243
1 replies, 0 likes


Jun 12 2019 Nathaniel Bullard

First: an AI model doesn't run on coal or gas; it runs on electricity, and its carbon will be a direct result of the power mix used to energize the data centers it runs on. The paper https://arxiv.org/pdf/1906.02243.pdf gets that right...
1 replies, 0 likes

