
Energy and Policy Considerations for Deep Learning in NLP


May 17 2019 Emma Strubell

Are you interested in deep learning for NLP but also concerned about the CO2 footprint of training? You should be! Excited to share our work "Energy and Policy Considerations for Deep Learning in NLP" at @ACL2019_Italy! With @ananya__g and @andrewmccallum. Preprint coming soon.
77 replies, 2551 likes

Oct 03 2019 Safiya Umoja Noble PhD

Training a single AI model can emit as much carbon as five cars in their lifetimes - via @techreview
19 replies, 697 likes

Jan 12 2020 Eric Topol

Perhaps the least acknowledged downside of deep neural net #AI models: the carbon footprint But this key preprint is starting to get noticed by @strubell @andrewmccallum @AndrewLBeam @techreview @_KarenHao
12 replies, 277 likes

Jan 12 2020 Isaac Kohane

Why we should admire @fastdotai & @jeremyphoward focus on computational efficiency (eg superconvergence)
2 replies, 161 likes

Nov 21 2019 Sylvain ❄️👨🏻‍🎓

Training a single machine-learning model can have over four times the footprint of a car over its entire lifetime 😶
14 replies, 119 likes

Jun 07 2019 MIT Technology Review

Training a single AI model can emit as much carbon as five cars in their lifetimes
1 replies, 96 likes

Jan 14 2020 Lior Pachter

We (w/@sinabooeshaghi @VeigaBeltrame) computed the carbon footprint of running Cell Ranger vs. kallisto bustools for scRNA-seq. Turns out for one dataset it's the difference between driving a car from LA to Mexico vs. driving a few blocks in Pasadena.
1 replies, 62 likes

Jun 25 2019 Arthur Charpentier 🌻

"Training a single AI model can emit as much carbon as five cars in their lifetimes" (I knew it was bad.... but that bad !?!) "Deep learning has a terrible carbon footprint" see
2 replies, 38 likes

Jan 14 2020 Mark Robinson

Right, I guess we should add (low) carbon footprint as a criterion when benchmarking computational methods then ..
3 replies, 32 likes

Jun 25 2019 MIT Technology Review

Deep learning has a terrible carbon footprint.
0 replies, 32 likes

Aug 07 2019 Shreya Shankar
2 replies, 29 likes

Jan 12 2020 Brandon Rohrer

Small is beautiful. Some machine learning tasks can only feasibly be done by enormous models. But the reflex to improve every model by making it bigger has hidden costs.
1 replies, 27 likes

Jun 12 2019 Jose Javier Garde

Energy and Policy Considerations for Deep Learning in #NLP by Emma Strubell, Ananya Ganesh, Andrew McCallum #deeplearning #ArtificialIntelligence #machinelearning #energy #climatechange #climate #policy #NeuralNetworks #algorithms
0 replies, 25 likes

Jun 07 2019 Emma Strubell

@ACL2019_Italy @ananya__g @andrewmccallum preprint now available!
3 replies, 23 likes

Jan 12 2020 Pietro Michelucci

Remarkably, the fastest supercomputers can process information about as fast as a human brain. More remarkable, perhaps, is that a supercomputer consumes about 20 million watts, compared to the human brain, which consumes about 20. You have to hand it to nature for efficiency.
1 replies, 21 likes

Jan 12 2020 Ryan Flynn

I think this gets much less attention than it should
0 replies, 17 likes

Jul 31 2019 Alice Coucke

Yes! This should be mandatory. Very interesting talk by @strubell at #acl2019nlp 🌱 👉
1 replies, 15 likes

Jun 13 2019 Xander Steenbrugge

"Energy and Policy Considerations for Deep Learning in NLP" They provide some very interesting statistics on the environmental impact of training large Deep Learning models with today's various Cloud Providers! Paper: Article:
0 replies, 14 likes

Jan 12 2020 Dr Mike Nitabach

But yeah sanctimoniously yelling at people for flying on airplanes is totally awesome.
2 replies, 12 likes

Nov 17 2019 Joss Moorkens

@itia_ireland According to this paper, the emissions cost of training one system is equivalent to the lifetime emissions of 6 cars, including fuel.
1 replies, 12 likes

Jun 09 2019 Julie Grollier

"It is estimated that we must cut carbon emissions by half over the next decade, and based on the estimated CO2 emissions listed in Table1, model training and development likely make up a substantial portion of the greenhouse gas emissions"
0 replies, 8 likes

Jul 30 2019 Denis Newman-Griffis

Fantastic turnout for @strubell's paper on energy consumption in NLP research @ACL2019_Italy (this is only half the room) Paper at #ACL2019
1 replies, 8 likes

Dec 10 2019 Global Pulse

#Bigdata and new technologies like machine learning and neural networks, can be used responsibly for #climateaction. Thanks @UNFCCC for having us join your event during #COP25 in Madrid. Now is the #TimeForAction .
0 replies, 8 likes

Sep 18 2019 tante

(btw. here are CO2 estimates for training neural nets )
1 replies, 8 likes

Jan 12 2020 Peter Bloem

While I don't disagree with researching more efficient models, note that this model also _costs_ as much as five cars to train. It's very rare to train such a big model, and it yields an artifact that is reused globally for years.
1 replies, 7 likes

Jun 07 2019 arXiv CS-CL

Energy and Policy Considerations for Deep Learning in NLP
0 replies, 7 likes

Nov 21 2019 Steven Lowette

Food for thought. There's no free lunch.
0 replies, 6 likes

Jan 07 2020 Padraig Cunningham

Here is an estimate of the carbon footprint of Deep Learning; training a DL model could produce as much CO2 as 5 cars would in a lifetime - if all the model search / parameter tuning work is taken into account. (Paper presented at ACL2019)
0 replies, 5 likes

Jun 09 2019 Anish Mohammed

Energy and Policy Considerations for Deep Learning in NLP < wondering if this was by accident or design, yet another moat for incumbents against challengers #DeepLearning
0 replies, 5 likes

Jan 12 2020 Fredros Okumu, PhD

Deep Learning models are costly to train and develop, both financially, due to the cost of hardware and electricity or cloud compute time, and environmentally, due to the carbon footprint required to fuel modern tensor processing hardware. #deeplearning
1 replies, 5 likes

Jan 12 2020 Jumanne Mtambalike

But is it a necessary cost? Or do we have alternatives?
0 replies, 5 likes

Nov 21 2019 Cait Lamberton

This is interesting. Do we gain enough to make the ecological damage worthwhile? Or is there something missing in this calculation? (I need to read the article to find out what kind of machine-learning model we're talking about; my bet is it's not a logistic regression. :-)
1 replies, 5 likes

Sep 20 2019 Jenny Brennan

Some reading: 🔹Energy and Policy Considerations for Deep Learning in NLP: 🔹The State of Data Center Energy Use in 2018 @coedethics: 🔹Anatomy of an AI system @AINowInstitute: 3/?
1 replies, 5 likes

Dec 15 2019 Jordi Mas

The ecological cost of training machine learning models (in this case NLP BERT models): "We also see that models emit substantial carbon emissions; training BERT on GPU is roughly equivalent to a trans-American flight." (
0 replies, 5 likes

Jul 19 2019 Paul Bradshaw

If you're a data journalist exploring #AI/#ML/#NLP, prepare to feel guilty...
0 replies, 4 likes

Oct 01 2019 Jean Senellart

@lorenlugosch @SanhEstPasMoi Yes it does. I used the CO₂e calculation model from This estimate is based on energy production in USA with 36% non-carbon energy. For China, where the training has maybe (?) been run by #ShannonAI, the figure would be a bit higher.
0 replies, 4 likes

Dec 14 2019 C. Gómez-Rodríguez

@yoavgo @natschluter @afalenska and @strubell et al.'s "Energy and Policy Considerations for Deep Learning in NLP",
1 replies, 3 likes

Jun 07 2019 Russell Neches

626,155 pounds of CO2 to train one model? Ouch. Well, I guess now you know why huge tech companies are building custom chips for machine learning.
0 replies, 3 likes
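The "five cars" headline quoted throughout this thread is simple arithmetic from the paper's own numbers: 626,155 lbs CO2e for training a Transformer with neural architecture search, against roughly 126,000 lbs for an average American car's lifetime including fuel (both figures from the paper's Table 1). A minimal sketch of the ratio:

```python
# Ratio behind the widely quoted "five cars" headline.
# Both figures are taken from Strubell et al. (2019), Table 1.
TRANSFORMER_NAS_LBS = 626_155  # CO2e: Transformer + neural architecture search
CAR_LIFETIME_LBS = 126_000     # avg American car, lifetime incl. fuel

ratio = TRANSFORMER_NAS_LBS / CAR_LIFETIME_LBS
print(f"{ratio:.2f} car lifetimes")  # prints "4.97 car lifetimes"
```

So "nearly five times" is the accurate phrasing; tweets rounding to "over four" or "five" are both within a whisker of the same number.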

Jun 08 2019 arXiv CS-CL

Energy and Policy Considerations for Deep Learning in NLP
0 replies, 3 likes

Jan 12 2020 Sanjay Kamath | ಸಂಜಯ್ ಕಾಮತ್

Good paper but not every AI model is a transformer (big) model with neural architecture search. Hope the media doesn't put this out of context.
0 replies, 3 likes

Dec 17 2019 Wolfgang Schröder

#AI #Sustainability #GreenAI #ecology Research published earlier this year found that the training of a neural network creates a carbon dioxide footprint of 284 tonnes - the equivalent of five times the lifetime emissions of a typical car.
0 replies, 3 likes

Oct 06 2019 Erik Hamburger

Can #MachineLearning and #Sustainability go together? This study shows how energy intensive all this #ML and #AI is.
0 replies, 2 likes

Jun 14 2019 Dr William Marshall

Paper: Energy and Policy Considerations for Deep Learning in NLP (Natural Language Processing). Training an #AI model uses a lot of electrical power and leaves a vast carbon footprint. #climatechange
0 replies, 2 likes

Jun 18 2019 Dave Costenaro

Interesting paper: "Energy and Policy Considerations for Deep Learning in NLP." Training 1 big NN model has the carbon footprint of 5 cars over their lifetimes. ( Compute is cheap...but not free, so please give efficient code some thought!
0 replies, 2 likes

Jun 14 2019 Manyvip

A Deep Learning Process Can Emit 284,000 Kilograms of Carbon Dioxide (CO2). Download PDF.
0 replies, 2 likes
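The kilogram figure in this tweet is the same quantity as the 626,155 lbs cited elsewhere in the thread, just in metric units. A quick sanity check of the conversion (using the exact lb-to-kg factor):

```python
# Convert the paper's 626,155 lbs CO2e figure to kilograms.
LB_TO_KG = 0.45359237  # exact definition of the pound in kg

kg = 626_155 * LB_TO_KG
print(round(kg))  # ≈ 284,000 kg, i.e. roughly 284 metric tons
```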

Nov 21 2019 Patrick Burr

#AI comes at an environmental cost. The largest bearer of its externalities is our planet. I can't vouch for the accuracy of these findings, but I have seen similar trends quite a few times now. Running loads of CPUs directly increases CO2 emissions and resource consumption.
1 replies, 2 likes

Aug 15 2019 Beril Sirmacek 🦋

From now on, the #artificialintelligence frameworks will not be judged by their #performance but by their #energy labels. #carbonfootprint #climatechange #co2
1 replies, 2 likes

Jan 12 2020 J. Chris Pires

cc #PAGXXVIII #PAG2020 #AI Machine Learning
0 replies, 2 likes

Jan 14 2020 Juan A. Botía

Next time you feel the urge to do a blind search for your deep ANN model's best parameters, think twice!
0 replies, 1 likes

Jun 11 2019 DOSE Engineering

Data centers and cloud computing providers need to up their use of renewable energy in order to meet the high energy demands of CPU/GPU/TPU by artificial intelligence/deep learning. #greenenergy #cloudcomputing #datacenters #AI
0 replies, 1 likes

Jun 07 2019 Ken Figueredo

#Green credentials and #MachineLearning
0 replies, 1 likes

Jul 30 2019 Swadhin | স্বাধীন

@krismicinski A recent related paper focusing on NLP models and energy: . Emma discusses this in detail on the TWIMLAI podcast this week.
0 replies, 1 likes

Aug 22 2019 Libby Hemphill, PhD

@danieljkelley Just starting the read, but the environmental impact of our models is definitely something we talk about on my team. @strubell had a great paper at #ACL2019 about neural models and their costs:
0 replies, 1 likes

Jun 07 2019 Nate Jue

Colleague sent this article to me and it's making me think more and more about the environmental impacts of my computational work and associated ethical decisions. How many of us computational biologists even have this on our radar? I sure haven't.😑
0 replies, 1 likes

Jul 24 2019 GetzlerChem

@Chemjobber See also the surprising and staggering cost of deep learning. (preprint, so caveat emptor, etc)
0 replies, 1 likes

Dec 10 2019 Miguel Luengo-Oroz

Some of the refs I shared: review on AI applications for climate ; how AI models grow exponentially ; CO2 footprint of an AI system ; sustainability as a principle of AI development
0 replies, 1 likes

Jan 12 2020 Richard Rathe, MD

Reminds me of the energy needed to support #cryptocurrencies and so-called "mining". Enough electricity to keep whole cities going for months/years!!
0 replies, 1 likes

Jun 08 2019 Jordan Foley

Really interesting work on the potential environmental implications of machine learning and AI. Ive seen lots of important convos about the ethical dimensions of these technologies but few that center questions like these.
0 replies, 1 likes

Jun 07 2019 Andrés Murcia

The carbon footprint of Deep Learning - "Runs on energy-intensive computer chips, can emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car." -
1 replies, 1 likes

Jan 14 2020 Eduardo Eyras

Given the formula CO2e = 0.954pt, pretty much all current data science will end up burning the planet
0 replies, 1 likes
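The formula quoted here is the paper's U.S.-average grid conversion: CO2e in pounds equals 0.954 times pt, the total power drawn in kWh (the EPA's average emissions per kWh produced in the U.S.). A minimal sketch of that conversion, with a hypothetical 1,000 kWh training run as input:

```python
# Strubell et al.'s U.S.-average conversion factor: lbs CO2e per kWh consumed.
EPA_LBS_CO2E_PER_KWH = 0.954

def co2e_lbs(total_power_kwh: float) -> float:
    """CO2e (lbs) for a training run that drew total_power_kwh from the grid."""
    return EPA_LBS_CO2E_PER_KWH * total_power_kwh

# Hypothetical example: a run consuming 1,000 kWh
print(co2e_lbs(1_000))  # ~954 lbs CO2e
```

As the Jean Senellart tweet above notes, the 0.954 factor bakes in the U.S. energy mix (~36% non-carbon sources); a grid with more coal raises the figure, a cleaner one lowers it.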

Jun 10 2019 Philippe Durance

Recent progress in training #neuralnetworks depends on the availability of exceptionally large computational resources that necessitate similarly substantial #energy consumption @Cornell
0 replies, 1 likes

Aug 15 2019 Tim Heiler

The average deep learning model using fossil fuel releases around 78,000 pounds of carbon. That’s more than half of a car's output from assembly line to landfill. According to this paper: #AI #machinelearning #ClimateChange #ClimateEmergency
0 replies, 1 likes

Jun 10 2019 Charles Starrett

We know computation like this, not to mention #blockchain (*shudder*), is worsening our #climate crisis. Why aren't we pushing harder for data centers to have their own solar/wind farms? — Energy and Policy Considerations for Deep Learning in NLP
0 replies, 1 likes

Jul 18 2019 Alasdair Allan

@swardley I do have some issues with the broader applicability of their analysis, but here's the link to the Strubell, Ganesh & McCallum (2019) paper with the life-cycle analysis of #MachineLearning training I talked about at the start of my #OSCON talk,
0 replies, 1 likes

Jun 16 2019 World Ethical Data Forum

Here's the UMass paper, if the numbers interest you: Thanks to @InfoMgmtExec for pointing out the original post wasn't clear enough.
0 replies, 1 likes

Jun 12 2019 Nathaniel Bullard

First: an AI model doesn't run on coal or gas; it runs on electricity, and its carbon footprint will be a direct result of the power mix used to energize the data centers it runs on. The paper gets that right...
1 replies, 0 likes

Nov 04 2019 Emily Hopkins

@emiliagogu @IEEEorg 5. Diversity, non-discrimination, fairness accessibility, bias, competing interests & objectives #ismir2019 6. Societal and environmental well-being sustainability and benefit to future generations energy use of deep learning:
1 replies, 0 likes

Jun 28 2019 Lancelot PECQUET

#environment - Training a single #AI model can emit as much #carbon as five cars in their lifetimes
1 replies, 0 likes

Jul 05 2019 Zachary Lipton

@VishnuBoddeti So far, I have yet to be convinced of neural architecture search as a research direction but my degree of certainty is not high. To date, NAS requires 1000s× more resources w/o qualitatively stronger results. See paper by @strubell on environmental impact—
1 replies, 0 likes