Your AI’s Carbon Footprint
TL;DR Let’s talk about semantha’s environmental impact. Artificial Intelligence (AI) is often seen as a possible solution to environmental problems – for example in transportation. Yet the cost of the AI itself is often neglected: AI comes with a significant demand for energy, and recent NLP models in particular require large amounts of energy during training, but also in production. We show that AI can indeed have a beneficial impact on an organization’s environmental footprint – though in areas you might not have imagined. Or did you think you could save the equivalent of 150 round-trips from Germany to the US just by applying AI to your document-related processes? No? Well, we’ll give you our clients’ numbers …
The Ecological AI
Yes, AI has been hyped for a few years now – just like the transportation revolution[1] and energy efficiency: The EU announced the European Green Deal[2] and wants to pave Europe’s way towards a climate-neutral future. Consulting firms[3] advise their clients on climate neutrality and insurance companies put their focus on sustainability[4]. But what does this look like in reality? This is not only a question we are pondering, but also a topic for politicians[5], university chairs[6] and the academic community[7] – you can even tune in to your favorite podcast for a discussion of the subject[8].
semantha – just like other AI systems – requires energy for training and for production. However, if you look at the CO2 impact of an AI solution, you have to consider both the total energy required by the target processes (those to be replaced or supported) and the energy required by the AI. In our context that often boils down to a manual process that has been digitized. Our customers’ employees are already supported in their work by computers, but usually they only perform a manual task using the computer (for example, they fiddle with spreadsheets and search for content in PDFs). In addition, we built semantha in such a way that we do not have to train her from scratch for every new customer or use case. In the simplest case, we can deploy semantha based solely on her lab-based training – no further training is required before a customer can use her. When we familiarize semantha with the technical jargon of the application domain, we usually do not train her language module from scratch, but start with the existing models and refine them. Needless to say, this improves semantha’s overall energy efficiency compared to a “classical machine learning” system that has to be trained anew for each and every new customer and/or use case.
Numerical Modeling
To put some flesh on our claims, we set up a numerical model that we would like to share with you. Of course, it does not cover all aspects, but it gives you a good idea of semantha’s energy consumption. Since the basic training is only necessary once, we ignore it at this point and do not include it in the calculation. If you’d like to consider it, simply add 0.75 kg of CO2 for training from scratch and half a kilo for the refinement, respectively[9].
To determine the CO2 footprint, we assume that every employee uses a PC (110 W) with a monitor (60 W) when doing his or her job and that the CPU utilization is at about 25% – a GPU is not required. We also assume a power usage effectiveness (PUE) of 1.58[10]. On the other side, we have semantha, which runs on a single server system for all employees. It requires 165 W but does not need a monitor; here we assume a utilization of 100%. semantha also does not need a GPU for operation – we only need one for training and for adjusting the models to domain jargon.
To calculate the CO2 footprint from the energy consumption (and the savings) in kWh, we take a look at the energy mix – you have probably already seen this on your private power bill (at least in Germany it is mandatory to provide customers with a breakdown of the energy sources). The German Federal Environment Agency estimates that 0.434 kg of CO2 was emitted per kWh of electricity produced in 2019[11]. The numbers for the US range from 0.432 kg/kWh (as in the paper by Strubell et al.) to 0.707 kg/kWh (as in the official EPA figures for 2018[12]).
This is everything we need to estimate the CO2 emissions of our processes. Let’s assume that a company has 100 employees who each perform the task 30 times a year, each time requiring one week (i.e. 40 hours, including research etc.). This gives us 100 x 30 x 40h x (25% x 110W + 60W) x 1.58 = 16,590 kWh. Based on Germany’s energy mix, this results in 7.20 metric tons of CO2.
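The arithmetic for the manual process can be sketched in a few lines of Python; all figures are the assumptions stated above:

```python
# Energy for the fully manual process:
# 100 employees, 30 tasks/year, 40 h per task,
# PC at 25% of 110 W plus a 60 W monitor, PUE of 1.58.
employees = 100
tasks_per_year = 30
hours_per_task = 40
workplace_w = 0.25 * 110 + 60    # average draw per workplace in W
pue = 1.58

energy_kwh = employees * tasks_per_year * hours_per_task * workplace_w * pue / 1000
co2_tons = energy_kwh * 0.434 / 1000   # German energy mix: 0.434 kg CO2 per kWh

print(f"{energy_kwh:,.0f} kWh -> {co2_tons:.2f} t CO2")  # 16,590 kWh -> 7.20 t CO2
```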
We compare this value with the impact of a process supported by semantha: The basic calculation is the same. However, we assume that semantha spends 2 minutes of computing time per task (using 100% of the CPU) and that the employees – with her help – need only 4 hours instead of 40 to complete one task (i.e. only 1,659 kWh). semantha herself consumes 100 x 30 x (2/60)h x (100% x 165W) x 1.58 = 26.07 kWh, i.e. 11.31 kg of CO2. So we would have saved 14,905 kWh and thus (again based on the German energy mix) 6.469 metric tons of CO2.
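The semantha-supported side of the comparison works out like this (again in Python, with the same assumptions as above):

```python
# semantha-supported process: humans need 4 h instead of 40 h per task,
# plus 2 minutes of server time (165 W at 100% load, same PUE of 1.58).
employees, tasks = 100, 30
workplace_w = 0.25 * 110 + 60
pue = 1.58

human_kwh = employees * tasks * 4 * workplace_w * pue / 1000
server_kwh = employees * tasks * (2 / 60) * 165 * pue / 1000
manual_kwh = employees * tasks * 40 * workplace_w * pue / 1000
saved_kwh = manual_kwh - human_kwh - server_kwh

print(f"semantha: {server_kwh:.2f} kWh, humans: {human_kwh:,.0f} kWh")
print(f"saved: {saved_kwh:,.0f} kWh = {saved_kwh * 0.434 / 1000:.3f} t CO2")
# saved: 14,905 kWh = 6.469 t CO2
```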
To put all this into context, here are some CO2 values for comparison[13]:
| | CO2 [metric tons] | Source |
|---|---|---|
| Per capita and year, 2021, Germany | 8.09 | Statista |
| Per capita and year, 2021, USA | 14.86 | Statista |
| Per capita and year, 2021, India | 1.93 | Statista |
| Round-trip flight, Frankfurt (FRA) → New York (JFK) in an Airbus A380-800 jet, per passenger | 3.226 | Atmosfair |
| 12,000 km driven in Germany with a mid-range car | 2 | |
Using the calculator at the bottom of the page, you can also run the calculation with other assumptions, for example with more employees, a smaller time saving, more semantha computing time per task, and so on.
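If you prefer to play with the numbers offline, a minimal sketch of such a calculator could look like the function below; the defaults match the article, and the parameter names are our own:

```python
def co2_savings(employees=100, tasks_per_year=30,
                hours_manual=40, hours_assisted=4,
                semantha_minutes=2, pue=1.58,
                kg_co2_per_kwh=0.434):
    """Return (kWh saved, metric tons of CO2 saved) per year.

    Workplace draw: PC at 25% of 110 W plus a 60 W monitor;
    server draw: 165 W at full load. Defaults match the article.
    """
    workplace_w = 0.25 * 110 + 60
    runs = employees * tasks_per_year
    manual_kwh = runs * hours_manual * workplace_w * pue / 1000
    assisted_kwh = runs * (hours_assisted * workplace_w
                           + semantha_minutes / 60 * 165) * pue / 1000
    saved_kwh = manual_kwh - assisted_kwh
    return saved_kwh, saved_kwh * kg_co2_per_kwh / 1000

saved, tons = co2_savings()
print(f"{saved:,.0f} kWh saved, {tons:.3f} t CO2 saved")
# 14,905 kWh saved, 6.469 t CO2 saved

# A less optimistic scenario: the assisted task still takes 20 hours.
print(co2_savings(hours_assisted=20))
```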
Green IT thanks to AI!
semantha can make a valuable contribution not only to process improvement but also to considerable savings in a company’s CO2 footprint. This is because we do not train semantha anew for each use case (and let’s be honest: with standard machine learning, one training cycle is seldom enough). Also, when the situation changes (for example, when new information can be taken into account, regulations change, etc.), we can adapt semantha’s library instead of retraining the language model.
If we look at the from-scratch training explicitly, we see that we are far below the energy consumption of a current NLP model: semantha’s training releases only 0.75 kg of CO2, whereas training BERT from scratch produces over 700 kg[8].
[Interactive calculator: enter the average semantha processing time per task (minutes), the average human processing time with and without semantha (hours), and the number of inquiries per year. For the manual and the semantha-supported process, the calculator reports the processing and total times, the time saving, the personnel-cost saving (based on an internal daily rate), the total energy (kWh), and the total carbon (kg/t), calculated at 0.434 kg of CO2 per kWh. The carbon saving is also expressed as a multiple of the annual per-capita emissions in Germany, the USA, and India, of 12,000 km driven in a mid-range car, and of a round-trip flight from Frankfurt (FRA) to New York (JFK) in an Airbus A380-800.]
Footnotes
[1] For example Germany’s VDW https://vdw.de/en/start-ups-reducing-co2-levels-with-artificial-intelligence/
[2] https://ec.europa.eu/info/strategy/priorities-2019-2024/european-green-deal_en
[3] Ernst & Young Germany (https://www.ey.com/de_de/advisory/carbon) and pwc (https://www.aa.com.tr/en/energy/regulation-renewable/artificial-intelligence-can-be-used-to-reduce-emissions/25166).
[4] „Jetzt erst recht“: Big Player der Branche sehen Corona als Nachhaltigkeitsbeschleuniger (“Now more than ever”: industry big players see Corona as a sustainability accelerator), 19 May 2020, in Versicherungswirtschaft heute.
[5] Question for the European Commission, Eugen Jurzyca “Carbon footprint of artificial intelligence (AI)”, https://www.europarl.europa.eu/doceo/document/E-9-2020-001000_EN.html
[6] Institute for Energy Efficiency @ UCSB, Santa Barbara, California, https://iee.ucsb.edu/
[7] Emma Strubell, Ananya Ganesh, Andrew McCallum “Energy and Policy Considerations for Deep Learning in NLP”, https://arxiv.org/abs/1906.02243 and Ameet Talwalkar “AI in the 2020s Must Get Greener—and Here’s How” in IEEE Spectrum, https://spectrum.ieee.org/energywise/artificial-intelligence/machine-learning/energy-efficient-green-ai-strategies
[8] Twiml talk with Emma Strubell “Environmental Impact of Large-Scale NLP Model Training”, https://twimlai.com/twiml-talk-286-environmental-impact-of-large-scale-nlp-model-training-with-emma-strubell/
[9] For training semantha from scratch, we need about 1.56 kWh and for fine-tuning to a domain’s jargon about 1.04 kWh – this is roughly 0.74 kg and 0.5 kg of CO2, respectively. In contrast, you need about 1,503 kWh for training BERT, and refinement takes 19 kWh – a single (!) inference with BERT uses 0.12 kWh. Unbelievable, right? Double-check the numbers in the paper referenced in [7] if you like …
[10] This is the average that Rhonda Ascierto estimates for 2018 in her – or Uptime Institute’s – Global Data Center Survey (https://datacenter.com/wp-content/uploads/2018/11/2018-data-center-industry-survey.pdf). The trend for PUE points downward yet the curve is already quite flat so that we’d rather wait for updated numbers instead of estimating ourselves. If you have more up-to-date numbers, tell us 😉
[11] Umweltbundesamt Germany: Entwicklung der spezifischen Kohlendioxid-Emissionen des deutschen Strommix 1990-2018 und erste Schätzungen 2019 im Vergleich zu CO2-Emissionen der Stromerzeugung, https://www.umweltbundesamt.de/bild/entwicklung-der-spezifischen-kohlendioxid-1
[12] EPA (2019) AVERT, U.S. national weighted average CO2 marginal emission rate, year 2018 data. U.S. Environmental Protection Agency, Washington, DC, https://www.epa.gov/statelocalenergy/avoided-emission-factors-generated-avert
[13] see https://de.statista.com/statistik/daten/studie/167877/umfrage/co-emissionen-nach-laendern-je-einwohner/, https://www.atmosfair.de/de/kompensieren/flug/