Last month, a group of researchers from the Allen Institute for AI, the University of Washington, Microsoft, and other prominent institutions presented a paper introducing a framework to measure the carbon intensity of AI workloads running on cloud instances.

The tools presented differ from earlier approaches to measuring carbon footprint in two respects: first, instead of adding up the energy consumption of server chips over the whole training run, the study records energy use as a series of measurements; second, it correlates this usage data with a series of data points showing the local emissions per kilowatt-hour (kWh) of energy consumed. In a statement to IEEE, Jesse Dodge, the lead author of the paper, said, "What I hope is that the vital first step towards a more green future and more equitable future is transparent reporting. Because you can't improve what you can't measure."
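To make the idea concrete, here is a minimal Python sketch of this style of accounting. It is not the authors' code: the sampling interval, power readings, and grid-intensity values are illustrative assumptions, and a real measurement pipeline would pull live readings from hardware counters and a grid-data provider.

```python
# Illustrative sketch (not the paper's implementation): combine a time
# series of instance power readings with time-varying grid carbon
# intensity. All numbers below are made-up placeholders.

INTERVAL_S = 300  # assumed sampling interval: one reading every 5 minutes

# Power draw of the instance at each interval, in watts
power_watts = [285.0, 310.5, 298.2, 305.7]

# Local grid carbon intensity at the same timestamps, in gCO2eq per kWh
grid_intensity = [450.0, 462.0, 471.5, 440.0]

total_emissions_g = 0.0
for watts, intensity in zip(power_watts, grid_intensity):
    energy_kwh = watts * INTERVAL_S / 3600 / 1000  # W x s -> kWh
    total_emissions_g += energy_kwh * intensity    # kWh x gCO2/kWh -> gCO2

print(f"Estimated emissions: {total_emissions_g:.1f} gCO2eq")
```

Because the grid intensity enters as a time series rather than a single average, the same workload can yield very different totals depending on when and where it runs, which is precisely the effect the framework is designed to expose.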

The research further states that many use cases for artificial intelligence promise to advance sustainability efforts and help prevent a rise in global temperature of more than 1.5 degrees Celsius. However, the carbon intensity of AI workloads must also be taken into account. To help developers and consumers understand and reduce the carbon footprint of AI workloads, the research calls on cloud providers to report software carbon intensity and to offer concrete ways to reduce it.

Are there causes for worry?

Today's organisations are eager to take advantage of AI for a range of goals, including lowering their carbon footprint. But make no mistake: each AI model has its own carbon footprint, which varies with the methods used to train it. It is therefore worth looking at some of the latest research in this domain to understand the gravity of the situation.

  • Researchers from the University of Massachusetts Amherst recently assessed the life cycle of training several popular large AI models. They found that the process can produce more than 626,000 pounds of CO2 equivalent, nearly five times the lifetime emissions of the typical American car (including the manufacture of the car itself).
  • In the paper "Energy and Policy Considerations for Deep Learning in NLP", the authors estimated the CO2 emissions, electricity consumption, and training costs of some of the most advanced NLP models. By their estimates, using GPUs to train a transformer model, a sizable model used in language processing, could result in 626,155 lbs of carbon dioxide emissions; a back-of-the-envelope version of this kind of calculation is sketched below.
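Estimates like these typically multiply hardware power draw by training time and a data-centre overhead factor, then convert the energy to CO2 using a grid emissions factor. The Python sketch below shows the shape of that arithmetic; every parameter value is an illustrative assumption, not an input from either study.

```python
# Back-of-the-envelope estimate of training emissions, in the spirit of
# "Energy and Policy Considerations for Deep Learning in NLP".
# All parameter values below are illustrative assumptions.

gpu_count = 8            # number of GPUs used for training
avg_gpu_power_w = 250.0  # assumed average draw per GPU, in watts
training_hours = 120.0   # assumed wall-clock training time
pue = 1.58               # power usage effectiveness (data-centre overhead)
grid_gco2_per_kwh = 432  # assumed grid emissions factor, gCO2eq/kWh

energy_kwh = gpu_count * avg_gpu_power_w * training_hours * pue / 1000
emissions_kg = energy_kwh * grid_gco2_per_kwh / 1000
emissions_lbs = emissions_kg * 2.20462  # kg -> lbs

print(f"Energy: {energy_kwh:.0f} kWh, emissions: {emissions_lbs:.0f} lbs CO2eq")
```

Note how sensitive the result is to the grid factor and the overhead multiplier: the same training run on a low-carbon grid can emit an order of magnitude less CO2.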

It is well established that despite all the technological advances artificial intelligence has made possible, such as voice recognition and self-driving cars, these advances come at a cost: the systems use a lot of energy and can produce significant greenhouse gas emissions.

There's a way out

Greenhouse gas (GHG) emissions such as carbon dioxide appear to be accelerating global climate change. According to the IPCC's 2018 report, energy generation continues to play a significant role, accounting for nearly 25% of GHG emissions in 2010. Machine learning (ML) systems could considerably increase carbon emissions, given the exponential growth in the compute and energy requirements of many contemporary ML algorithms. Hence, it becomes vital to measure these emissions and then mitigate them.

To that end, a team of academics from Stanford, Facebook AI Research, and McGill University has developed an easy-to-use tool that quickly estimates how much electricity a machine learning project consumes and how much carbon that implies. Similarly, Mila, BCG GAMMA, Haverford College, and Comet.ml jointly published CodeCarbon, an open-source software package that estimates the location-dependent carbon dioxide footprint of computing, a significant affirmation of their commitment to responsible technology.
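CodeCarbon wraps a workload in a tracker that samples hardware energy use and converts it to CO2 using regional grid data. A minimal usage sketch, following the tracker pattern in its documentation, is shown below; the project name and the stand-in workload are placeholders.

```python
# Minimal CodeCarbon usage sketch; requires `pip install codecarbon`.
# The project name and the "work" being measured are placeholders.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    # ... run the training or inference workload to be measured ...
    total = sum(i * i for i in range(10_000_000))  # stand-in workload
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2eq for the tracked span

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```

The appeal of this design is that emissions reporting becomes a few lines added around existing code, which lowers the barrier to the transparent reporting the Allen Institute paper calls for.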

Researchers at MIT have explored cutting computing costs by sharing weights and architectures across models. Another MIT innovation aims to broaden the adoption of these models and lower the significant energy costs they typically incur. Earlier, machine learning also allowed Google's DeepMind to cut the amount of cooling energy needed in its data centres by 40%.

Although AI has many potential social benefits, the energy required for the heavy computation behind it can have a significant negative impact on the environment. The studies above represent only a small portion of the work being done; more such steps are needed to keep moving firmly in this direction.

