A typical day for most of us starts with getting out of bed, getting ready for work, and settling in front of our laptops, likely in a study, bedroom, or living room. Work from home is the reality for a majority of professionals today, and this is a time truly marked by hyperconnectivity. 2020 was a bumper year for cloud services companies, as organisations thronged to cloud providers to ensure business continuity during the pandemic and began the rapid shift towards an all-digital workforce. Even before the pandemic, the trajectory of cloud growth was upward; today, online activity is at an all-time high. Devices dominate our daily lives and even our most mundane transactions. Social media platforms are no longer just for recreation but have become vital tools for mobilising digital citizens; manufacturers have finally realised that digitisation is essential for their survival; automakers are ramping up autonomous vehicle capabilities; and drug makers increasingly rely on tech-assisted drug discovery to treat illnesses like COVID-19. The applications of technologies like AI are boundless.

While the might and proliferation of AI are unquestionable, they also compel us to look more closely at the environmental fallout AI is causing. A recent report by the Allen Institute for Artificial Intelligence and OpenAI estimated that the volume of computation needed to stay at the forefront of tasks like language understanding, game playing, and common-sense reasoning has soared roughly 300,000-fold in the last six years.

If data is the new oil, deep learning is guzzling it recklessly, with no thought spared for the environment. A research paper from the University of Massachusetts, Amherst found that training a single large deep learning model can generate the same carbon footprint as five American cars over their lifetimes, fuel included. The same work, by Emma Strubell and colleagues, estimated that developing and training one machine translation model using a technique called neural architecture search was responsible for roughly 626,000 pounds of CO2; by comparison, an average American is responsible for about 36,000 pounds of CO2 per year. The massive amounts of computational power needed to support such AI research leave behind huge carbon footprints, placing an undeniable strain on the environment. This style of work is collectively known as Red AI, and it stems from the AI community's focus on accuracy over efficiency. Researchers are driven by state-of-the-art results, exemplified by the popularity of leaderboards that rank algorithms by accuracy but ignore the environmental cost of the compute-heavy training cycles behind them.
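The scale of these numbers comes from straightforward energy accounting: hardware power draw, multiplied by training time, data-centre overhead (PUE), and the carbon intensity of the local grid. The sketch below is an illustrative back-of-the-envelope calculation in this spirit, not the paper's exact methodology; the PUE and grid-intensity constants, and the hypothetical GPU run, are assumptions for the example.

```python
# Back-of-the-envelope estimate of training-run emissions.
#   energy_kwh = avg_power_kw * hours * PUE
#   co2_lbs    = energy_kwh * grid_intensity_lbs_per_kwh
# All constants below are illustrative assumptions, not measured values.

def training_co2_lbs(avg_power_kw: float,
                     hours: float,
                     pue: float = 1.58,            # assumed data-centre overhead
                     lbs_per_kwh: float = 0.954    # assumed US grid average
                     ) -> float:
    """Rough estimate of CO2 (in pounds) emitted by a training run."""
    energy_kwh = avg_power_kw * hours * pue
    return energy_kwh * lbs_per_kwh

# Hypothetical run: 8 GPUs drawing ~0.3 kW each, for two weeks straight.
estimate = training_co2_lbs(avg_power_kw=8 * 0.3, hours=24 * 14)
print(f"{estimate:,.0f} lbs of CO2")  # on the order of a thousand pounds
```

Even this modest hypothetical run lands in the thousands of pounds of CO2 once overhead and grid intensity are included, which is why week- or month-long searches over thousands of candidate architectures add up so quickly.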

Green AI, on the other hand, is AI research that is environment-friendly and inclusive. The emphasis on inclusion matters because it points to the skewed balance of power in the tech community today. Large computational efforts are understandably led by cloud and AI companies like Google, Facebook, and Nvidia, among others. These companies sit at the cutting edge of AI research while also running the hugely profitable data centre and cloud storage businesses powered by millions upon millions of chips operating at peak efficiency. Thanks to this infrastructure, every device we own, from a smart TV to a smartphone, runs with uninterrupted efficiency and speed. However, it leaves academics and university researchers at a distinct disadvantage. University research labs are no match for the billion-dollar facilities run by corporations, leaving professors and scientists feeling that the gap between industry and academia is only widening. Cutting-edge research requires high levels of compute power, and academics, despite having the ideas, lack the infrastructure advantage enjoyed by their peers at cloud and technology companies. Even OpenAI, co-founded by tech mogul Elon Musk, started out as a nonprofit research lab in 2015 with the goal of empowering programmers. But for organisations like OpenAI to attract the right talent to solve pressing challenges in agriculture, climate change, and food security, high-compute infrastructure is absolutely essential.

There is no doubt that Red AI has led to some pathbreaking developments and has greatly enriched scientific and business communities. But researchers want the needle to move gradually towards Green AI. Here are some ways to get there:

  • Republish and reproduce research with code to avoid duplication of effort
  • Increase the efficiency of AI models on hardware, with companies like Nvidia and Intel leading the way in developing specialised chips
  • Drive more collaboration between researchers, academia, incubators, startups, tech companies, and enterprises
  • Use model compression techniques, such as pruning, to shrink models without impacting their performance
  • Report model efficiency and computational price tags alongside accuracy, and have tech companies release trained models more openly to developers
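To make the model-compression point above concrete, one of the simplest pruning techniques is magnitude pruning: the smallest-magnitude weights in a trained network are set to zero, shrinking the model's effective size. The sketch below is a minimal, framework-free illustration in plain NumPy; the weight matrix and sparsity target are hypothetical, and real pipelines would typically prune iteratively with fine-tuning in between.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """One-shot magnitude pruning: zero out the smallest-magnitude
    entries so that roughly `sparsity` fraction of weights become zero."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)                 # how many weights to remove
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # keep only larger weights
    return weights * mask

# Hypothetical layer: a 256x256 weight matrix pruned to 90% sparsity.
rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256))
pruned = magnitude_prune(w, sparsity=0.9)
print(f"fraction zeroed: {np.mean(pruned == 0):.2f}")
```

The pruned matrix can be stored in a sparse format and requires far fewer multiplications at inference time, which is exactly the accuracy-per-watt trade-off Green AI asks researchers to report.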

