A recent article from Heidi News explores the environmental impact of generative AI. The energy consumption of digital technology has captured the attention of many media outlets, with researchers estimating that training a single model can emit as much CO2 as driving a car to the moon and back!
Back to basics: training a generative AI requires a huge volume of examples, both text and images, which demands a great deal of computing power. The more advanced the model, the more computing power it requires, and thus the greater the electricity consumption. GPT-4, for example, has 1,000 billion parameters, about six times as many as its predecessor GPT-3. On top of this, where and how the electricity is produced determines how much carbon is emitted: the same computation powered by a coal-fired plant can emit roughly 100 times more CO2 than one powered by hydroelectricity.
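To make that last comparison concrete, here is a minimal sketch of the arithmetic. The training-energy figure and the two grid intensities are illustrative assumptions, not values from the article; only the roughly 100-fold ratio between coal and hydroelectricity comes from the text.

```python
# Emissions = energy consumed x carbon intensity of the grid.
# All numbers below are rough illustrative assumptions chosen to
# reflect the article's ~100-fold claim, not measured values.

TRAINING_ENERGY_KWH = 1_000_000  # hypothetical energy for one training run

# Assumed grid carbon intensities, in grams of CO2 per kWh
INTENSITY_G_PER_KWH = {
    "hydroelectric": 10,
    "coal": 1000,
}

for source, intensity in INTENSITY_G_PER_KWH.items():
    tonnes_co2 = TRAINING_ENERGY_KWH * intensity / 1e6  # grams -> tonnes
    print(f"{source:>13}: {tonnes_co2:,.0f} t CO2")

# hydroelectric: 10 t CO2
#          coal: 1,000 t CO2  -> roughly a 100-fold difference
```

The point of the sketch is that the multiplier, not the model alone, drives the footprint: the same training run lands two orders of magnitude apart depending on the grid behind it.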
For Yash Raj Shrestha, professor and head of the Applied Artificial Intelligence Lab at HEC Lausanne, the situation is concerning.
“OpenAI does not reveal its source code, for example. Under these conditions, users have no choice but to trust them, including on the question of the carbon footprint… We should be able to audit the code independently.”
Read the full article at Heidi News.