The rapid growth of artificial intelligence poses a major energy problem for the future: a Goldman Sachs analysis predicts that data center electricity demand will increase by 160%. Dr. Jonathan Koomey, founder of Koomey Analytics, has researched electricity use in computing since the 1990s and highlights the significant energy consumption of AI-powered data centers. Because AI models require roughly ten times as much energy to answer a query as a traditional Google search, the environmental impact of AI usage is drawing concern. Dr. Koomey remains optimistic, however, arguing that those paying for the expensive servers will focus AI on revenue-generating applications, which pushes the systems toward efficiency and sustainability. The article also notes that AI is already being used to cut energy costs: Google used machine learning to reduce cooling costs at one of its data centers by 30–40%. This raises an interesting question about the role of AI in solving its own energy problems.

AI’s Energy Conundrum
It takes about ten times as much energy to answer a ChatGPT query as to ask the same thing in a normal Google search.