Big Tech’s AI Ambitions Spark Power Consumption Crisis

As each new generation of AI accelerators boosts computing performance, it also consumes more power than its predecessor, meaning that as shipment volumes rise, so does total power demand.

The rapid proliferation of AI accelerators in data centers is driving an unprecedented surge in power consumption, with forecasts predicting a significant increase in electricity demand. As Big Tech continues to spend billions on AI infrastructure, attention is shifting from compute and memory to per-chip power consumption. The latest generation of AI accelerators from Nvidia, AMD, and Intel consumes more power than its predecessors, with some chips drawing as much as 1,500 W each. This trend shows no sign of slowing, with data centers housing a million or more GPUs expected by 2027. In response, the industry is adopting liquid cooling technologies and developing more power-efficient chips.