As AI’s power consumption continues to rise, new projections suggest it could account for nearly half of datacentre power usage by the end of 2025.
Artificial intelligence (AI) is increasingly becoming a significant contributor to datacentre power consumption. According to recent analysis by Alex de Vries-Gao, the founder of the Digiconomist tech sustainability website, AI could account for nearly half of datacentre power usage by the end of this year.
The Growth of AI Power Demands
De Vries-Gao’s calculations are based on the power consumed by chips made by Nvidia and Advanced Micro Devices that are used to train and operate AI models. His analysis also takes into account the energy consumption of chips used by other companies, such as Broadcom. The International Energy Agency (IEA) estimates that all data centres – excluding those mining cryptocurrencies – consumed 415 terawatt hours (TWh) of electricity last year.
De Vries-Gao argues that AI could already account for 20% of that total. By the end of 2025, AI systems could consume up to 49% of total datacentre power, reaching 23 gigawatts (GW) – twice the total power consumption of the Netherlands.
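The figures above can be sanity-checked with some back-of-the-envelope arithmetic. This is only a sketch: the shares and the 23 GW figure are taken from the estimates quoted above, and the conversion to annual energy assumes that capacity runs continuously, which real hardware will not do exactly.

```python
# Rough check of the consumption figures quoted above (assumptions noted inline).
TOTAL_DC_TWH = 415        # IEA estimate: all data centres excl. crypto, last year
AI_SHARE_NOW = 0.20       # de Vries-Gao: AI's estimated current share
AI_CAPACITY_GW_2025 = 23  # projected AI power draw by end of 2025

# AI's implied annual energy use today:
ai_twh_now = TOTAL_DC_TWH * AI_SHARE_NOW
print(f"AI today: ~{ai_twh_now:.0f} TWh/year")

# 23 GW sustained over a full year, converted to TWh
# (assumes continuous operation, an upper-bound simplification):
HOURS_PER_YEAR = 8760
ai_twh_2025 = AI_CAPACITY_GW_2025 * HOURS_PER_YEAR / 1000
print(f"23 GW run continuously: ~{ai_twh_2025:.0f} TWh/year")
```

Run as written, the first figure comes out at roughly 83 TWh per year, which shows how a one-fifth share of the IEA's 415 TWh total is already a nation-scale quantity of electricity.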
AI systems require significant amounts of energy to operate, with estimates suggesting they account for around 1–3% of global electricity use, largely because of the massive computational power required for tasks such as deep learning and natural language processing. According to a study by the Natural Resources Defense Council, the data centres that support AI and cloud computing account for around 2% of global greenhouse gas emissions. As AI grows in importance, concerns about its energy consumption are rising with it.
Factors Influencing AI Power Demands

Several factors could slow hardware demand for AI systems. Waning demand for applications such as ChatGPT, for example, would reduce energy consumption. Geopolitical tensions and export controls could also constrain AI hardware production, as seen with restrictions on Chinese access to advanced chips.
On the other hand, innovations in hardware design can reduce the computational and energy costs of AI systems. Meanwhile, the push by multiple countries to build their own AI systems – a trend known as ‘sovereign AI’ – could increase hardware demand and deepen dependence on fossil fuels.
The Need for Transparency
The lack of transparency in AI’s power demands is an issue that needs to be addressed. De Vries-Gao describes the industry as ‘opaque.’ The EU AI Act requires AI companies to disclose the energy consumption behind training a model, but not for day-to-day use. This lack of transparency hinders efforts to develop more sustainable AI systems.
A Call for Action
As AI’s power demands continue to grow, it is essential that we take steps to mitigate its environmental impact. Developing and deploying more efficient AI systems, along with greater transparency about their energy consumption, are crucial steps towards a more sustainable future.
Artificial intelligence is transforming industries and revolutionizing the way we live, but its development and deployment carry significant environmental costs. To mitigate these effects, researchers are exploring sustainable AI methods that reduce energy consumption and waste. Techniques such as transfer learning, model pruning, and knowledge distillation can significantly decrease computational requirements, and the use of renewable energy sources in data centers is becoming increasingly prevalent. As AI continues to evolve, prioritizing sustainability will be essential for its long-term viability.
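To illustrate one of those techniques, the sketch below shows magnitude-based model pruning: weights with the smallest absolute values are zeroed out, which can shrink the compute (and thus energy) a model needs at inference time. This is a minimal illustration using NumPy on a random weight matrix, not any specific system from the article; the function name and the 50% sparsity level are choices made for the example.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of weights.

    Keeping only the largest weights preserves most of a trained
    layer's behaviour while cutting the multiplications needed.
    """
    k = int(weights.size * sparsity)
    # Threshold at the (k+1)-th smallest absolute value:
    threshold = np.sort(np.abs(weights), axis=None)[k]
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Demo on a small random "layer" (stand-in for real trained weights):
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))
pruned, mask = prune_by_magnitude(weights, sparsity=0.5)
print(f"Nonzero weights after pruning: {np.count_nonzero(pruned)} of {weights.size}")
```

In practice, pruning is usually followed by a short fine-tuning pass to recover accuracy, and sparse hardware or libraries are needed to turn the zeroed weights into actual energy savings.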