AI power consumption now rivals that of some small countries

Key Takeaways:

– Artificial intelligence (AI) is consuming around 4.3GW of power globally, almost as much as some small countries consume.
– Schneider Electric predicts that AI’s power usage will increase as adoption of the technology grows, reaching between 13.5GW and 20GW by 2028.
– Currently, AI makes up 8% of a typical data center’s power consumption, but this is expected to increase to 15-20% by 2028.
– The paper also highlights the need for cooling in data centers, which requires additional electricity and often correlates with high water usage.
– Schneider Electric advises data center operators to transition to 240/415V distribution to accommodate the high power densities of AI workloads.
– Upgrading infrastructure and revising current practices are necessary to manage power usage and make cloud computing and AI workloads more efficient.

TechRadar:

New figures from French energy management company Schneider Electric claim artificial intelligence (AI) is now consuming an estimated 4.3GW of power globally, almost as much as some small countries.

As adoption of the technology increases, so will its power usage. By 2028, Schneider Electric reckons AI will account for between 13.5GW and 20GW, representing 26-36% compound annual growth.
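
As a rough sanity check of those growth figures, here is a minimal sketch assuming a 4.3GW baseline today and a five-year horizon to 2028 (neither of which the excerpt states explicitly):

```python
# Back-of-the-envelope check of the quoted growth range.
# Assumptions (not stated in the excerpt): a 4.3 GW baseline today and a
# five-year horizon to 2028.
baseline_gw = 4.3
years = 5

for target_gw in (13.5, 20.0):
    # Compound annual growth rate: (end / start) ** (1 / years) - 1
    cagr = (target_gw / baseline_gw) ** (1 / years) - 1
    print(f"{baseline_gw} GW -> {target_gw} GW over {years} years: CAGR ~ {cagr:.0%}")

# Prints roughly 26% and 36%, matching the 26-36% range quoted above.
```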

AI Eclipse TLDR:

According to new figures from French energy management company Schneider Electric, artificial intelligence (AI) is currently consuming an estimated 4.3GW of power globally, comparable to the power consumption of some small countries. As adoption of AI continues to grow, so will its power usage: Schneider Electric predicts that by 2028, AI will account for between 13.5GW and 20GW, representing compound annual growth of 26-36%.

The study also highlights the power intensity of data centers, emphasizing the need to upgrade infrastructure and improve efficiency to meet growing demand. Currently, AI makes up only 8% of a typical data center’s power consumption, which totals 54GW. By 2028, however, data center usage is expected to reach 90GW, with AI accounting for around 15-20% of that total. The report also addresses the requirement for cooling in data centers, as excess heat can present safety hazards and lead to premature component failure. Cooling not only requires additional electricity but often correlates with high water usage, raising concerns about environmental impact.

Schneider Electric advises data center operators to transition from conventional 120/208V distribution to 240/415V to accommodate the high power densities of AI workloads. Furthermore, accurately predicting energy usage will become more challenging as high-energy training gives way to inference workloads, which have more variable power requirements. Upgrading infrastructure and revising current practices are crucial to managing power usage and maximizing the efficiency of cloud computing and AI workloads.
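
To make the voltage recommendation concrete, here is a minimal sketch (not taken from the report) of why higher line-to-line voltage matters for dense racks: for a given rack power, moving from 208V to 415V three-phase distribution roughly halves the current each phase must carry, easing conductor sizing and resistive losses. The 50kW rack figure below is a hypothetical example.

```python
import math

def phase_current(power_w: float, line_voltage_v: float, power_factor: float = 1.0) -> float:
    """Per-phase current for a balanced three-phase load: P = sqrt(3) * V_LL * I * PF."""
    return power_w / (math.sqrt(3) * line_voltage_v * power_factor)

rack_power_w = 50_000  # hypothetical high-density AI rack, not a figure from the report
for volts in (208, 415):
    amps = phase_current(rack_power_w, volts)
    print(f"{rack_power_w / 1000:.0f} kW rack at {volts} V three-phase: ~{amps:.0f} A per phase")

# Output: roughly 139 A per phase at 208 V versus roughly 70 A at 415 V.
```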