AI Consumes Gigawatts, Pushing Grids to the Limit


AI Devours Electricity and the Grid is on the Verge of Collapse

Artificial intelligence (AI) training centers consume tens of gigawatts of electricity as tech giants compete for the title of "biggest electricity waster." In the U.S., AI will require an additional 3% of electricity supply, and the grid is not prepared for it.

High-powered AI data centers have become a new battlefield in the global technology race.
Giant companies, in a constant race with each other for the title of “the largest,” have found a new metric to compete on: the gigawatts of electricity they use.

This phenomenon, which industry professionals have dubbed "Bragawatts," raises serious questions about the ability of regional and global power grids to handle the dramatic load that AI imposes on them.

Because of electricity costs, Meta currently has to work hard to fund its Hyperion campus, its central AI facility in Louisiana, U.S., whose electricity draw is measured in hundreds of megawatts.

OpenAI has even more ambitious plans for the Stargate center in Michigan, which could exceed a gigawatt as part of a larger complex intended to surpass the 8-gigawatt threshold.

AI Has a Price: Electricity Rates Jump

Research departments at international banks are trying to compile data on the electricity use of AI centers; their estimate is that the total planned capacity for AI data centers worldwide already stands at about 46 gigawatts.
This implies a crazy capital investment of about 2.5 trillion dollars to build them.

It can be estimated that their total electricity demand (assuming a standard power usage effectiveness, or PUE, ratio of 1.2) reaches about 55.2 gigawatts.
By comparison, this demand could supply electricity to about 44.2 million homes in the U.S., roughly three times the total number of homes in California.
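The arithmetic behind these figures can be reproduced in a few lines. Note that the average household draw below is an assumption back-solved from the article's own numbers, not a sourced statistic:

```python
# Back-of-the-envelope check of the article's demand estimate.
planned_it_capacity_gw = 46.0   # total planned AI data-center capacity
pue = 1.2                       # power usage effectiveness (facility power / IT power)

total_demand_gw = planned_it_capacity_gw * pue   # 46 * 1.2 = 55.2 GW

# Assumed average household draw (~1.25 kW), chosen so the result
# matches the article's "about 44.2 million homes" comparison.
avg_home_draw_kw = 1.25
homes_supplied = total_demand_gw * 1e6 / avg_home_draw_kw  # GW -> kW

print(f"{total_demand_gw:.1f} GW ≈ {homes_supplied / 1e6:.1f} million homes")
```

Running this prints roughly "55.2 GW ≈ 44.2 million homes," matching the estimates above.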

AI Server Farms: Half a Gigawatt

The phenomenon has not skipped the U.S., either.
Preliminary data indicate a need for about 500 additional megawatts (half a gigawatt) of consumption in the coming years for server farms.
That is an addition of about 3% to total electricity consumption during peak hours, equivalent to the consumption of about half a million people.
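As a rough sanity check on the "half a million people" comparison, the implied per-person draw can be computed from the article's figures alone (a sketch; the per-person value is derived, not sourced):

```python
# Implied per-person draw behind the article's comparison.
extra_demand_mw = 500          # projected additional server-farm demand
people_equivalent = 500_000    # "about half a million people"

per_person_kw = extra_demand_mw * 1_000 / people_equivalent  # MW -> kW
print(f"Implied draw: {per_person_kw:.1f} kW per person")
```

This works out to about 1 kW per person, a plausible order of magnitude for all-in per-capita electric draw at peak.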

Various bodies, including government committees, energy authorities, and research firms, warn that the country,
already racing to meet the continuous need to increase electricity supply for its population, is not prepared for AI demands, and they call for a national strategy to support the efficient establishment of these farms.

AI centers are not just enlarged versions of classic cloud data centers.
While normal cloud servers operate at relatively low and variable loads, training AI models requires thousands of graphics processing units (GPUs) operating in near-perfect coordination.

The power profile of an AI facility can jump from 30% to 100% within milliseconds, and existing power grids are not designed to handle this variability efficiently.

To meet demand, giant companies adopt several different strategies.
Some use the existing grid, although this strategy may create a mismatch between supply capacity and actual demand.

Other companies plan grid expansions and upgrades, investing the required capital to supply electricity at their own expense (with tax incentives for such investments); still others condition the construction of their facilities on local-government participation in the investments.

In the medium and long term, studies show that existing power grids will not withstand the load, and the only solution is to build dedicated power stations next to data centers to supply their electricity directly.
