Analysis

Goldman says AI build-out spending hinges on infrastructure assumptions

Goldman is saying the AI capex story is not one fixed trillion-dollar bill. The real swing factor is a handful of assumptions about silicon, power, labor, and data-center design.

Derek Washington · 6 min read
Source: goldmansachs.com

The trillion-dollar headline hides a smaller set of decisions

Goldman Sachs Global Institute is pushing back on the idea that AI spending will land on one inevitable number. Its message is sharper than the usual hype-cycle commentary: the scale of the build-out depends on a few infrastructure assumptions, and small changes in those assumptions can move cumulative spend by hundreds of billions of dollars.

That matters inside Goldman because the AI trade is no longer just a software story or a chip story. It is a financing, power, construction, and depreciation story, which means the winners and losers will be shaped as much by execution as by adoption. For analysts, associates, and bankers advising hyperscalers, chip suppliers, developers, and private capital sponsors, the lesson is simple: the interesting question is not whether AI demand exists, but how much physical infrastructure the economy can actually build, power, and refresh.

The four assumptions doing the heavy lifting

Goldman says four variables matter most. First is the economic useful life of AI silicon. If chips stay productive longer, the replacement cycle stretches and cumulative spending falls. If they become obsolete faster, the bill rises quickly because the industry has to keep buying and installing new generations of hardware.

Second is the cost and complexity of next-generation data centers. Goldman’s earlier work already described a shift away from traditional facilities toward hyper-dense computational environments that require advanced cooling systems. That raises the price of each new site and makes construction slower and more specialized.

Third is the chip and architecture mix. The industry is not building one uniform stack, and the mix of accelerators, networking gear, and facility design choices can change how much capital is needed for the same amount of compute. Fourth is elongation from power, labor, and equipment bottlenecks. Those constraints do not just delay individual projects; they stretch the build-out as a whole and can push more of the spending into later years.

The key point is that the AI build-out is not a single straight line. It is a set of compounding decisions about replacement cadence, build speed, and what gets built first.

Why small timing shifts can create huge dollar swings

Goldman says even modest changes in AI silicon replacement timing can move cumulative spending by hundreds of billions of dollars. That is the kind of number that changes how hyperscalers plan capex, how suppliers price capacity, and how the Street models multi-year revenue streams.
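To see why timing alone can move the total that much, consider a toy model of the refresh cycle. Every figure here (the starting fleet value, the growth rate, the two useful lives) is an illustrative assumption for the sketch, not a Goldman estimate:

```python
# Toy model: cumulative AI accelerator capex over a decade under two
# assumed useful lives for the silicon. Each year's spend covers net new
# capacity plus straight-line replacement of the installed base.
def cumulative_capex(life_years, years=10, fleet=400e9, growth=0.15):
    total = 0.0
    for _ in range(years):
        additions = fleet * growth          # assumed net new capacity
        replacements = fleet / life_years   # refresh of the existing fleet
        total += additions + replacements
        fleet += additions
    return total

fast = cumulative_capex(life_years=3)   # chips obsolete in three years
slow = cumulative_capex(life_years=5)   # chips stay productive for five
print(f"3-yr life: ${fast/1e12:.2f}T, 5-yr life: ${slow/1e12:.2f}T, "
      f"swing: ${(fast - slow)/1e9:.0f}B")
```

Under these stand-in numbers, shortening the refresh cycle from five years to three adds roughly a trillion dollars of cumulative spend over the decade, the same order of magnitude as the swings Goldman describes.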


For people inside Goldman, that should sound familiar. In every capital-intensive cycle, the forecast depends on whether today’s purchases are one-time installs or recurring replacement demand. If AI chips turn over quickly, suppliers and infrastructure owners can justify a much larger ongoing investment cycle. If they last longer than expected, the market can still grow, but the pace of reinvestment slows.

That distinction matters for compensation and career economics too. In strong infrastructure cycles, financing mandates, project advisory work, and data-center-related coverage can create more durable revenue than a short-lived thematic trade. In weaker cycles, desks can be left explaining why a supposed secular wave turned into a slower, lumpier spend pattern.

The physical build-out is already real

Goldman’s latest framing lands on top of evidence that the build-out is well underway. The firm says U.S. spending on data-center construction has tripled over the last three years. It also says occupancy rates for third-party leased data centers remain near record highs across most U.S. markets.

That combination tells you the market is not dealing with speculative empty shells. Capacity is being absorbed, and it is being absorbed fast enough to keep developers, landlords, utilities, and contractors under pressure. For hyperscalers, the race is not simply to buy more compute; it is to secure land, power, equipment, and delivery slots years ahead of need.

Goldman Research has also raised its own power outlook. It now expects data-center power demand to grow 220% by 2030 versus 2023 levels, up from a prior forecast of 175%; before that, it had projected global data-center power demand would rise 165% over the same period. Those are not abstract forecasting tweaks. They are signals that the bottleneck is moving from the server rack to the grid.

Power is becoming the core constraint, not a side issue

The wider policy backdrop makes Goldman’s assumptions more consequential. The International Energy Agency says AI is the most significant driver of rising electricity demand from data centers, and it projects electricity demand from AI-optimized data centers will more than quadruple by 2030. In its base case, global electricity generation to supply data centers rises from 460 TWh in 2024 to more than 1,000 TWh in 2030 and 1,300 TWh in 2035.

The U.S. Energy Information Administration adds another layer. It estimates computing accounted for 8% of U.S. commercial-sector electricity consumption in 2024, and its reference case has that share reaching 20% by 2050. That is the kind of number utilities, grid operators, and policymakers cannot ignore.
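Those headline percentages translate into steep but finite annual growth rates. A quick compound-growth check on the figures cited above (a back-of-the-envelope sketch, not part of either forecast):

```python
# Implied compound annual growth rates (CAGR) for the cited projections.
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# Goldman: data-center power demand +220% by 2030 vs 2023, i.e. 3.2x.
goldman = cagr(1.0, 3.2, 2030 - 2023)
# IEA base case: 460 TWh (2024) -> roughly 1,000 TWh (2030).
iea = cagr(460, 1000, 2030 - 2024)
print(f"Goldman implies ~{goldman:.1%}/yr; IEA implies ~{iea:.1%}/yr")
```

Both paths work out to roughly mid-teens annual growth in data-center electricity use, which is why the grid, not the server rack, reads as the binding constraint.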


For Goldman teams, the implication is that AI capex is colliding with real-world limits in power, permits, labor, and equipment lead times. A server cannot generate revenue if the substation is not ready. A data hall cannot come online if specialized cooling is delayed. A hyperscaler cannot scale at will if interconnection queues and transformer shortages slow everything down.

The old debate about AI economics is still in the background

Goldman’s current note also fits into a longer internal and market debate. In 2024, the firm was already flagging roughly $1 trillion in coming AI capex tied to data centers, chips, AI infrastructure, and the power grid. That earlier discussion also featured Daron Acemoglu and Goldman’s Jim Covello, who questioned whether the economics of generative AI would justify the amount of money being spent.

That history matters because the May 1 note is not a retreat from the infrastructure theme. It is a more precise version of it. Goldman is not saying the AI build-out is fake. It is saying the final number depends on whether the industry can keep silicon useful longer, deliver denser data centers without runaway costs, and clear the power and labor bottlenecks that are already showing up in the field.

How to separate durable demand from hype

The practical filter for readers inside Goldman is to look for infrastructure demand that survives different assumptions. Durable demand usually has four traits:

  • It is tied to power-secured sites, not just announced plans.
  • It depends on repeatable replacement and refresh cycles, not one-off purchases.
  • It is supported by real occupancy and construction activity, not only narrative momentum.
  • It can survive slower permitting, higher cooling costs, and tighter equipment supply.

That is where the story becomes useful for anyone watching the AI trade from inside the firm. Goldman is effectively warning clients and employees alike not to confuse enthusiasm with inevitability. AI may still produce a massive capex cycle, but the scale will be set by plumbing, not slogans.
