Big Tech Will Spend $600 Billion on AI Infrastructure This Year. Where Does That Money Actually Go?

Google just doubled its AI spending to $185 billion. Meta's at $135 billion. Combined, the hyperscalers will burn through more than $600 billion in 2026. What they're building, and what it costs the rest of us.

Alphabet told Wall Street this week that it plans to spend between $175 billion and $185 billion on capital expenditures in 2026. That’s roughly double the $91.4 billion it spent last year, and about $65 billion more than analysts expected.

The stock dropped 5% on Thursday. Investors, it turns out, get nervous when a company announces it will spend more than the GDP of Hungary on servers and data centers.

Google isn’t alone. Meta plans to spend $115 billion to $135 billion this year. Analysts expect Amazon to spend about $147 billion, and Microsoft is on track for $99 billion. Add Oracle and the other hyperscalers, and the combined total exceeds $600 billion - a 36% increase over 2025.

That’s approaching the scale of the entire US defense budget. For context, the combined capex of these five companies in 2023 was around $150 billion. In three years, the number has quadrupled.
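The growth rate implied by those two data points is worth making explicit. A back-of-the-envelope sketch, using only the article’s figures ($150 billion in 2023, $600 billion-plus in 2026):

```python
# Implied growth of combined hyperscaler capex, per the article's figures.
capex_2023 = 150e9
capex_2026 = 600e9
years = 3

multiple = capex_2026 / capex_2023                    # 4x over three years
cagr = (capex_2026 / capex_2023) ** (1 / years) - 1   # compound annual rate

print(f"{multiple:.1f}x increase")            # 4.0x increase
print(f"{cagr:.0%} compound annual growth")   # 59% compound annual growth
```

Roughly 59% compounded annually - a pace almost no capital-intensive industry has sustained for long.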

What the Money Buys

About 75% of the spending, roughly $450 billion, is directly tied to AI infrastructure. In Alphabet’s case, CFO Anat Ashkenazi broke it down as approximately 60% on servers (primarily GPUs from NVIDIA) and 40% on data centers and networking equipment.
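Applying that rough 60/40 split to the midpoint of Alphabet’s 2026 guidance gives a sense of the dollar amounts involved (a sketch using the article’s figures; the split is the CFO’s approximation, not a precise allocation):

```python
# Rough dollar breakdown of Alphabet's 2026 capex guidance ($175B-$185B),
# using the approximate 60/40 split described by CFO Anat Ashkenazi.
capex_mid = (175e9 + 185e9) / 2   # $180B midpoint of guidance

servers = 0.60 * capex_mid        # ~$108B on servers and accelerators
facilities = 0.40 * capex_mid     # ~$72B on data centers and networking

print(f"servers: ${servers / 1e9:.0f}B, facilities: ${facilities / 1e9:.0f}B")
```

That puts the server line item alone at more than Alphabet’s entire 2024 capex budget.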

Sundar Pichai described the company’s biggest concern as “compute capacity and all the constraints - be it power, land, supply chain constraints” and the difficulty of “ramp[ing] up to meet this extraordinary demand.” He said Google would “go through the year in a supply constrained way,” meaning even $185 billion might not buy enough capacity fast enough.

The demand is real. Google Cloud’s backlog doubled year-over-year to $240 billion. Cloud revenue grew 48% to $17.7 billion in Q4 alone. Gemini now has over 750 million monthly active users. These aren’t speculative investments - they’re responses to existing customer demand that outpaces what Google can currently deliver.

But “demand is real” and “spending is wise” aren’t the same thing.

The Revenue Math Problem

Here’s the uncomfortable arithmetic. Alphabet’s total revenue for 2025 was about $400 billion - the first time it crossed that mark. Its net income was $132.2 billion. It’s planning to spend $175-185 billion on capex alone this year.

That means Alphabet is spending roughly half its annual revenue, and more than its entire net income, on infrastructure. The company isn’t going bankrupt - it has massive cash reserves and the business still prints money. But the ratio of investment to return is getting increasingly aggressive.
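The two ratios behind that claim are straightforward to check. A sketch using the article’s 2025 figures and the midpoint of the 2026 guidance:

```python
# Alphabet's planned capex relative to revenue and net income,
# using the article's figures (2025 results, 2026 guidance midpoint).
revenue = 400e9        # 2025 revenue (approx.)
net_income = 132.2e9   # 2025 net income
capex_mid = (175e9 + 185e9) / 2   # $180B midpoint of 2026 guidance

capex_to_revenue = capex_mid / revenue       # ~45% of annual revenue
capex_to_income = capex_mid / net_income     # ~1.36x annual net income

print(f"capex / revenue:    {capex_to_revenue:.0%}")
print(f"capex / net income: {capex_to_income:.2f}x")
```

At roughly 45% of revenue and 1.36 times net income, the plan only works because of Alphabet’s cash reserves and operating cash flow - the business funds the bet, not the profits alone.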

Morgan Stanley maintained an overweight rating, calling it another example of “big players pulling away from the pack.” Bernstein bumped its target to $345 but warned that some investors now think Alphabet is overspending. The market’s reaction - a 5% drop despite strong earnings - suggests the latter camp is growing.

The bull case: AI is a once-in-a-generation platform shift and the companies that build capacity now will dominate the next decade. The bear case: these companies are caught in a spending spiral where nobody can afford to stop because their competitors haven’t, regardless of whether the returns justify the investment.

The Environmental Toll

$600 billion buys a lot of concrete, copper, and electricity.

The International Energy Agency projects that data center energy consumption will roughly double by 2026, rising from 460 terawatt-hours to around 1,000 TWh. That’s roughly equivalent to Japan’s entire electricity consumption.
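The magnitude of that projected jump is easy to understate. A quick sketch using the IEA figures quoted above:

```python
# Scale of the projected growth in data-center electricity demand,
# per the IEA figures cited in the article (460 TWh -> ~1,000 TWh).
twh_baseline = 460
twh_2026 = 1000

growth = twh_2026 / twh_baseline - 1   # ~117% increase

# Japan's annual electricity consumption is on the order of 1,000 TWh,
# which is why the article uses it as a yardstick (rough figure).
print(f"{growth:.0%} increase in data-center electricity demand")
```

A 117% increase in a few years, landing on a footprint comparable to a G7 economy’s entire grid.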

Google’s data centers used nearly 6 billion gallons of water for cooling in 2024. The company’s data center emissions have surged 48% compared with 2019 levels. And that was before it announced plans to double capacity.

Every company in the hyperscaler cohort has made net-zero pledges. Google committed to running on carbon-free energy by 2030. Meta has similar targets. But when capex doubles in a single year, the gap between ambition and emissions widens faster than renewable energy deployment can close it.

The power demands are so significant that they’re reshaping energy policy. Data center operators are signing deals to restart shuttered nuclear plants, building dedicated solar farms, and negotiating directly with utilities for gigawatt-scale power contracts. In some regions, data center demand is competing with residential and industrial electricity needs.

The Privacy Dimension

The infrastructure build-out serves AI workloads that increasingly depend on personal data.

Google has started prompting users to grant Gemini access to personal content from Gmail and Google Photos for model improvement. The company draws a line between consumer and enterprise data policies - Workspace business data isn’t used for training, but consumer data from Gemini apps can be reviewed by human annotators and retained for up to three years.

The more compute capacity these companies build, the more data they need to justify the investment. Bigger models require more training data. More users generating more interactions creates more data. The flywheel between infrastructure investment and data collection is not a coincidence - it’s the business model.

This is worth thinking about clearly: the $600 billion isn’t being spent to process data that already exists. It’s being spent to enable new AI services that will generate new data from billions of users. Every Gemini query, every AI-generated email draft, every photo enhancement creates interaction data that feeds the next generation of models.

What DeepSeek Proves About the Arms Race

While American companies race to spend the most, China’s DeepSeek is demonstrating that algorithmic innovation can substitute for raw capital expenditure. Its Engram architecture offloads knowledge storage to cheap system RAM instead of expensive GPU memory. Its upcoming V4 model reportedly runs on consumer hardware - dual RTX 4090s - while competing with models that need data center infrastructure.

DeepSeek’s R1 model last year triggered a $1 trillion tech stock selloff precisely because it challenged the assumption that more spending equals better AI. If a Chinese lab operating under US export controls can match frontier performance at a fraction of the cost, what does that say about the necessity of $185 billion budgets?

This isn’t an argument that the spending is worthless. Scale matters, especially for serving hundreds of millions of users. But it does raise the question of whether the hyperscaler spending race is driven more by competitive pressure than by technical necessity. None of these companies can afford to spend less than their rivals - the arms race logic requires everyone to keep bidding up.

What This Means

The $600 billion figure represents a bet that AI will become the primary interface for computing, the way search became the primary interface for the web. If that bet pays off, the companies that built capacity will own the platform.

If it doesn’t - if AI products plateau, if regulation tightens, if users resist the data extraction that funds the models - then 2026 will be remembered as the year Big Tech overcommitted to a future that didn’t arrive on schedule. The dot-com bubble didn’t prove the internet was worthless, but it did prove that timing and capital allocation matter.

For the rest of us, the implications are more concrete:

Higher cloud costs are coming. Somebody pays for $600 billion in infrastructure, and it’s ultimately cloud customers and advertisers. Google Cloud prices have already been rising.

Energy competition will intensify. Data centers are claiming an increasing share of electrical grid capacity. In some regions, this means higher electricity prices or delayed grid improvements for everyone else.

The data extraction will accelerate. These investments only pay off if the AI products they enable attract and retain billions of users. That requires data - lots of it - and the incentive to collect more will only grow as the infrastructure costs mount.

Concentration increases. A $185 billion capex budget is an extraordinary barrier to entry. The gap between hyperscalers and everyone else widens with every spending cycle. Startups and mid-size companies increasingly depend on these same platforms for their own AI capabilities.

The numbers are too big to be boring. $600 billion is a statement about what these companies believe the future looks like - and about what they’re willing to stake to own it.