The four largest US tech companies will spend roughly $650 billion on AI infrastructure in 2026 - a 67% jump from last year. That spending is transforming these cash machines into something that looks more like capital-intensive utilities than software companies.
The Numbers
Amazon leads the pack with $200 billion in planned capital expenditures for 2026. Alphabet follows with $175-185 billion. Meta’s guidance sits at $115-135 billion. Microsoft’s annual run rate puts it on track for $145 billion.
Combined, that’s $635-665 billion flowing into data centers, AI chips, and networking equipment - up roughly two-thirds from the approximately $380 billion these companies committed in 2025.
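As a quick sanity check on the arithmetic, the per-company figures above do sum to the combined range. A minimal sketch using only the article's numbers (single figures treated as point estimates):

```python
# Planned 2026 capex from the article, in billions of USD, as (low, high) ranges.
capex = {
    "Amazon": (200, 200),
    "Alphabet": (175, 185),
    "Meta": (115, 135),
    "Microsoft": (145, 145),
}

low = sum(lo for lo, hi in capex.values())
high = sum(hi for lo, hi in capex.values())
print(f"Combined 2026 capex: ${low}-{high} billion")  # $635-665 billion

# Growth versus the roughly $380 billion spent in 2025.
prior = 380
print(f"Growth: {low / prior - 1:.0%} to {high / prior - 1:.0%}")  # +67% to +75%
```

The low end of the range lines up with the headline "67% jump"; the high end is closer to 75%.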
Where the Money Goes
The vast majority funds one thing: AI compute. NVIDIA GPUs, servers to house them, and data centers to power them. Each company is racing to secure enough hardware to train and run AI models at scale.
This is not optional spending. Each company believes whoever builds the most AI capacity wins, and each is betting its cash flows that AI will generate enough revenue to justify the investment.
The Cash Flow Crater
These bets are already hitting the balance sheets hard.
Amazon faces negative free cash flow of $17-28 billion in 2026, according to Morgan Stanley and Bank of America analysts. The company that once generated rivers of cash is now spending more than it brings in.
Alphabet’s situation may be worse. Pivotal Research projects its free cash flow will plummet nearly 90% this year - from $73.3 billion in 2025 to $8.2 billion in 2026. Google’s parent company will generate less excess cash than many mid-sized software companies.
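The "nearly 90%" figure follows directly from the two projections. A one-line check, using only the numbers cited above:

```python
# Pivotal Research's free cash flow projections for Alphabet, in billions of USD.
fcf_2025 = 73.3
fcf_2026 = 8.2

decline = (fcf_2025 - fcf_2026) / fcf_2025
print(f"Projected FCF decline: {decline:.1%}")  # 88.8%
```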
Meta and Microsoft face similar pressure. While their cash flow situations aren’t as dire, both are seeing significant declines as capital expenditures consume cash that used to flow to shareholders.
The Strategy
The logic behind this spending rests on a few assumptions:
First, AI workloads will generate enough revenue to justify the investment. Cloud compute sold to AI labs like OpenAI, enterprise AI services, and AI-enhanced products need to bring in billions in new revenue.
Second, the companies that build the most infrastructure will have competitive advantages that slower-moving rivals can’t match. AI model training at scale requires hardware that takes years to acquire.
Third, memory and chip supply constraints will ease. Current shortages are forcing companies to lock in capacity years in advance, committing capital before they know exactly how it will be used.
Who Wins, Who Loses
The immediate winners are NVIDIA and the semiconductor supply chain. NVIDIA’s data center revenue has exploded as these companies scramble for GPUs. Memory makers like SK Hynix and Micron have sold out their high-bandwidth memory production through 2026.
The losers - at least for now - are shareholders expecting the cash returns that made these stocks so popular. Stock buybacks and dividend growth may slow as capital goes to data centers instead.
The bigger question is whether this spending pays off. If AI generates the revenue these companies expect, the infrastructure investments will look prescient. If AI growth disappoints or commoditizes faster than expected, these companies will have spent hundreds of billions on capacity they don’t need.
Big Tech is betting the house on AI. They’re spending like the future depends on it - because they believe it does.