Nvidia Smashes Q4 with $68 Billion Revenue, Declares 'Agentic AI Inflection Point'

The chipmaker beats Wall Street expectations again and announces a Vera Rubin platform promising 10x better inference efficiency, but the stock dips anyway

Nvidia reported its fourth consecutive quarter of record-breaking results yesterday, posting $68.1 billion in revenue and declaring that “the agentic AI inflection point has arrived.” The numbers beat Wall Street expectations by roughly $3 billion. And somehow, the stock still fell.

The Numbers

Fourth quarter fiscal 2026 revenue hit $68.1 billion, up 73% year-over-year and 20% from Q3. Data center revenue - the segment that actually matters - grew 75% to $62.3 billion. Earnings came in at $1.62 per share, up 82% from a year ago.

For the full fiscal year, Nvidia pulled in $215.9 billion, a 65% jump from fiscal 2025.

The guidance for Q1 fiscal 2027 landed at $78 billion, plus or minus 2%. Analysts had expected $72.6 billion. That’s 77% year-over-year growth projected for the current quarter - an acceleration from Q4’s already staggering pace.

Vera Rubin Takes the Stage

CEO Jensen Huang used the earnings call to formally introduce the Vera Rubin platform, Nvidia’s next-generation AI architecture that will replace Blackwell starting in late 2026.

The pitch: 10x better performance per watt for inference workloads. At a time when data centers are desperate for power and AI companies are spending billions just to keep the lights on, that efficiency gain matters.

Vera Rubin comprises six new chips designed to slash inference token costs - the actual operational expense of running AI models in production. Huang called Nvidia “the king of inference today” with Blackwell and promised Vera Rubin “will extend that leadership even further.”

The timing isn’t coincidental. Competitors from AMD to Google’s TPU team to a new crop of startups like MatX are gunning for Nvidia’s dominance. Vera Rubin is the moat-widening exercise.

The Agentic AI Thesis

Huang’s key soundbite: “Computing demand is growing exponentially - the agentic AI inflection point has arrived.”

Translation: AI models aren’t just answering questions anymore. They’re taking actions, running tools, browsing the web, writing code. Every agentic task multiplies the inference load. A single user session that once meant one API call might now mean dozens as an agent reasons through a multi-step workflow.

This is the growth story Nvidia is selling Wall Street. Not just more AI models, but AI models that think longer, try more approaches, and run more often. The demand curve isn’t linear - it’s exponential.

Why the Stock Fell Anyway

Despite beating expectations across the board, Nvidia shares dropped as much as 1.5% during the analyst call.

Part of the answer is that expectations are now so high that “crushing estimates” is already priced in. Another part: Nvidia explicitly said it’s not counting on China data center revenue in its forecasts, acknowledging the ongoing export restrictions.

But mostly, it’s the nature of momentum stocks. When you’ve delivered this consistently, the bar keeps rising. Anything short of a blowout gets met with a shrug.

What It Means

The AI infrastructure boom isn’t slowing. If anything, the shift to agentic AI is accelerating it. Nvidia’s $78 billion guidance for Q1 alone is more than the roughly $61 billion the entire company generated in all of fiscal 2024.

But the competition is heating up. AMD just landed a $100 billion deal with Meta. MatX is promising 10x better LLM training performance. Google’s TPU team keeps iterating. Intel is trying to claw back relevance.

Nvidia remains the dominant force in AI chips. The question is whether anyone can erode that position before Vera Rubin ships and resets the competitive landscape all over again.