AI News: GTC Day 3 Showcases Open Models and Physical AI, Micron Earnings Test Memory Demand

Daily roundup for March 18, 2026 covering Nvidia's open models panel and robotics announcements, Micron's crucial AI memory earnings, and the mystery trillion-parameter model Hunter Alpha

Top Stories

GTC Day 3: Open Models Panel and Physical AI Data Factory

Nvidia’s GTC continues with a packed Wednesday schedule. At 12:30 PM PT, Jensen Huang moderates a panel on open models with Harrison Chase (LangChain), A16Z leaders, AI2, Cursor, and Thinking Machines Lab. The discussion centers on how open-source models are reshaping enterprise AI adoption.

On the hardware side, Nvidia announced the Physical AI Data Factory Blueprint, an open reference architecture that automates how training data is generated, augmented, and evaluated for robotics and autonomous systems. The goal is reducing costs and complexity for training physical AI at scale.

Key partners adopting the blueprint include FieldAI, Hexagon Robotics, Linker Vision, Milestone Systems, Skild AI, Uber, and Teradyne Robotics. This signals growing momentum behind Nvidia’s vision of becoming the infrastructure layer for everything that moves - from warehouse robots to autonomous vehicles.

The company also expanded its NemoClaw platform, which now allows single-command deployment of OpenClaw agents across local and cloud models. OpenShell handles inference routing between local GPUs and cloud endpoints using defined policies.
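Nvidia hasn't published the routing logic, but policy-based routing between local GPUs and cloud endpoints typically reduces to a small set of rules over request attributes. The sketch below is purely illustrative; the policy fields, thresholds, and backend names are assumptions, not Nvidia's actual API.

```python
# Hypothetical sketch of policy-based inference routing between a local GPU
# and a cloud endpoint. All names, fields, and thresholds are illustrative
# assumptions -- not the actual NemoClaw/OpenShell API.
from dataclasses import dataclass

@dataclass
class RoutingPolicy:
    max_local_context: int   # requests above this token count go to the cloud
    privacy_required: bool   # privacy-sensitive workloads stay on local GPUs

def route(prompt_tokens: int, sensitive: bool, policy: RoutingPolicy) -> str:
    """Return which backend a request should hit under the policy."""
    if sensitive and policy.privacy_required:
        return "local-gpu"
    if prompt_tokens > policy.max_local_context:
        return "cloud-endpoint"
    return "local-gpu"

policy = RoutingPolicy(max_local_context=32_000, privacy_required=True)
print(route(8_000, sensitive=False, policy=policy))    # local-gpu
print(route(200_000, sensitive=False, policy=policy))  # cloud-endpoint
print(route(200_000, sensitive=True, policy=policy))   # local-gpu
```

The appeal of a declarative policy layer like this is that the same agent code runs unchanged whether a request lands on local hardware or a cloud endpoint.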

Source: NVIDIA Blog, NVIDIA Newsroom

Micron Earnings: The AI Memory Demand Test

Micron Technology reports fiscal Q2 2026 results after market close today. Analysts expect a 138% revenue spike to $19.2 billion and earnings of $8.65 per share - a 5.5x jump from a year ago.
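For context, the growth rates quoted above imply the year-ago baseline. This is a back-of-envelope check only; rounding in the analyst consensus means the actual prior-year figures may differ slightly.

```python
# Back-of-envelope: derive the implied year-ago figures from the consensus
# estimates and growth rates quoted above. Estimates only -- analyst
# rounding means actual prior-year results may differ slightly.
expected_revenue_b = 19.2   # consensus revenue, $ billions
revenue_growth = 1.38       # 138% year-over-year increase
expected_eps = 8.65         # consensus EPS, $
eps_multiple = 5.5          # "5.5x jump from a year ago"

prior_revenue_b = expected_revenue_b / (1 + revenue_growth)
prior_eps = expected_eps / eps_multiple

print(f"Implied year-ago revenue: ${prior_revenue_b:.1f}B")  # ~$8.1B
print(f"Implied year-ago EPS:     ${prior_eps:.2f}")         # ~$1.57
```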

The numbers reflect explosive demand for high-bandwidth memory (HBM) that powers AI accelerators. Micron’s HBM3E production for Nvidia’s H200, GB200, and GB300 GPUs is 100% sold out through the remainder of 2026 under binding long-term agreements.

Memory prices tell the story: Counterpoint Research reports DRAM and NAND flash prices jumped 90% in Q1 2026 versus Q4 2025, with further increases expected. Wedbush has a $500 price target on Micron, calling it the “picks and shovels” play for AI infrastructure.

The earnings report arrives as Nvidia projects $1 trillion in chip orders through 2027, with memory supply being a key bottleneck. How Micron navigates capacity constraints while meeting AI demand will signal whether the memory industry can keep pace with the AI buildout.

Source: Motley Fool, Financial Content

Hunter Alpha: Is DeepSeek Quietly Testing a Trillion-Parameter Model?

A mysterious AI model called Hunter Alpha appeared anonymously on OpenRouter last week, triggering widespread speculation. The listing describes a trillion-parameter system with a one-million-token context window, optimized for agent workflows.
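Those claimed specs imply serious hardware requirements. The sketch below is back-of-envelope math only; the architecture details (FP8 storage, layer count, grouped-query attention dimensions) are assumptions for illustration, since Hunter Alpha's real configuration is unknown.

```python
# Back-of-envelope memory math for the claimed specs. Architecture details
# (FP8 storage, layers, KV heads, head dim) are assumptions for
# illustration; Hunter Alpha's real configuration is unknown.
params = 1e12                # claimed one trillion parameters
bytes_per_param = 1          # FP8 weight storage (assumed)
weights_gb = params * bytes_per_param / 1e9
print(f"Weights alone: ~{weights_gb:.0f} GB")  # ~1000 GB

# Hypothetical per-token KV cache with grouped-query attention:
layers, kv_heads, head_dim, bytes_per_val = 64, 8, 128, 1
kv_per_token = 2 * layers * kv_heads * head_dim * bytes_per_val  # K and V
context = 1_000_000          # claimed context window, tokens
kv_cache_gb = kv_per_token * context / 1e9
print(f"KV cache at 1M tokens: ~{kv_cache_gb:.0f} GB per sequence")  # ~131 GB
```

Even under these favorable assumptions, serving a single full-context request would span multiple high-end accelerators, which is why context windows this long usually lean on aggressive KV-cache compression or sparse-attention tricks.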

Evidence points toward DeepSeek. The specs align with leaked details about the upcoming DeepSeek V4, and a “V4 Lite” variant briefly appeared on DeepSeek’s own website on March 9 - days before Hunter Alpha launched. DeepSeek has a history of anonymous releases followed by official announcements.

Chinese AI lab Whale Lab now expects DeepSeek V4 and Tencent's new Hunyuan model to launch in April 2026; earlier Financial Times reports had suggested a March release. V4 is reportedly multimodal, handling text, images, and video, with improved coding capabilities and long-term memory.

If Hunter Alpha is indeed a stealth test of DeepSeek V4, it suggests China’s AI labs continue closing the gap with Western frontier models despite export restrictions. The trillion-parameter scale would match the largest proprietary models from OpenAI and Google.

Source: Medium, Geeky Gadgets, Dataconomy

Quick Hits

  • Google Personal Intelligence for free users: Gemini’s Personal Intelligence feature is rolling out to free users in the US, letting Gemini personalize responses using data from Gmail, Google Photos, YouTube, and connected apps. MacRumors

  • OpenAI releases GPT-5.4 mini and nano: Described as OpenAI’s “most capable small models yet,” these are optimized for high-volume workloads where speed and cost efficiency matter. 9to5Mac

  • Claude usage limits doubled through March 27: Anthropic is offering doubled limits during off-peak hours - weekdays outside 8AM-2PM Eastern, and 24/7 on weekends. Engadget

  • Anthropic found 24,000 fraudulent accounts from Chinese AI labs: Accounts allegedly created by DeepSeek, Moonshot AI, and MiniMax generated over 16 million interactions with Claude. Tom’s Hardware

  • State chatbot regulation heating up: Chatbot safety bills are advancing in multiple states including Oregon, Washington, New York, Colorado, and Michigan as the 2026 legislative session picks up. Transparency Coalition

Worth Watching

GTC continues through Thursday with the open models panel today and hundreds of sessions on robotics, autonomous vehicles, and enterprise AI deployment. The Physical AI Data Factory Blueprint suggests Nvidia sees the next battleground not in chatbots but in systems that interact with the physical world.

Micron’s earnings call at 4:30 PM ET will reveal how memory supply constraints are affecting the AI buildout. If HBM3E demand outstrips even optimistic projections, it could signal that AI infrastructure spending is accelerating beyond current capacity - or that customers are over-ordering to secure supply.

The Hunter Alpha mystery bears watching. If DeepSeek is testing V4 anonymously, the formal launch in April could reset expectations about Chinese AI capabilities. A trillion-parameter multimodal model would directly compete with GPT-5.4 and Claude Opus 4.6.