We're starting to live in the Matrix: Key takeaways from Jensen Huang's keynote

Physical AI is the theme of 2026: robots learn in VR, the autopilot reasons about the road, the Vera Rubin chip doubles the performance of market leaders, and game NPCs can hear your voice.

Author: Michael Kokin

If last year was the year of agents, 2026 is the year of robots that learn in virtual reality and then create new, even more powerful robots. Jensen Huang said it himself: "Everything that moves will become autonomous." Here are the key points from his keynote.

Robots are the new LLMs

The main bottleneck isn't "brains" anymore (the models are already smart); it's bodies and data. Training a robot in the real world is slow and expensive, so NVIDIA proposes training it in Omniverse, a physically accurate simulation. There a robot lives through 10,000 years of experience in a couple of days, makes every possible mistake, and then loads those reflexes into a physical body. Literally "The Matrix" for machines.
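The "10,000 years in a couple of days" claim is really arithmetic about parallel simulation. A minimal sketch of that arithmetic; the per-environment speedup and all numbers here are purely illustrative, not figures from the keynote:

```python
# Toy arithmetic behind "10,000 years of experience in a couple of days":
# total simulated time = parallel_envs * wall_clock_time * per_env_speedup
SECONDS_PER_YEAR = 365 * 24 * 3600

def required_parallel_envs(sim_years, wall_days, speedup_per_env):
    """How many simulated robots must run in parallel to accumulate
    `sim_years` of experience in `wall_days`, if each environment
    runs `speedup_per_env` times faster than real time."""
    sim_seconds = sim_years * SECONDS_PER_YEAR
    wall_seconds = wall_days * 24 * 3600
    return sim_seconds / (wall_seconds * speedup_per_env)

# e.g. 10,000 years in 2 days, each env at 100x real time:
print(round(required_parallel_envs(10_000, 2, 100)))  # prints 18250
```

So the headline number is entirely plausible once thousands of simulated copies of the robot run in parallel, each faster than real time.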

Alpamayo: a reasoning autopilot

The new autopilot model is a reasoning model: it doesn't just see obstacles, it *reasons* about the road and understands *why* it needs to brake. For safety, they invented Dual Stack: a smart neural network plus classic "hard" deterministic code. The moment the neural net starts doubting, the deterministic algorithm takes over. The Mercedes running this system has already been called the safest car on the road.
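The dual-stack arbitration can be sketched in a few lines. The function name, threshold, and actions below are hypothetical illustrations of the pattern, not NVIDIA's actual safety stack:

```python
def dual_stack(nn_action, nn_confidence, fallback_action, threshold=0.9):
    """Hypothetical dual-stack arbiter: trust the neural planner only
    when its confidence clears a safety threshold; otherwise the
    deterministic rule-based stack takes over."""
    if nn_confidence >= threshold:
        return nn_action, "neural"
    return fallback_action, "rules"

# Confident neural net: its plan goes through.
print(dual_stack("keep_lane", 0.97, "brake"))  # ('keep_lane', 'neural')
# Doubting neural net: the hard-coded fallback wins.
print(dual_stack("keep_lane", 0.55, "brake"))  # ('brake', 'rules')
```

The design point is that the fallback path is simple enough to verify exhaustively, which is exactly what a learned model can't offer.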

Vera Rubin chip

Named after Vera Rubin, the astronomer whose galaxy-rotation measurements gave the first strong evidence for dark matter. The new Vera CPU is claimed to be twice as powerful as market leaders, and data centers can now be cooled with plain warm water (45°C) instead of giant chillers.

Siemens and recursion

NVIDIA and Siemens are bringing "agentic AI" into manufacturing: AI agents help design the chips that run the factories that build robots. The factory of the future is itself one giant robot, orchestrating the smaller robots working inside it. A cybernetic matryoshka.

BioNeMo: generating proteins and drugs

The idea is to turn biology into an engineering problem. Instead of spending years testing drugs in test tubes, simulate the processes in silicon. The neural network generates protein and drug-candidate structures the same way ChatGPT generates essays, only in the language of chemistry.
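The "ChatGPT in the language of chemistry" analogy can be shown with a toy generator over the 20-letter amino-acid alphabet. A real model like those in BioNeMo is a trained transformer scoring each next residue; this sketch just samples uniformly to illustrate the text-generation framing, nothing more:

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def generate_protein(length, seed=None):
    """Toy 'language model' over the amino-acid alphabet: emit one
    residue at a time, the way a text model emits one token at a time.
    (Uniform sampling here; a trained model would weight each choice.)"""
    rng = random.Random(seed)
    return "".join(rng.choice(AMINO_ACIDS) for _ in range(length))

seq = generate_protein(12, seed=42)
print(seq)  # a reproducible 12-residue sequence
```

Swap the uniform `rng.choice` for a learned next-residue distribution and you have the skeleton of generative protein design.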

ACE: talking to game characters in real time via microphone

Scripted dialogues are disappearing from games. ACE technology gives NPCs a "brain": the character listens to your voice, understands context, and generates a response on the fly. No more selecting dialogue options — you can actually argue with an RPG merchant, and they'll remember it.
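A minimal sketch of that NPC loop, with hearing, memory, and a stubbed response generator standing in for ACE's speech-to-text, LLM, and text-to-speech stages (all names here are hypothetical):

```python
class NPC:
    """Toy ACE-style NPC: transcribed player speech goes into a rolling
    context, and a reply is generated from that context. The `respond`
    logic is a stub; in a real pipeline it would be an LLM call."""

    def __init__(self, name):
        self.name = name
        self.memory = []  # rolling conversation context, (speaker, text)

    def hear(self, player_utterance):
        # In ACE this would be speech-to-text on the mic input.
        self.memory.append(("player", player_utterance))

    def respond(self):
        # Stub generation: acknowledge the last thing the player said.
        last = self.memory[-1][1]
        reply = f"You said '{last}', and I won't forget it."
        self.memory.append((self.name, reply))
        return reply

merchant = NPC("merchant")
merchant.hear("Your prices are outrageous!")
print(merchant.respond())
```

The point of the sketch is the memory list: because every exchange lands in the context, the merchant can hold a grudge across the whole conversation.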

The bottom line

NVIDIA has definitively stopped being a graphics card company. Jensen is building infrastructure where reality is first digitized, computed on Vera Rubin chips, then returned to the world as robot actions.

Full keynote: https://youtu.be/uDNXjnOqJ-A