NVIDIA and U.S. Energy Secretary Chris Wright announced two AI supercomputers at Argonne National Laboratory. The Solstice system will pack 100,000 NVIDIA Vera Rubin GPUs, delivering 5,000 exaflops, a figure NVIDIA says is five times the combined performance of the entire TOP500 list.
Key facts
- Solstice: 100,000 NVIDIA Vera Rubin GPUs delivering 5,000 exaflops
- Equinox: 10,000 NVIDIA Grace Blackwell GPUs, now being stood up
- Open-source AI model: trained on 1.5M physics papers, fine-tuned on 100K fusion papers
- DOE partnership spans 17 national labs and two decades of NVIDIA collaboration
- U.S. electricity production barely grew over the past 20 years even as oil output tripled
U.S. Energy Secretary Chris Wright and NVIDIA Vice President Ian Buck made the case Thursday at the SCSP AI+ Expo: American AI leadership runs through American energy leadership. The 30-minute fireside chat, moderated by SCSP president Ylli Bajraktari, outlined the Genesis Mission, a DOE effort applying AI to scientific discovery across 17 national labs.
The Supercomputers
NVIDIA and the DOE are building two AI supercomputers together at Argonne. The first, Equinox, is being stood up now with 10,000 NVIDIA Grace Blackwell GPUs, hardware Buck called "the same GPU, the same software being used to train and build AI that we're all enjoying today." The second, Solstice, will scale to 100,000 NVIDIA Vera Rubin GPUs.
"To put that 100,000 in perspective on the next-generation GPU, which is dedicated to science, it's 5,000 exaflops," Buck said. "That's a big number that actually is five times larger than the entire TOP500 supercomputer list combined." [According to Powering the Next American Century]
The Open-Source AI Model
Buck described an open-source NVIDIA AI model trained on 1.5 million physics papers, then fine-tuned on 100,000 papers specifically about fusion. The result is a specialized AI agent DOE researchers can interrogate to advance their work faster. "We're creating all the same technology, all the same hardware, all the same software building blocks used by all the major AI labs around the world," Buck said, "for all of world science to go get access to."
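For readers curious what that two-stage recipe looks like in practice, below is a minimal, hypothetical sketch of the fine-tuning stage using the Hugging Face stack. The article names neither the model nor the datasets, so the checkpoint name, corpus path, and record schema here are placeholders, not NVIDIA's actual pipeline.

```python
# A minimal, hypothetical sketch of the fine-tuning stage: continue training
# a physics-pretrained causal LM on a narrower fusion corpus. The checkpoint
# name and corpus path below are placeholders; the article names neither.

from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "physics-base-llm"        # placeholder for the physics-pretrained model
FUSION_CORPUS = "fusion_papers.jsonl"  # placeholder: ~100K fusion papers, one JSON per line

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
if tokenizer.pad_token is None:        # causal LMs often ship without a pad token
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

dataset = load_dataset("json", data_files=FUSION_CORPUS, split="train")

def tokenize(batch):
    # Assumes each record has a "text" field holding the paper body.
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="fusion-finetune", num_train_epochs=1,
                           per_device_train_batch_size=1, bf16=True),
    train_dataset=tokenized,
    # mlm=False gives standard next-token (causal) language-modeling labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The key design point the article implies: the specialization comes entirely from the data mix (1.5M general physics papers, then 100K fusion papers), not from any fusion-specific architecture.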

The Energy Challenge
Over the last 20 years, Wright said, the U.S. has tripled oil production and doubled natural gas production, but has barely grown electricity production. That's a problem because, as Wright put it, the energy source that matters most now is the one that powers AI infrastructure: electricity. "Energy is life," Wright said. "The more energy you have, the more affordable energy you have, the more opportunities you have in your society."

Unique Take
The structural observation here: NVIDIA is using the same silicon and software stack for scientific supercomputing as for frontier AI training. This is not a separate supply chain; the Vera Rubin GPUs powering Solstice are the same architecture shipping to hyperscalers. That means DOE scientists get access to the same class of compute used to train frontier models like GPT-5 and Claude, but pointed at physics simulation rather than language modeling.

What to watch
Watch for Solstice's deployment timeline and whether the DOE discloses its power consumption. The 5,000-exaflop claim will face scrutiny when benchmark results are published. Also track whether the open-source fusion model drives measurable progress in fusion research within 12 months.
