gentic.news — AI News Intelligence Platform



NVIDIA, DOE Build 100K-GPU Supercomputer for Science

DOE and NVIDIA announced Solstice, a 100K-GPU Vera Rubin supercomputer delivering 5,000 exaflops, and Equinox with 10K Blackwell GPUs.

1d ago · 3 min read · AI-Generated
Source: blogs.nvidia.com via nvidia_dc_blog (single source)
What AI supercomputers are the DOE and NVIDIA building together?

The U.S. Department of Energy and NVIDIA are building two AI supercomputers at Argonne: Equinox with 10,000 Grace Blackwell GPUs and Solstice with 100,000 Vera Rubin GPUs delivering 5,000 exaflops.

TL;DR

DOE and NVIDIA building 100K-GPU Solstice supercomputer · Solstice delivers 5,000 exaflops, 5x TOP500 combined · NVIDIA open-sources AI model trained on 1.5M physics papers

NVIDIA and U.S. Energy Secretary Chris Wright announced two AI supercomputers at Argonne National Laboratory. The Solstice system will pack 100,000 NVIDIA Vera Rubin GPUs, delivering 5,000 exaflops — five times the entire TOP500 list combined.

Key facts

  • Solstice: 100,000 NVIDIA Vera Rubin GPUs delivering 5,000 exaflops
  • Equinox: 10,000 NVIDIA Grace Blackwell GPUs, now being stood up
  • Open-source AI model: trained on 1.5M physics papers, fine-tuned on 100K fusion papers
  • DOE partnership spans 17 national labs and two decades of NVIDIA collaboration
  • U.S. electricity production barely grew despite tripling oil output over 20 years

U.S. Energy Secretary Chris Wright and NVIDIA Vice President Ian Buck made the case Thursday at the SCSP AI+ Expo: American AI leadership runs through American energy leadership. The 30-minute fireside chat, moderated by SCSP president Ylli Bajraktari, outlined the Genesis Mission, a DOE effort applying AI to scientific discovery across 17 national labs.

The Supercomputers

NVIDIA and the DOE are building two AI supercomputers together at Argonne. The first, Equinox, is being stood up now with 10,000 NVIDIA Grace Blackwell GPUs — what Buck called "the same GPU, the same software being used to train and build AI that we're all enjoying today." The second, Solstice, will use 100,000 GPUs with NVIDIA Vera Rubin.

"To put that 100,000 in perspective on the next-generation GPU, which is dedicated to science, it's 5,000 exaflops," Buck said. "That's a big number that actually is five times larger than the entire TOP500 supercomputer list combined." [According to Powering the Next American Century]
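A quick sanity check of the quoted arithmetic (a sketch; it assumes the "exaflops" figure refers to low-precision AI throughput such as FP4, since TOP500 rankings are measured in FP64 and the comparison is otherwise apples-to-oranges):

```python
# Back-of-envelope check on Buck's figures for Solstice.
SOLSTICE_GPUS = 100_000
SOLSTICE_EXAFLOPS = 5_000

# Implied throughput per Vera Rubin GPU, in petaflops
# (1 exaflop = 1,000 petaflops).
per_gpu_pflops = SOLSTICE_EXAFLOPS * 1_000 / SOLSTICE_GPUS
print(per_gpu_pflops)  # 50.0 PF per GPU

# Implied combined TOP500 figure, if Solstice is "five times larger".
implied_top500_exaflops = SOLSTICE_EXAFLOPS / 5
print(implied_top500_exaflops)  # 1000.0
```

The implied 50 petaflops per GPU only makes sense as a low-precision AI figure; the FP64-measured TOP500 list totals on the order of single-digit exaflops, not 1,000, which is why the precision assumption matters when weighing the "five times larger" claim.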

The Open-Source AI Model

Buck described an open-source NVIDIA AI model trained on 1.5 million physics papers, then fine-tuned on 100,000 papers specifically about fusion. The result is a specialized AI agent DOE researchers can interrogate to advance their work faster. "We're creating all the same technology, all the same hardware, all the same software building blocks used by all the major AI labs around the world," Buck said, "for all of world science to go get access to."


The Energy Challenge

Over the last 20 years, Wright said, the U.S. has tripled oil production and doubled natural gas production — but barely grown electricity production. That's a problem because, as Wright put it, the most important source of energy is the one that powers AI infrastructure. "Energy is life," Wright said. "The more energy you have, the more affordable energy you have, the more opportunities you have in your society."


Unique Take

The structural observation here: NVIDIA is using the same silicon and software stack for scientific supercomputing as for frontier AI training. This is not a separate supply chain — the Vera Rubin GPUs powering Solstice are the same architecture shipping to hyperscalers. That means DOE scientists get access to the same compute that trains GPT-5 and Claude, but optimized for physics simulation rather than language modeling.

From left: NVIDIA’s Ian Buck, U.S. Energy Secretary Chris Wright and SCSP president Ylli Bajraktari onstage at the SCSP AI+ Expo.

What to watch

Watch for Solstice's deployment timeline and whether the DOE discloses its power consumption. The 5,000 exaflop claim will face scrutiny when benchmark results are published. Also track whether the open-source fusion model drives measurable fusion research breakthroughs within 12 months.


Sources cited in this article

  1. Powering the Next American Century (blogs.nvidia.com)
  2. Energy Secretary Chris Wright
Source: gentic.news

AI-assisted reporting. Generated by gentic.news from 2 verified sources, fact-checked against the Living Graph of 4,300+ entities. Edited by Ala Smith.


AI Analysis

The structural significance here is supply-chain unification. NVIDIA is deploying the same Vera Rubin architecture for scientific supercomputing as for frontier AI training clusters like Anthropic's 220K GPU cluster announced last week. This is not a separate product line: it's the same silicon, same CUDA stack, same networking. The DOE effectively gets a reservation on the same compute that trains GPT-5.

The open-source physics model is a smart hedge. By training on 1.5M papers and fine-tuning on 100K fusion papers, NVIDIA creates a specialized AI agent that makes DOE researchers dependent on the NVIDIA ecosystem for their daily work. That's vendor lock-in through utility, not contract.

The energy framing is convenient but real: if the U.S. can't grow electricity production to match AI demand, these supercomputers sit idle. Wright's admission that electricity production has barely grown despite oil and gas expansion is the most honest line in the chat.

