Nemotron 3 Super
NVIDIA's Nemotron 3 Super is a 120-billion-parameter open model that uses a hybrid Mamba-Transformer MoE architecture to deliver high throughput for agentic AI systems.
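A Mixture-of-Experts layer of the kind described above activates only a small fraction of its total parameters per token (per the article below, 12B active of 120B total). A minimal sketch of top-k expert routing in plain NumPy; the dimensions and router are illustrative assumptions, not NVIDIA's implementation:

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route each token to its top-k experts; only those experts run ("active" parameters)."""
    logits = x @ gate_w                          # (tokens, n_experts) router scores
    topk = np.argsort(logits, axis=-1)[:, -k:]   # top-k expert indices per token
    sel = np.take_along_axis(logits, topk, axis=-1)
    w = np.exp(sel - sel.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)           # softmax over the selected experts only
    out = np.zeros_like(x)
    for t in range(x.shape[0]):                  # explicit loop for clarity, not speed
        for j in range(k):
            out[t] += w[t, j] * experts[topk[t, j]](x[t])
    return out

rng = np.random.default_rng(0)
d, n_experts = 8, 16                             # toy sizes, not the model's
experts = [lambda v, W=rng.normal(size=(d, d)) / d: v @ W for _ in range(n_experts)]
gate_w = rng.normal(size=(d, n_experts))
y = moe_forward(rng.normal(size=(4, d)), gate_w, experts, k=2)
# k=2 of 16 experts per token -> roughly 1/8 of expert parameters are active
```

The same routing idea, scaled up, is how a 120B-parameter MoE can run with only 12B parameters touched per token.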
Signal Radar
Five-axis snapshot of this entity's footprint
Mentions × Lab Attention
Weekly mentions (solid) and average article relevance (dotted)
Timeline
1- Product Launch (Mar 12, 2026)
   120-billion-parameter open-source model released to democratize agentic AI
   - parameters: 120B
   - license: open-source
Relationships (18)
- Uses
- Deploys
- Competes With
- Developed
Recent Articles (2)
1- NVIDIA Nemotron 3 Super: 120B Hybrid Mamba-Transformer MoE with 1M Context (95 relevance)
   NVIDIA has released Nemotron 3 Super, a 120B parameter open hybrid Mamba-Transformer Mixture of Experts model with 12B active parameters and 1M token…
2- Superintelligence Podcast Launches with NVIDIA Nemotron 3 Deep Dive (91 relevance)
   The Superintelligence podcast has launched, promising in-depth interviews with AI industry leaders. Its first episode is an exclusive interview with N…
Predictions
No predictions linked to this entity.
AI Discoveries
1- Discovery (active, Mar 20, 2026): Chain reasoning: Nemotron 3 Super (65% confidence)
   CHAIN: [Nemotron 3 Super uses Mixture-of-Experts (MoE)] → [MoE is also used by Expert Divergence Learning] → [Nemotron 3 Super competes with Claude Agent] → [Claude Agent uses AI Memory Systems]
   INSIGHT: This chain reveals a hidden, two-front competitive and technological convergence. Nemotron 3 Su…
Sentiment History
| Week | Avg Sentiment | Mentions |
|---|---|---|
| 2026-W11 | 0.50 | 4 |
| 2026-W12 | 0.45 | 2 |
| 2026-W16 | 0.45 | 2 |
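For reference, the mention-weighted average sentiment across the weeks in the table can be computed directly from its rows (a quick sketch; the week keys are copied from the table above):

```python
# (sentiment, mentions) per week, taken from the Sentiment History table
weeks = {"2026-W11": (0.50, 4), "2026-W12": (0.45, 2), "2026-W16": (0.45, 2)}
total_mentions = sum(n for _, n in weeks.values())
weighted_avg = sum(s * n for s, n in weeks.values()) / total_mentions
print(round(weighted_avg, 3))  # 0.475
```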