Nemotron 3 Super vs GPT-OSS-120B
Data-driven comparison powered by the gentic.news knowledge graph
Nemotron 3 Super: ↑ rising
GPT-OSS-120B: ↑ rising
competes with (1 source)
Nemotron 3 Super (ai model) | METRIC          | GPT-OSS-120B (ai model)
5                           | Total Mentions  | 2
5                           | Last 30 Days    | 2
5                           | Last 7 Days     | 1
↑ rising                    | Momentum        | ↑ rising
Positive (+0.48)            | Sentiment (30d) | Neutral (-0.10)
Mar 11, 2026                | First Covered   | Mar 2, 2026
Nemotron 3 Super leads by 2.5x in total mentions
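The 2.5x figure follows directly from the mention counts above; a quick sketch of the arithmetic, with values copied from the metrics table:

```python
# Mention counts and 30-day sentiment scores from the comparison table.
nemotron = {"mentions": 5, "sentiment": 0.48}
gpt_oss = {"mentions": 2, "sentiment": -0.10}

# Lead ratio: 5 / 2 = 2.5x in total mentions.
ratio = nemotron["mentions"] / gpt_oss["mentions"]
print(f"Nemotron 3 Super leads by {ratio:.1f}x")  # 2.5x

# Sentiment gap over the same 30-day window.
sentiment_gap = nemotron["sentiment"] - gpt_oss["sentiment"]
print(f"Sentiment gap (30d): {sentiment_gap:+.2f}")  # +0.58
```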
Ecosystem
Nemotron 3 Super
- uses Mixture-of-Experts (2 sources)
- competes with Claude Agent (1 source)
- uses Agentic AI (1 source)
- uses autonomous AI (1 source)
- competes with GPT-OSS-120B (1 source)
- uses hybrid Mamba-Transformer MoE (1 source)
- uses transformer model (1 source)
- competes with GPT series (1 source)
GPT-OSS-120B
No mapped relationships
Nemotron 3 Super
NVIDIA's Nemotron 3 Super is a 120-billion-parameter open model that uses a hybrid Mamba-Transformer MoE architecture to deliver high throughput for agentic AI systems.
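The Mixture-of-Experts (MoE) design mentioned above relies on sparse expert routing: a gating function scores every expert for each token, and only the top-k experts actually run, which is what buys the throughput. A minimal, generic sketch of top-k gating; names, shapes, and the pure-Python style are illustrative, not Nemotron's actual implementation:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_route(token, experts, gate_weights, k=2):
    """Route one token through its top-k experts, weighted by gate scores.

    `experts` is a list of callables; `gate_weights[i]` is a per-expert
    score vector dotted with the token. Real MoE layers do this on
    batched tensors inside a transformer block; this is a toy version.
    """
    # Gate logits: one dot product per expert.
    logits = [sum(w * x for w, x in zip(ws, token)) for ws in gate_weights]
    probs = softmax(logits)
    # Keep only the k highest-scoring experts (sparse activation).
    topk = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    # Renormalize the surviving gate probabilities and mix expert outputs.
    norm = sum(probs[i] for i in topk)
    out = [0.0] * len(token)
    for i in topk:
        y = experts[i](token)
        out = [o + (probs[i] / norm) * yi for o, yi in zip(out, y)]
    return out

# Toy demo: four scaling "experts", 1-dimensional token.
experts = [lambda t, c=c: [c * x for x in t] for c in (1.0, 2.0, 3.0, 4.0)]
gates = [[1.0], [0.5], [-0.5], [-1.0]]
print(moe_route([1.0], experts, gates, k=2))
```

With k=2 only half the experts execute per token, which is the mechanism behind MoE's capacity-versus-compute trade-off.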
GPT-OSS-120B
OpenAI's GPT-OSS-120B is a 120-billion-parameter open-weight reasoning model designed to push the frontier of accuracy while optimizing inference cost.
Recent Events
Nemotron 3 Super
2026-03-12
120-billion-parameter open-source model released to democratize agentic AI
GPT-OSS-120B
No timeline events