Coverage (30d): 1 vs 2
This Week: 0 vs 0
Evidence: 1 article
Relationships: 1

Timeline
GPT-OSS-120B (2026-03-22)
OpenAI released the 20-billion-parameter GPT-OSS open-source model
GPT-OSS-120B (2026-03-22)
Technical guide published on fine-tuning GPT-OSS 20B using LoRA on MoE architecture
Nemotron 3 Super (2026-03-12)
120-billion-parameter open-source model released to democratize agentic AI
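The second timeline entry mentions fine-tuning GPT-OSS 20B with LoRA. A minimal sketch of the LoRA idea, in NumPy with hypothetical shapes (the actual guide's layer dimensions and training setup are not given here): the pretrained weight W stays frozen, and only a low-rank update B·A is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes for one frozen projection matrix; real model dims differ.
d_in, d_out, rank, alpha = 64, 64, 8, 16

W = rng.standard_normal((d_out, d_in))        # frozen pretrained weight
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, rank))                   # zero-init: adapter starts as a no-op

def lora_forward(x, W, A, B, alpha, rank):
    """y = Wx + (alpha / rank) * B(Ax); only A and B receive gradients."""
    return W @ x + (alpha / rank) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B = 0 the adapted layer reproduces the frozen model exactly.
assert np.allclose(lora_forward(x, W, A, B, alpha, rank), W @ x)
```

Because only A and B (rank × d_in plus d_out × rank parameters) are trained, the memory cost is a small fraction of full fine-tuning, which is why LoRA is a common choice for large MoE checkpoints.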
Ecosystem
GPT-OSS-120B
uses: Mixture of Experts (Sparse MoE for LLMs) (1 src)
uses: SocialGrid (1 src)
uses: PRL-Bench (1 src)
deploys: Chain-of-Thought Prompting (1 src)
deploys: Rotary Position Embedding (RoPE) (1 src)
deploys: Transformer Self-Attention (1 src)
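The sparse MoE entry above names the core mechanism both models rely on. A toy NumPy sketch of top-k expert routing, with made-up dimensions and linear "experts" standing in for real MLP blocks: a learned gate scores every expert per token, only the top-k run, and their outputs are mixed by renormalized gate probabilities.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x, gate_W, experts, k=2):
    """Sparse MoE layer: route each token to its top-k experts and
    combine their outputs with renormalized gate probabilities."""
    probs = softmax(x @ gate_W)              # (tokens, n_experts)
    topk = np.argsort(-probs, axis=-1)[:, :k]
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = topk[t]
        w = probs[t, sel] / probs[t, sel].sum()  # renormalize over chosen experts
        for j, wj in zip(sel, w):
            out[t] += wj * experts[j](x[t])      # only k experts run per token
    return out

rng = np.random.default_rng(0)
d, n_experts, tokens = 16, 4, 3                  # toy sizes, not model values
gate_W = rng.standard_normal((d, n_experts))
Ws = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(n_experts)]
experts = [lambda v, W=W: W @ v for W in Ws]
x = rng.standard_normal((tokens, d))
y = moe_forward(x, gate_W, experts, k=2)
```

The point of the sparsity is that compute per token scales with k, not with the total expert count, which is how MoE models grow parameter counts without a matching inference cost.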
Nemotron 3 Super
uses: Mixture of Experts (Sparse MoE for LLMs) (3 src)
uses: Transformer Architectures (1 src)
uses: autonomous agents (1 src)
uses: hybrid Mamba-Transformer MoE (1 src)
competes with: GPT-OSS-120B (1 src)
uses: transformer model (1 src)
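The GPT-OSS-120B list above also names Rotary Position Embedding (RoPE). A minimal NumPy sketch of the standard RoPE formulation (this is the generic technique, not a claim about either model's exact implementation): adjacent feature pairs of a query or key vector are rotated by a position-dependent angle, so the q·k dot product depends only on relative position.

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Rotary position embedding for one query/key vector x (d even):
    rotate each adjacent feature pair by angle pos * base^(-i/half)."""
    d = x.shape[0]
    half = d // 2
    freqs = base ** (-np.arange(half) / half)  # one rotation frequency per pair
    theta = pos * freqs
    cos, sin = np.cos(theta), np.sin(theta)
    x1, x2 = x[0::2], x[1::2]                  # pair up adjacent dimensions
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin
    out[1::2] = x1 * sin + x2 * cos
    return out

rng = np.random.default_rng(0)
q, k = rng.standard_normal(8), rng.standard_normal(8)
# Relative-position property: shifting both positions by the same amount
# leaves the attention score unchanged.
assert np.allclose(rope(q, 5) @ rope(k, 9), rope(q, 1) @ rope(k, 5))
```

Because rotations preserve norms, RoPE changes attention scores only through relative offsets, which is why it pairs naturally with the self-attention entry listed above.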