Recipe · GPT-OSS-120B
OpenAI's GPT-OSS-120B is a 120-billion-parameter open-weight reasoning model designed to push the frontier of accuracy while optimizing inference cost.
Ingredient list
Step-by-step reasoning · reasoning · medium confidence
Invented by Google · 2022-01 · Velocity 4y
“GPT-OSS-120B is a reasoning model that pushes the frontier of accuracy, implying it uses step-by-step reasoning techniques.”

Rotary Position Embedding (RoPE) · architecture · medium confidence
Invented by Zhuiyi Technology · 2021-04 · Velocity 5y
“As a large language model in the GPT lineage, it almost certainly uses Rotary Position Embedding (RoPE), which is standard in modern transformer architectures.”
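The RoPE entry above names the technique but not its mechanism. A minimal NumPy sketch of the standard rotary embedding from the RoFormer paper — a generic illustration, not GPT-OSS-120B's actual implementation (the real model applies this per attention head, in its own tensor layout):

```python
import numpy as np

def rope(x, base=10000.0):
    """Apply Rotary Position Embedding to x of shape (seq_len, dim).

    Each even/odd pair of features is rotated by an angle proportional
    to the token's position, so attention scores between rotated queries
    and keys depend only on their relative distance.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "feature dimension must be even"
    # Per-pair rotation frequencies: theta_i = base^(-2i/dim)
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)
    angles = np.arange(seq_len)[:, None] * inv_freq[None, :]  # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]  # split features into rotation pairs
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin  # 2-D rotation of each pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

The defining property, and why RoPE became standard: the inner product of a rotated query at position m and a rotated key at position n depends only on the offset n − m, giving relative position awareness with no extra parameters.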
Transformer self-attention · architecture · high confidence
Invented by Google · 2017-06 · Velocity 9y
“GPT-OSS-120B is a 120-billion parameter model, which fundamentally relies on the transformer self-attention architecture.”
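The self-attention entry above cites the 2017 "Attention Is All You Need" paper. A minimal NumPy sketch of its core scaled dot-product attention — single-head and unbatched for clarity, not GPT-OSS-120B's actual multi-head implementation:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention: softmax(q @ k.T / sqrt(d_k)) @ v.

    q, k, v have shape (seq_len, d_k). Each output row is a weighted
    average of the value rows; the weights come from how well that
    query matches every key.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (seq_len, seq_len)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into near-one-hot regions with vanishing gradients.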
This recipe is part of the gentic.news Deployment Atlas. Every ingredient has an origin paper and supporting evidence. The methodology is public, and the dataset is CC BY 4.0.