Recipe · GPT-5
GPT-5 is a multimodal large language model developed by OpenAI and the fifth in its series of generative pre-trained transformer (GPT) foundation models. Preceded in the series by GPT-4, it was launched on August 7, 2025. It is publicly accessible to users of the chatbot products ChatGPT and Microsoft Copilot.
Ingredient list
Grouped-Query Attention (GQA) · Invented by Google · 2023-05 · Velocity 3y
“GQA is a standard inference optimization for large-scale models to reduce memory overhead.”
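A rough sketch of what the GQA claim means in practice: several query heads share one key/value head, shrinking the KV cache by the grouping factor. Dimensions here are illustrative, not GPT-5's actual configuration.

```python
# Hypothetical sketch of grouped-query attention (GQA) head routing.
# Several query heads share one key/value head, so the KV cache shrinks
# by the grouping factor relative to full multi-head attention.

def kv_head_for(q_head: int, n_q_heads: int, n_kv_heads: int) -> int:
    """Map a query head to the key/value head its group shares."""
    assert n_q_heads % n_kv_heads == 0
    group_size = n_q_heads // n_kv_heads
    return q_head // group_size

# With 8 query heads and 2 KV heads, heads 0-3 share KV head 0, heads 4-7 share KV head 1.
groups = [kv_head_for(h, 8, 2) for h in range(8)]
print(groups)  # [0, 0, 0, 0, 1, 1, 1, 1]
```

With this layout the per-token KV cache stores 2 key/value heads instead of 8, a 4x reduction in KV memory.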
architecture · medium

FlashAttention · Invented by Stanford · 2022-05 · Velocity 4y
“OpenAI's technical infrastructure for large models heavily utilizes optimized attention kernels like FlashAttention.”
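The core idea behind FlashAttention-style kernels is that softmax can be computed in a streaming, block-at-a-time fashion, so the full attention matrix never has to be materialized in memory. A numerical sketch of that online softmax (not a real kernel):

```python
import math

# Illustrative online softmax, the numerical trick underlying
# FlashAttention-style attention kernels: one pass over the scores,
# tracking a running max (for stability) and a running normalizer.

def online_softmax(scores):
    """Streaming softmax over a sequence of scores."""
    m = float("-inf")  # running max
    s = 0.0            # running sum of exp(score - m)
    for x in scores:
        m_new = max(m, x)
        s = s * math.exp(m - m_new) + math.exp(x - m_new)
        m = m_new
    return [math.exp(x - m) / s for x in scores]

probs = online_softmax([1.0, 2.0, 3.0])
```

The real kernel applies this blockwise on GPU SRAM tiles; the arithmetic identity is the same.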
inference · medium

Zero-Shot Chain-of-Thought (Zero-Shot CoT) · Invented by University of Tokyo · 2022-05 · Velocity 4y
“GPT-5 exhibits zero-shot reasoning when prompted with 'think step by step', a hallmark of Zero-Shot CoT.”
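The Zero-Shot CoT trick is small enough to show directly: append a reasoning trigger to the question before sending it to the model. The prompt template below is illustrative; plug in whatever client you actually use.

```python
# Sketch of the Zero-Shot CoT technique (Kojima et al.): appending a
# reasoning trigger elicits step-by-step reasoning without exemplars.

TRIGGER = "Let's think step by step."

def zero_shot_cot_prompt(question: str) -> str:
    """Build a zero-shot chain-of-thought prompt for `question`."""
    return f"Q: {question}\nA: {TRIGGER}"

prompt = zero_shot_cot_prompt("If I have 3 apples and eat one, how many remain?")
```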
reasoning · high

Self-Consistency · Invented by Google · 2022-03 · Velocity 4y
“GPT-5 can generate multiple reasoning paths, and majority voting improves answer reliability.”
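The self-consistency procedure the quote describes is sample-then-vote: sample several independent reasoning paths, parse each path's final answer, and return the majority. A minimal sketch, where `sampled_answers` stands in for answers parsed from real model samples:

```python
from collections import Counter

# Sketch of self-consistency (Wang et al.): majority vote over final
# answers from independently sampled reasoning paths.

def self_consistency(sampled_answers):
    """Return the most common final answer across sampled paths."""
    counts = Counter(sampled_answers)
    answer, _ = counts.most_common(1)[0]
    return answer

sampled_answers = ["42", "42", "41", "42", "40"]
print(self_consistency(sampled_answers))  # "42" wins 3 of 5 votes
```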
reasoning · medium

Chain-of-Thought (CoT) prompting · Invented by Google · 2022-01 · Velocity 4y
“GPT-5 can be prompted to show step-by-step reasoning, a core CoT capability.”
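In the original few-shot form of CoT, each exemplar in the prompt includes a worked rationale, which nudges the model to emit its own steps before the answer. A sketch with one illustrative exemplar:

```python
# Sketch of few-shot chain-of-thought prompting (Wei et al.): exemplars
# carry worked rationales, not just answers. Exemplar text is illustrative.

EXEMPLARS = [
    ("Roger has 5 balls and buys 2 cans of 3 balls each. How many balls?",
     "He starts with 5. 2 cans of 3 is 6. 5 + 6 = 11. The answer is 11."),
]

def cot_prompt(question: str) -> str:
    """Prepend rationale-bearing exemplars to the new question."""
    parts = [f"Q: {q}\nA: {r}" for q, r in EXEMPLARS]
    parts.append(f"Q: {question}\nA:")
    return "\n\n".join(parts)

prompt = cot_prompt("What is 2 + 2?")
```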
reasoning · high

Instruction tuning · Invented by Google · 2021-09 · Velocity 4y
“GPT-5 follows the instruction-following paradigm established by its predecessors, which were instruction-tuned.”
training · high

Rotary Position Embedding (RoPE) · Invented by Zhuiyi Technology · 2021-04 · Velocity 5y
“RoPE is a standard positional encoding used in modern Transformer LLMs, including GPT series.”
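RoPE encodes position by rotating each 2-D slice of a query/key vector by an angle proportional to the token's position, so query-key dot products depend only on relative position. A pure-Python sketch with illustrative dimensions:

```python
import math

# Sketch of Rotary Position Embedding (RoPE, Su et al.): rotate each
# consecutive pair of vector components by pos * frequency. After rotation,
# dot products between queries and keys depend only on relative position.

def rope(vec, pos, base=10000.0):
    """Apply rotary embedding to an even-length vector at position `pos`."""
    d = len(vec)
    out = []
    for i in range(0, d, 2):
        theta = pos * base ** (-i / d)         # per-pair rotation frequency
        c, s = math.cos(theta), math.sin(theta)
        x, y = vec[i], vec[i + 1]
        out += [x * c - y * s, x * s + y * c]  # 2-D rotation
    return out

rotated = rope([1.0, 0.0, 0.0, 1.0], pos=2)  # position 0 would be a no-op
```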
architecture · high

Transformer · Invented by Google · 2017-06 · Velocity 9y
“GPT-5 is a Generative Pre-trained Transformer, fundamentally based on the Transformer architecture.”
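The Transformer's core operation is scaled dot-product attention: weights = softmax(QK^T / sqrt(d)), output = weights · V. A tiny pure-Python version for a single query vector, illustrative only:

```python
import math

# Sketch of scaled dot-product attention (Vaswani et al., 2017) for one
# query: score each key, softmax the scores, mix the values.

def attention(q, keys, values):
    """Single-query attention over lists of key/value vectors."""
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in keys]
    m = max(scores)                               # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return [sum(w * v[j] for w, v in zip(weights, values))
            for j in range(len(values[0]))]

# Identical keys give uniform weights, so the output averages the values.
out = attention([1.0, 0.0], [[1.0, 0.0], [1.0, 0.0]], [[0.0, 0.0], [2.0, 2.0]])
```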
architecture · high

Mixture of Experts (MoE) · Invented by Google · 2017-01 · Velocity 9y
“GPT-5 is widely reported to be a Mixture of Experts (MoE) model, scaling parameters efficiently.”
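The efficiency claim comes from sparse routing: a gate scores all experts per token, but only the top-k actually run, and their outputs are mixed with renormalized gate weights. A toy sketch (the experts here are placeholder functions, not real FFN blocks):

```python
import math

# Sketch of sparsely-gated mixture-of-experts routing (Shazeer et al., 2017):
# score experts, keep only the top-k, renormalize their gate weights.

def top_k_route(gate_logits, k=2):
    """Return {expert_index: weight} for the top-k experts, weights summing to 1."""
    top = sorted(range(len(gate_logits)), key=lambda i: gate_logits[i])[-k:]
    exps = {i: math.exp(gate_logits[i]) for i in top}
    z = sum(exps.values())
    return {i: exps[i] / z for i in top}

experts = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]  # toy experts
weights = top_k_route([0.1, 2.0, -1.0], k=2)  # experts 0 and 1 selected
output = sum(w * experts[i](5.0) for i, w in weights.items())
```

Total parameters scale with the number of experts, while per-token compute scales only with k.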
architecture · high
This recipe is part of the gentic.news Deployment Atlas. Every ingredient has an origin paper + evidence. Methodology is public. Dataset is CC BY 4.0.