gentic.news — AI News Intelligence Platform

Recipe · Llama 4 Maverick

Meta's flagship open-weight multimodal MoE with 17B active / 400B total parameters, 128 experts, and a 1M-token context. Distilled from the unreleased Llama 4 Behemoth. LMArena Elo 1417 (experimental chat version).

Techniques inside: 4
Median research → prod: 6y
Fastest adoption: 1.6y
Slowest adoption: 8y

Ingredient list

  1. YaRN · Invented by Nous Research · 2023-08 · Velocity 1.6y

    Llama 4 Maverick supports a 1M-token context. Meta's previous long-context models (Llama 3.1) used YaRN.

    architecture · medium
  2. RoPE (Rotary Position Embedding) · Invented by Zhuiyi Technology · 2021-04 · Velocity 4y

    Llama-family models consistently use RoPE; Llama 4 is a direct successor.

    architecture · high
  3. Self-attention (Transformer) · Invented by Google · 2017-06 · Velocity 8y

    Llama 4 is a Transformer-based LLM; the core architecture is self-attention.

    architecture · high
  4. Mixture of Experts (MoE) · Invented by Google · 2017-01 · Velocity 8y

    Meta's flagship open-weight multimodal MoE with 17B active / 400B total parameters and 128 experts.

    architecture · high
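Ingredient 1's YaRN-style context extension can be sketched as a per-dimension rescaling of RoPE's rotation frequencies: dimensions that rotate many times over the original context window are left untouched, slowly rotating dimensions are interpolated by the scale factor, and a linear ramp blends the region in between. This is a simplified illustration; the hyperparameters below (`alpha`, `beta`, `orig_ctx`, `scale`) are illustrative defaults, not Meta's or Nous Research's actual settings, and YaRN's attention-temperature factor is omitted.

```python
import numpy as np

def yarn_inv_freq(dim: int, base: float = 10000.0, scale: float = 8.0,
                  alpha: float = 1.0, beta: float = 32.0,
                  orig_ctx: int = 8192) -> np.ndarray:
    """YaRN-style frequency scaling for RoPE context extension (sketch).

    High-frequency dims (many rotations per context window) keep their
    frequency; low-frequency dims are divided by `scale` (position
    interpolation); dims in between are blended with a linear ramp.
    """
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)       # plain RoPE freqs
    # how many full rotations each dim completes over the original context
    rotations = orig_ctx * inv_freq / (2 * np.pi)
    ramp = np.clip((rotations - alpha) / (beta - alpha), 0.0, 1.0)
    # ramp=1 -> keep frequency; ramp=0 -> fully interpolate (divide by scale)
    return inv_freq * ramp + (inv_freq / scale) * (1.0 - ramp)
```

With these defaults, the fastest-rotating dimension is unchanged while the slowest is divided by the full scale factor, which is what lets the extended positions stay in-distribution.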
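Ingredient 2, RoPE, encodes position by rotating pairs of query/key channels by an angle proportional to the token's position, so relative offsets show up as phase differences in attention scores. A minimal numpy sketch, assuming an even head dimension and the standard base of 10000:

```python
import numpy as np

def rope(x: np.ndarray, positions: np.ndarray, base: float = 10000.0) -> np.ndarray:
    """Apply rotary position embeddings to x of shape (seq, dim), dim even.

    Channel pair (2i, 2i+1) at position p is rotated by p * base**(-2i/dim).
    """
    seq, dim = x.shape
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)   # (dim/2,)
    angles = positions[:, None] * inv_freq[None, :]    # (seq, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin                 # 2D rotation per pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because each pair undergoes a pure rotation, vector norms are preserved and position 0 is the identity, which the original embedding scheme it replaces (learned absolute embeddings) does not guarantee.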
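Ingredient 3 is the Transformer's core operation: scaled dot-product self-attention, where every position mixes information from every other position with softmax weights. A single-head sketch (weight matrices are random stand-ins, not a trained model):

```python
import numpy as np

def self_attention(x: np.ndarray, w_q: np.ndarray,
                   w_k: np.ndarray, w_v: np.ndarray) -> np.ndarray:
    """Single-head scaled dot-product self-attention.

    x: (seq, d_model); w_q, w_k, w_v: (d_model, d_head).
    Output row t is a softmax(Q K^T / sqrt(d_head))-weighted sum of V rows.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])                 # (seq, seq)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)          # row-wise softmax
    return weights @ v                                      # (seq, d_head)
```

If the query projection is zero, all scores are equal and each output row collapses to the plain average of the value rows, a handy sanity check that the softmax mixing behaves as described.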
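Ingredient 4, the sparse Mixture of Experts, is what lets Maverick keep 17B active parameters while holding 400B total: a learned gate routes each token to only its top-k experts, so most expert weights sit idle per token. A toy sketch, assuming plain linear maps as stand-ins for the real FFN experts and a per-token top-k softmax gate:

```python
import numpy as np

def moe_layer(x: np.ndarray, gate_w: np.ndarray,
              experts: list, k: int = 1) -> np.ndarray:
    """Sparse MoE forward pass: route each token to its top-k experts.

    x: (seq, d); gate_w: (d, n_experts); experts: list of (d, d) matrices.
    Only k experts run per token, so compute scales with k, not n_experts.
    """
    logits = x @ gate_w                             # (seq, n_experts)
    topk = np.argsort(logits, axis=-1)[:, -k:]      # indices of top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, topk[t]]
        gates = np.exp(chosen - chosen.max())
        gates /= gates.sum()                        # softmax over the k chosen
        for g, e in zip(gates, topk[t]):
            out[t] += g * (x[t] @ experts[e])
    return out
```

With k=1 the layer reduces to picking a single expert per token, which is the degenerate case of the routing scheme; production MoEs add load-balancing losses and capacity limits that this sketch omits.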

This recipe is part of the gentic.news Deployment Atlas. Every ingredient has an origin paper and supporting evidence. The methodology is public; the dataset is CC BY 4.0.

Llama 4 Maverick Recipe — The Research Behind the Model | gentic.news