gentic.news — AI News Intelligence Platform

Recipe · Gemini 3.1

Gemini 3.1 is a large multimodal artificial intelligence model developed by Google DeepMind, representing a significant iteration of the Gemini series. Announced on February 20, 2026, it is distinguished by its 'Mixture of Experts' (MoE) architecture, which enables more efficient scaling and task-specific processing. A key technical specification is its 10 million token context window for its 'Pro

6 techniques inside · Median research → prod: 5y · Fastest adoption: 2y · Slowest adoption: 9y

Ingredient list

  1. YaRN (RoPE context extension) · Invented by Nous Research · 2023-08 · Velocity 2y

    The 10M token context window suggests use of advanced RoPE extension techniques like YaRN.

    architecture · evidence: medium
  2. Chain-of-thought prompting · Invented by Google · 2022-01 · Velocity 4y

    Gemini models are known to be prompted with chain-of-thought reasoning for complex tasks.

    reasoning · evidence: medium
  3. Instruction tuning (FLAN) · Invented by Google · 2021-09 · Velocity 4y

    Gemini models are instruction-tuned, building on Google's FLAN tradition.

    training · evidence: medium
  4. Rotary Position Embeddings (RoPE) · Invented by Zhuiyi Technology · 2021-04 · Velocity 5y

    Gemini models use Rotary Position Embeddings (RoPE) for position encoding.

    architecture · evidence: medium
  5. Transformer self-attention · Invented by Google · 2017-06 · Velocity 9y

    Gemini is a Transformer-based model, using self-attention as its core architecture.

    architecture · evidence: high
  6. Mixture of Experts (MoE) · Invented by Google · 2017-01 · Velocity 9y

    Gemini 3.1 is distinguished by its 'Mixture of Experts' (MoE) architecture.

    architecture · evidence: high
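To make ingredient 4 concrete, here is a minimal NumPy sketch of Rotary Position Embeddings. It is an illustration of the published RoPE technique, not Gemini's actual implementation: channel pairs are rotated by a position-dependent angle, so attention scores between rotated vectors depend only on relative position. Extension methods like YaRN (ingredient 1) work by rescaling the `freqs` schedule below to stretch the usable context.

```python
import numpy as np

def rotary_embedding(x, base=10000.0):
    """Apply RoPE to a (seq_len, dim) array of query or key vectors.

    Each channel pair (x1[i], x2[i]) is rotated by angle pos * theta_i,
    where theta_i = base^(-2i/dim). Rotations preserve norms, and the
    dot product of two rotated vectors depends only on their relative
    positions.
    """
    seq_len, dim = x.shape
    half = dim // 2
    freqs = base ** (-np.arange(half) * 2.0 / dim)   # per-pair frequencies
    angles = np.outer(np.arange(seq_len), freqs)     # (seq_len, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    # 2D rotation applied channel-pair-wise
    return np.concatenate([x1 * cos - x2 * sin, x1 * sin + x2 * cos], axis=-1)
```

A quick sanity check of the relative-position property: for a fixed query vector q and key vector k, the dot product of RoPE(q at position m) with RoPE(k at position n) is the same for any pair of positions with the same offset n − m.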
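Ingredient 5, self-attention, can be sketched in a few lines. This is the standard single-head scaled dot-product attention from the 2017 Transformer paper, under assumed toy weight matrices; production models add multiple heads, masking, and fused kernels.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention.

    x: (tokens, dim); wq, wk: (dim, dk); wv: (dim, dv).
    Each output row is a convex combination of the value rows,
    weighted by softmax(Q K^T / sqrt(dk)).
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    dk = q.shape[-1]
    scores = q @ k.T / np.sqrt(dk)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)      # rows sum to 1
    return attn @ v
```

Because the softmax weights are non-negative and sum to one per row, every output row lies inside the convex hull of the value rows.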
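Ingredient 6, Mixture of Experts, is what lets a model scale parameters without scaling per-token compute: a gate routes each token to a few experts instead of running all of them. The sketch below shows generic token-level top-k routing with renormalised gate weights; expert count, k, and the routing details of Gemini 3.1 itself are not public, so everything here is illustrative.

```python
import numpy as np

def moe_layer(x, gate_w, experts, top_k=2):
    """Token-level top-k Mixture-of-Experts routing (illustrative).

    x: (tokens, dim); gate_w: (dim, n_experts); experts: list of callables
    mapping a (dim,) vector to a (dim,) vector. Each token runs through
    only its top_k experts; outputs are mixed by the renormalised
    softmax gate scores.
    """
    logits = x @ gate_w                                   # (tokens, n_experts)
    probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs /= probs.sum(axis=-1, keepdims=True)            # softmax gate
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        top = np.argsort(probs[t])[-top_k:]               # chosen expert ids
        weights = probs[t, top] / probs[t, top].sum()     # renormalise over top_k
        for w, e in zip(weights, top):
            out[t] += w * experts[e](x[t])
    return out
```

With a zero gate the softmax is uniform, so routing to all experts just averages their outputs, which gives an easy correctness check.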

This recipe is part of the gentic.news Deployment Atlas. Every ingredient has an origin paper + evidence. Methodology is public. Dataset is CC BY 4.0.