Subgraph Atlas · centered on entity
Mistral Small 4
ai_model · 2 mentions · velocity: stable
Mistral Small 4, developed by Mistral AI, is a 119B-parameter Mixture of Experts model that unifies reasoning, multimodal, and agentic capabilities into a single efficient model.
Two-hop subgraph: this entity, every entity it directly relates to, and every entity those neighbors relate to. Drag a node, scroll to zoom, click to inspect — or click any neighbor and re-center the atlas there.
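The two-hop rule described above can be sketched as a bounded breadth-first expansion over an adjacency map. This is a minimal sketch, not the atlas's actual implementation; the graph fragment below is hypothetical, chosen only to show that nodes three hops out are excluded.

```python
from collections import deque

def two_hop_subgraph(adj, center):
    """Return all nodes within two hops of `center` in an undirected graph.

    `adj` maps each node to the set of its direct neighbors.
    """
    seen = {center}
    frontier = deque([(center, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == 2:
            continue  # neighbors of two-hop nodes are out of scope
        for nb in adj.get(node, ()):
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, depth + 1))
    return seen

# Hypothetical fragment: center, a neighbor, a neighbor-of-neighbor,
# and one node three hops out that must not appear in the result.
adj = {
    "Mistral Small 4": {"Mistral AI"},
    "Mistral AI": {"Mistral Small 4", "Mixture of Experts"},
    "Mixture of Experts": {"Mistral AI", "FlashAttention"},
    "FlashAttention": {"Mixture of Experts"},
}
print(sorted(two_hop_subgraph(adj, "Mistral Small 4")))
# → ['Mistral AI', 'Mistral Small 4', 'Mixture of Experts']
```

Re-centering on a neighbor is then just calling the same function with that neighbor as `center`.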
Legend: company · person · ai_model · product · research_lab · benchmark · framework
Top connections
Mixture of Experts (Sparse MoE for LLMs) · technique · 12 mentions
Mistral AI · company · 10 mentions
Rotary Position Embedding (RoPE) · technique · 0 mentions
Grouped-Query Attention (GQA) · technique · 0 mentions
YaRN RoPE Context Extension · technique · 0 mentions
FlashAttention · technique · 0 mentions
PagedAttention (vLLM) · technique · 0 mentions