Subgraph Atlas · centered on entity
Llama 4 Scout
ai_model · 0 mentions · velocity: stable
Meta's first natively multimodal open-weight MoE model, with 17B active / 109B total parameters, 16 experts, and an industry-leading 10M-token context window. Multimodal (text + image), supports 12 languages, and runs on a single H100 with Int4 quantization.
Two-hop subgraph: this entity, every entity it directly relates to, and every entity those neighbors relate to. Drag a node, scroll to zoom, or click to inspect; click any neighbor to re-center the atlas there.
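The two-hop extraction described above can be sketched in a few lines. This is a hypothetical illustration, not the atlas's actual code: it assumes the graph is held as an adjacency dict of `{node: set(neighbors)}`, and the entity names in the example are made up for demonstration.

```python
def two_hop_subgraph(graph, center):
    """Return (nodes, edges) for the center, its neighbors, and their neighbors."""
    one_hop = graph.get(center, set())
    nodes = {center} | one_hop
    for n in one_hop:
        nodes |= graph.get(n, set())  # second hop: neighbors of neighbors
    # keep only edges whose endpoints both lie inside the subgraph
    edges = {(u, v) for u in nodes for v in graph.get(u, set()) if v in nodes}
    return nodes, edges

# Toy graph (names are illustrative placeholders, not atlas data)
example = {
    "Llama 4 Scout": {"Meta", "Mixture of Experts"},
    "Meta": {"Llama 4 Scout", "PyTorch"},
    "Mixture of Experts": {"Llama 4 Scout"},
    "PyTorch": {"Meta", "Linux Foundation"},
    "Linux Foundation": {"PyTorch"},
}
nodes, edges = two_hop_subgraph(example, "Llama 4 Scout")
# "Linux Foundation" is three hops out, so it is excluded from the subgraph
```

Re-centering on a neighbor is then just calling the same function with a different `center` argument.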
company · person · ai_model · product · research_lab · benchmark · framework
Top connections
Meta · company
137 mentions
CLIP (Contrastive Language-Image Pretraining) · technique
13 mentions
Mixture of Experts (Sparse MoE for LLMs) · technique
12 mentions
Rotary Position Embedding (RoPE) · technique
0 mentions
Transformer Self-Attention · technique
0 mentions
LLaVA (Visual Instruction Tuning) · technique
0 mentions