Subgraph Atlas · centered on entity
LoRA (Low-Rank Adaptation)
technique · 9 mentions · velocity: stable
Parameter-efficient fine-tuning that injects low-rank decomposition matrices into attention weights, training <1% of parameters.
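The idea behind the description above can be sketched in a few lines: freeze the pretrained weight W and learn only a low-rank update B·A, so the forward pass is x·Wᵀ plus a scaled low-rank correction. This is a minimal NumPy illustration with made-up dimensions (d=64, r=4, alpha=8), not any particular library's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d, r, alpha = 64, 4, 8               # hidden size, rank, scaling (illustrative values)
W = rng.normal(size=(d, d))          # frozen pretrained attention weight
A = rng.normal(size=(r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                 # trainable up-projection, zero-init so the update starts at 0

def lora_forward(x):
    # y = x W^T + (alpha / r) * x A^T B^T : frozen base path plus low-rank update
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

x = rng.normal(size=(2, d))
# With B zero-initialized, the adapted layer matches the base model exactly
assert np.allclose(lora_forward(x), x @ W.T)

# Trainable-parameter fraction: 2*r*d adapter weights vs d*d frozen weights = 2r/d
trainable_fraction = (A.size + B.size) / W.size
print(trainable_fraction)  # 0.125 at r=4, d=64; real models use far larger d, hence <1%
```

At realistic hidden sizes (d in the thousands) and small ranks, the 2r/d fraction drops well under 1%, which is where the "<1% of parameters" figure comes from.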
Two-hop subgraph: this entity, every entity it directly relates to, and every entity those neighbors relate to. Drag a node, scroll to zoom, click to inspect — or click any neighbor and re-center the atlas there.
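The two-hop construction described above amounts to two rings of breadth-first expansion from the center node. A minimal sketch, using a hypothetical adjacency dict (the entity names and relations here are illustrative, not the atlas's real data):

```python
# Toy undirected entity graph (hypothetical relations, for illustration only)
graph = {
    "LoRA": ["Microsoft", "PEFT", "transformer model"],
    "Microsoft": ["LoRA", "Qwen 3.6"],
    "PEFT": ["LoRA", "Doc-to-LoRA"],
    "transformer model": ["LoRA"],
    "Qwen 3.6": ["Microsoft"],
    "Doc-to-LoRA": ["PEFT"],
}

def two_hop(center):
    """Return the center, its direct neighbors, and the neighbors' neighbors."""
    seen = {center}
    frontier = [center]
    for _ in range(2):  # two BFS rings = two hops
        frontier = [n for node in frontier
                      for n in graph.get(node, []) if n not in seen]
        seen.update(frontier)
    return seen

print(sorted(two_hop("LoRA")))
```

Re-centering on a neighbor is then just calling `two_hop` again with that neighbor as the new center.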
Node types: company · person · ai_model · product · research_lab · benchmark · framework
Top connections
Microsoft (company) · 107 mentions
Stanford University (organization) · 24 mentions
VMLOps (product) · 10 mentions
transformer model (technology) · 9 mentions
Parameter-Efficient Fine-Tuning (PEFT) (technology) · 4 mentions
Qwen 3.6 (ai model) · 3 mentions
MetaClaw (product) · 3 mentions
Doc-to-LoRA (technology) · 1 mention