Subgraph Atlas · centered on entity
transformer model
technology · 9 mentions · velocity: stable

In deep learning, the transformer is an artificial neural network architecture based on the multi-head attention mechanism. Text is converted to numerical representations called tokens, and each token is converted into a vector via lookup from a word embedding table. …
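The token-to-vector step described above can be sketched in a few lines. This is a minimal illustration, assuming a toy vocabulary and a randomly initialized table (the names `vocab` and `embedding_table` are illustrative, not part of any real library):

```python
import numpy as np

# Hypothetical toy vocabulary mapping words to token ids.
vocab = {"the": 0, "transformer": 1, "model": 2}

# One 4-dimensional vector per token, randomly initialized for illustration.
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(len(vocab), 4))

def embed(text):
    """Convert text to token ids, then look up each id's row in the table."""
    token_ids = [vocab[w] for w in text.lower().split()]
    return embedding_table[token_ids]  # shape: (num_tokens, embed_dim)

vectors = embed("the transformer model")
print(vectors.shape)  # (3, 4)
```

In a trained transformer, the table entries are learned parameters rather than random values, but the lookup mechanism is the same.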
Two-hop subgraph: this entity, every entity it directly relates to, and every entity those neighbors relate to. Drag a node, scroll to zoom, click to inspect — or click any neighbor and re-center the atlas there.
Legend: company · person · ai_model · product · research_lab · benchmark · framework
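The two-hop expansion described above can be sketched as a simple set union over an adjacency list. This is a minimal sketch under assumed data structures; the node names below come from this page's connection list, but the `graph` dict itself is illustrative, not the atlas's actual data model:

```python
# Illustrative adjacency list (undirected edges, a small subset of the atlas).
graph = {
    "transformer model": ["LoRA (Low-Rank Adaptation)", "Nemotron 3 Super"],
    "LoRA (Low-Rank Adaptation)": ["transformer model", "Nemotron 3 Super"],
    "Nemotron 3 Super": ["transformer model", "FlashAttention-4"],
    "FlashAttention-4": ["Nemotron 3 Super"],
}

def two_hop(center):
    """Return the center entity, its direct neighbors, and their neighbors."""
    one_hop = set(graph.get(center, []))
    second_hop = set()
    for neighbor in one_hop:
        second_hop.update(graph.get(neighbor, []))
    return {center} | one_hop | second_hop

print(sorted(two_hop("transformer model")))
```

Re-centering the atlas on a neighbor is just calling `two_hop` with that neighbor as the new center.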
Top connections
- LoRA (Low-Rank Adaptation) · technique · 9 mentions
- Nemotron 3 Super · ai_model · 8 mentions
- Attention Residuals · technology · 3 mentions
- LLM-powered agents · technology · 2 mentions
- FlashAttention-4 · technology · 1 mention
- TimeSqueeze · technology · 1 mention