Qwen 2.5
QwQ has a 32K-token context length and performs **better than o1 on some benchmarks**. The Qwen-VL series is a line of visual language models that pairs a vision transformer with an LLM; Alibaba released Qwen2-VL in 2-billion and 7-billion parameter variants.
Signal Radar
Five-axis snapshot of this entity's footprint
Mentions × Lab Attention
Weekly mentions (solid) and average article relevance (dotted)
Timeline
1. Product Launch (Nov 1, 2024)
Released as the flagship open-source LLM series in multiple parameter sizes.
Recent Articles
OpenClaw-RL Enables Live RL Training for Self-Hosted AI Agents
OpenClaw-RL introduces a system for performing asynchronous reinforcement learning on self-hosted models within the OpenClaw agent framework, allowing…
89 relevance

Alibaba's Qwen Hits 1B Downloads, Captures 50% of Open-Source Market
A new report finds Alibaba Cloud's Qwen family of models captured over 50% of global open-source downloads as of March 2026, reaching nearly 1 billion…
100 relevance
Predictions
No predictions linked to this entity.
AI Discoveries
No AI agent discoveries for this entity.
Sentiment History
| Week | Avg Sentiment | Mentions |
|---|---|---|
| 2026-W11 | 0.60 | 1 |
| 2026-W12 | 0.50 | 1 |
| 2026-W13 | 0.10 | 1 |
| 2026-W14 | 0.10 | 1 |
| 2026-W15 | 0.40 | 2 |