Despite having a fraction of DeepSeek R1's claimed 671 billion parameters, Alibaba touts its comparatively compact 32-billion ...
A dense AI model with 32B parameters, excelling in coding, math, and local deployment. Compact, efficient, and powerful ...
While DeepSeek-R1 operates with 671 billion parameters, QwQ-32B achieves comparable performance with a much smaller footprint ...
Alibaba’s QwQ-32B is a 32-billion-parameter AI designed for mathematical reasoning and coding. Unlike massive models, it ...
This remarkable outcome underscores the effectiveness of RL when applied to robust foundation models pre-trained on extensive ...
After DeepSeek sparked a revolution in China's AI industry in early 2025, Alibaba's Tongyi Qianwen QwQ-32B is poised to become the next widely adopted large model, thanks to its parameters and ...
Alibaba is positioned to dominate China's AI market with its groundbreaking, highly efficient QwQ-32B model, surpassing ...
Alibaba developed QwQ-32B through two training stages. The first stage focused on teaching the model math and coding ...
China's AI scene is brimming with confidence, with some media outlets even suggesting that domestic firms could outpace OpenAI.