A powerful 30B MoE coding model from Alibaba Qwen, joining its larger 480B counterpart
Trained for tool use
Specialized coding model with native support for 256K context length
MoE model with 3.3B activated weights out of 30.5B total parameters, 128 experts with 8 active
Excels at agentic coding, browser automation, and repository-scale code understanding
Advanced tool calling capabilities supporting platforms like Qwen Code and CLINE (see the example after this list)
Optimized for coding tasks with significant performance improvements in agentic workflows
Note: This model does not support thinking mode and will not generate <think></think> blocks
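A minimal sketch of exercising the model's tool calling through LM Studio's local OpenAI-compatible server. The model identifier, server port, and the `read_file` tool are assumptions for illustration; check your local LM Studio setup for the exact model name and endpoint.

```python
# Sketch: tool-calling request against a locally served model via LM Studio's
# OpenAI-compatible API. Names below are assumptions, not confirmed values.
from openai import OpenAI

# Default LM Studio local server address is assumed here.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

tools = [{
    "type": "function",
    "function": {
        "name": "read_file",  # hypothetical tool, for illustration only
        "description": "Read a file from the local repository",
        "parameters": {
            "type": "object",
            "properties": {"path": {"type": "string"}},
            "required": ["path"],
        },
    },
}]

response = client.chat.completions.create(
    model="qwen/qwen3-coder-30b",  # assumed identifier; use the name shown in LM Studio
    messages=[{"role": "user", "content": "Summarize what src/main.py does."}],
    tools=tools,
)

# If the model decides to use the tool, the structured call appears here.
print(response.choices[0].message.tool_calls)
```

Agentic clients such as Qwen Code or CLINE issue requests of this shape on your behalf; the model responds with structured tool calls rather than free-form text when a tool is appropriate.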
The underlying model files this model uses
When you download this model, LM Studio picks the source that will best suit your machine (you can override this)
Custom configuration options included with this model