Updated version of Qwen3-235B-A22B featuring significant improvements in general capabilities including instruction following, logical reasoning, text comprehension, mathematics, science, coding and tool usage.
Trained for tool use
This MoE model activates 22B of its 235B total parameters, routing each token to 8 of 128 experts. Compared to the original Qwen3-235B-A22B, it delivers substantial gains in long-tail knowledge coverage across multiple languages and markedly better alignment with user preferences in subjective and open-ended tasks.
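As a rough illustration of how top-k MoE routing works (a minimal sketch of the general technique, not Qwen3's actual gating code), selecting 8 of 128 experts per token and renormalizing their gate weights might look like:

```python
import math
import random

NUM_EXPERTS = 128  # total experts in the MoE layer
TOP_K = 8          # experts activated per token

def softmax(xs):
    # Numerically stable softmax over a list of gate logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(gate_logits, top_k=TOP_K):
    """Pick the top-k experts by gate probability and renormalize
    their weights so the chosen experts' weights sum to 1."""
    probs = softmax(gate_logits)
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    chosen = ranked[:top_k]
    total = sum(probs[i] for i in chosen)
    return [(i, probs[i] / total) for i in chosen]

# Example: route one token's gate logits.
random.seed(0)
logits = [random.gauss(0, 1) for _ in range(NUM_EXPERTS)]
experts = route(logits)
```

Only the 8 selected experts run for that token, which is why inference cost tracks the 22B activated parameters rather than the full 235B.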
Supports a context length of up to 262,144 tokens natively with enhanced 256k long-context understanding.
Advanced agent capabilities and support for over 100 languages and dialects.
Note: This model supports only non-thinking mode and does not generate <think></think> blocks in its output.
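Since this model emits no <think></think> blocks, no special output parsing is needed. For pipelines that also handle thinking-mode models, a defensive strip (a hypothetical helper, not part of any official SDK) is harmless here because it is a no-op on this model's output:

```python
import re

# Matches a <think>...</think> block plus any trailing whitespace.
THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_think(text: str) -> str:
    """Remove any <think>...</think> blocks from a model response."""
    return THINK_RE.sub("", text)

print(strip_think("<think>internal reasoning</think>Hello"))  # → Hello
print(strip_think("Hello"))  # → Hello (unchanged)
```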