Description
The 30B-parameter Mixture-of-Experts (MoE) member of the Qwen3 model family.
README
Supports a context length of up to 131,072 tokens with YaRN (default 32k)
Supports /no_think to disable reasoning for a turn; just append it to the end of your prompt (see the sketch after this list)
MoE model with 3.3B activated parameters: 128 experts in total, 8 active per token
Supports both thinking and non-thinking modes, with enhanced reasoning in both for significantly stronger mathematics, coding, and commonsense reasoning
Excels at creative writing, role-playing, multi-turn dialogues, and instruction following
Advanced agent capabilities and support for over 100 languages and dialects
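
A minimal sketch of the /no_think soft switch, assuming the Hugging Face transformers API and the Qwen/Qwen3-30B-A3B repository name; the generation settings are illustrative, not part of this card.

```python
# Sketch only: model ID and settings are assumptions; adapt to your runtime.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-30B-A3B"  # assumed Hugging Face repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Appending /no_think to the end of the prompt disables reasoning for this turn.
messages = [{"role": "user", "content": "Summarize the plot of Hamlet in two sentences. /no_think"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```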
Custom Fields
Special features defined by the model author
Enable Thinking: boolean (default: true)
Controls whether the model will think before replying
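
This field corresponds to the thinking toggle exposed by the Qwen3 chat template. A minimal sketch below assumes the Hugging Face transformers interface, where enable_thinking is the equivalent flag; other runtimes may name it differently.

```python
# Sketch of toggling thinking via the chat template (assumes transformers and the
# Qwen3 template's enable_thinking flag; field names in other runtimes may differ).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen3-30B-A3B")  # assumed repo name
messages = [{"role": "user", "content": "What is 17 * 23?"}]

# enable_thinking=True (the default) lets the model reason before answering;
# enable_thinking=False produces a direct reply.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=False,
)
print(prompt)
```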
Parameters
Custom configuration options included with this model
Sources
The underlying model files used by this model