Last update
Updated on July 25
README
An enhanced version of Qwen3-235B-A22B with significantly improved thinking and reasoning capabilities, delivering state-of-the-art performance among open-source thinking models.
This mixture-of-experts (MoE) model activates 22B of its 235B total parameters per token, routing each token to 8 of its 128 experts. It delivers dramatically improved performance on reasoning tasks, including logical reasoning, mathematics, science, coding, and academic benchmarks that typically require human expertise.
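As a rough illustration of this routing pattern, the sketch below implements top-k expert selection for a single toy MoE layer in PyTorch. It is not the model's actual implementation; the layer sizes, module names, and the naive per-token dispatch loop are illustrative assumptions, and only the 128-expert / 8-active pattern comes from the description above.

```python
# Minimal sketch of top-k expert routing in a mixture-of-experts layer.
# Illustrative only; dimensions and module structure are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    def __init__(self, hidden_size=64, num_experts=128, top_k=8):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(hidden_size, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, 4 * hidden_size),
                nn.SiLU(),
                nn.Linear(4 * hidden_size, hidden_size),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (tokens, hidden_size)
        logits = self.router(x)                         # score every expert per token
        weights, idx = logits.topk(self.top_k, dim=-1)  # keep only the 8 best experts
        weights = F.softmax(weights, dim=-1)            # normalize over the selected experts
        out = torch.zeros_like(x)
        for t in range(x.size(0)):                      # naive per-token dispatch
            for w, e in zip(weights[t], idx[t]):
                out[t] += w * self.experts[int(e)](x[t])
        return out

x = torch.randn(4, 64)
print(ToyMoELayer()(x).shape)  # torch.Size([4, 64])
```

Only the selected experts run for a given token, which is why a 235B-parameter model can be served with roughly the compute cost of a 22B dense model.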
Supports a context length of up to 262,144 tokens natively with enhanced long-context understanding.
Offers advanced agent capabilities and supports over 100 languages and dialects.
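For context, here is a minimal sketch of how a checkpoint like this is typically loaded and queried with Hugging Face transformers. The repository ID, precision, and device settings below are assumptions, not official usage instructions for this model.

```python
# Minimal usage sketch with Hugging Face transformers.
# The repository ID, dtype, and device settings are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-235B-A22B-Thinking-2507"  # assumed repository name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # shard across available GPUs
)

messages = [{"role": "user", "content": "Prove that the sum of two even numbers is even."}]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Thinking models tend to emit long reasoning traces, so allow a generous budget.
output = model.generate(inputs, max_new_tokens=4096)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```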
Parameters
Custom configuration options included with this model
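As a hypothetical illustration, the snippet below shows the kind of generation parameters such a configuration might set, continuing the loading sketch above; the specific values are assumptions rather than the model's shipped defaults.

```python
# Illustrative sampling settings of the kind a bundled configuration might
# expose; values are assumptions, not the model's shipped defaults.
# Continues the loading sketch above (reuses `model`, `tokenizer`, `inputs`).
generation_kwargs = {
    "do_sample": True,
    "temperature": 0.6,      # softer sampling keeps long reasoning traces coherent
    "top_p": 0.95,           # nucleus-sampling cutoff
    "top_k": 20,             # cap the candidate tokens considered per step
    "max_new_tokens": 8192,  # room for the thinking trace plus the final answer
}
output = model.generate(inputs, **generation_kwargs)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```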
Sources
The underlying model files this model uses