Model

Qwen3 Coder

Public

A powerful 30B MoE coding model from Alibaba's Qwen team, joining its larger 480B counterpart

Use cases

Agentic coding, browser automation, and repository-scale code understanding

Minimum system memory

15GB

Tags

30B
qwen3moe

README

Qwen3 Coder 30B

Specialized coding model with native support for 256K context length

MoE model with 3.3B activated parameters out of 30.5B total, using 128 experts with 8 active per token

Excels at agentic coding, browser automation, and repository-scale code understanding

Advanced tool calling capabilities supporting platforms such as Qwen Code and Cline (see the tool-calling sketch below)

Optimized for coding tasks with significant performance improvements in agentic workflows

Note: This model does not support thinking mode and will not generate <think></think> blocks
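Below is a minimal sketch of exercising the model's tool calling through an OpenAI-compatible chat endpoint. The base URL, the model identifier, and the read_file tool are assumptions made for illustration; substitute whatever names and tools your serving runtime actually exposes.

```python
# Minimal tool-calling sketch against an OpenAI-compatible endpoint.
# The base_url, model name, and the read_file tool are assumptions
# for illustration, not part of this model card.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

tools = [
    {
        "type": "function",
        "function": {
            "name": "read_file",  # hypothetical tool for the example
            "description": "Read a file from the working repository",
            "parameters": {
                "type": "object",
                "properties": {
                    "path": {"type": "string", "description": "Relative file path"}
                },
                "required": ["path"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="qwen3-coder-30b",  # assumed model identifier
    messages=[{"role": "user", "content": "Summarize what utils/config.py does."}],
    tools=tools,
)

# If the model decides to call the tool, the call shows up here.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```

In an agentic loop, the caller would execute the requested tool, append its result as a tool message, and ask the model to continue.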

Parameters

Custom configuration options included with this model (a request sketch showing where these apply follows the list)

Repeat Penalty: 1.05
Temperature: 0.7
Top K Sampling: 20
Top P Sampling: 0.8
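To show where these defaults plug in, here is a sketch of a chat completion request that sets the same sampling values explicitly through an OpenAI-compatible API. The endpoint, the model identifier, and the extra_body pass-through fields for top_k and repetition penalty are assumptions about the serving runtime, not part of this model card.

```python
# Sketch: applying the listed sampling defaults explicitly on a request.
# Endpoint, model name, and extra_body keys (top_k, repetition_penalty)
# are runtime-specific assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="qwen3-coder-30b",  # assumed model identifier
    messages=[{"role": "user", "content": "Write a Python function that parses a CSV file."}],
    temperature=0.7,  # Temperature
    top_p=0.8,        # Top P Sampling
    extra_body={
        "top_k": 20,                 # Top K Sampling (runtime-specific field)
        "repetition_penalty": 1.05,  # Repeat Penalty (runtime-specific field)
    },
)
print(response.choices[0].message.content)
```

If you omit these fields, a runtime that ships this configuration would typically apply the listed values as its defaults.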