Model

minimax-m2

Public

MiniMax M2 is a 230B-parameter MoE LLM (10B active per token), built for coding and agentic workflows.

Use cases

Reasoning

Minimum system memory

121GB

Tags

230B
minimax-m2

README

MiniMax M2

MiniMax-M2 is an MoE model (230B total parameters, 10B active) built for elite performance in coding and agentic tasks, while maintaining powerful general intelligence.

Highlights

Superior Intelligence. According to benchmarks from Artificial Analysis, MiniMax-M2 demonstrates highly competitive general intelligence across mathematics, science, instruction following, coding, and agentic tool use. Its composite score ranks #1 among open-source models globally.

Advanced Coding. Engineered for end-to-end developer workflows, MiniMax-M2 excels at multi-file edits, coding-run-fix loops, and test-validated repairs. Strong performance on Terminal-Bench and (Multi-)SWE-Bench–style tasks demonstrates practical effectiveness in terminals, IDEs, and CI across languages.

Agent Performance. MiniMax-M2 plans and executes complex, long-horizon toolchains across shell, browser, retrieval, and code runners. In BrowseComp-style evaluations, it consistently locates hard-to-surface sources, keeps evidence traceable, and gracefully recovers from flaky steps.

Efficient Design. With 10 billion activated parameters (230 billion in total), MiniMax-M2 delivers lower latency, lower cost, and higher throughput for interactive agents and batched sampling—perfectly aligned with the shift toward highly deployable models that still shine on coding and agentic tasks.

Parameters

Custom configuration options included with this model

Temperature
1
Top K Sampling
40
Top P Sampling
0.95
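To illustrate how these three defaults interact at decode time, here is a minimal sketch of the standard sampling pipeline (temperature scaling, then top-k filtering, then top-p/nucleus filtering). This is a generic illustration of these sampling parameters, not MiniMax's actual implementation; the function name and NumPy-based logits handling are assumptions for the example.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=40, top_p=0.95, rng=None):
    """Generic sketch: temperature -> top-k -> top-p, then sample one token id.

    Mirrors this model's default parameters (temperature=1, top_k=40,
    top_p=0.95); not MiniMax's actual decoding code.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature

    # Top-k: keep only the k highest-scoring tokens.
    if 0 < top_k < len(logits):
        kth = np.sort(logits)[-top_k]
        logits = np.where(logits < kth, -np.inf, logits)

    # Softmax over the surviving candidates.
    probs = np.exp(logits - np.max(logits))
    probs /= probs.sum()

    # Top-p (nucleus): keep the smallest set of tokens whose cumulative
    # probability reaches top_p, then renormalize.
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, top_p) + 1
    mask = np.zeros_like(probs)
    mask[order[:cutoff]] = probs[order[:cutoff]]
    mask /= mask.sum()

    return int(rng.choice(len(probs), p=mask))
```

With temperature at 1, the logits are used as-is; lowering top_k or top_p narrows the candidate pool and makes output more deterministic.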

Sources

The underlying model files used by this model