mistralai/magistral-smal...

23.6B · mistral · Reasoning · 5.1K Downloads
Min. 19 GB · Last Updated 2 days ago

MistralAI's first reasoning model, based on Mistral Small 3.1
README

Magistral Small

Magistral Small builds upon Mistral Small 3.1 with added reasoning capabilities, obtained through supervised fine-tuning (SFT) on reasoning traces from Magistral Medium followed by reinforcement learning (RL) training. It is a small, efficient reasoning model with 24B parameters that, once quantized, can be deployed locally on a single RTX 4090 or on a MacBook with 32GB of RAM.

The model produces long reasoning traces before giving its final answer and supports dozens of languages, including English, French, German, Japanese, Chinese, and many others. It has a 128k context window and is released under the Apache 2.0 license for both commercial and non-commercial use.
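As a rough sanity check on the single-GPU claim, the weight footprint after quantization can be estimated from the parameter count. The bits-per-parameter figure below is an assumption (typical for 4-bit quantization schemes that store scales alongside the weights), not an official number:

```python
# Back-of-the-envelope VRAM estimate for the quantized weights alone.
PARAMS = 23.6e9          # parameter count shown on the model card
BITS_PER_PARAM = 4.5     # assumed effective bits/param for Q4-style quantization

weight_gb = PARAMS * BITS_PER_PARAM / 8 / 1e9  # bits -> bytes -> GB
print(f"~{weight_gb:.1f} GB for weights alone")
# KV cache and runtime overhead come on top of this, which is why the
# card's stated minimum (19 GB) is higher than the raw weight size.
```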

custom fields

Special features defined by the model author

Enable Thinking: boolean (default=true)
Controls whether the model will think before replying
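When thinking is enabled, reasoning models typically wrap the trace in delimiter tokens so clients can separate it from the final answer. The exact delimiters depend on the model's chat template and are not confirmed by this card; the `<think>…</think>` markers below are an assumption used for illustration:

```python
import re

# Assumption: the runtime emits the reasoning trace wrapped in
# <think>...</think>-style markers. Check your model's chat template
# for the actual delimiter tokens before relying on this.
THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)

def strip_thinking(text: str) -> str:
    """Return only the final answer, with the reasoning trace removed."""
    return THINK_RE.sub("", text).strip()

raw = "<think>2+2: add the two numbers.</think> The answer is 4."
print(strip_thinking(raw))  # The answer is 4.
```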

sources

The underlying model files this model uses

When you download this model, LM Studio picks the source that will best suit your machine (you can override this)

config

Custom configuration options included with this model

Temperature: 0.7
Top P Sampling: 0.95
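These bundled sampling defaults can be passed through when querying the model over LM Studio's OpenAI-compatible local server. A minimal request sketch follows; the model identifier and the default port (1234) are assumptions and may differ on your machine:

```python
import json

# Sketch of a chat-completion request carrying the sampling values
# shipped with this model's config.
URL = "http://localhost:1234/v1/chat/completions"  # LM Studio's local server (default port)

payload = {
    "model": "mistralai/magistral-small",  # assumed identifier; check your local model listing
    "messages": [{"role": "user", "content": "How many r's are in 'strawberry'?"}],
    "temperature": 0.7,  # from the bundled config
    "top_p": 0.95,       # from the bundled config
}

print(json.dumps(payload, indent=2))
# Send with e.g.:
#   curl -s $URL -H 'Content-Type: application/json' -d "$(python this_script.py)"
```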