Model

Magistral Small

Public

Reasoning model that supports image input and tool calling. By MistralAI.

Use cases

Vision Input
Reasoning

Minimum system memory

15GB

Tags

24B
mistral

README

Magistral Small 1.2

Magistral Small builds upon Mistral Small 3.2 with added reasoning capabilities, obtained through supervised fine-tuning (SFT) on Magistral Medium traces followed by reinforcement learning (RL) training. It's a small, efficient reasoning model with 24B parameters that, once quantized, can be deployed locally on a single RTX 4090 or a MacBook with 32GB of RAM.

This model updates Magistral Small 1.1 with improved benchmark performance, better tone and persona, and fewer infinite generations.

Parameters

Custom configuration options included with this model

Reasoning Section Parsing
{ "enabled": true, "startString": "[THINK]", "endString": "[/THINK]" }
Temperature
0.7
Top P Sampling
0.95
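The reasoning-section delimiters above can be used to separate the model's reasoning trace from its final answer. A minimal Python sketch, assuming the model emits its trace between the configured `[THINK]` and `[/THINK]` strings (the helper name is hypothetical):

```python
import re

# Delimiters from the "Reasoning Section Parsing" configuration above.
THINK_START = "[THINK]"
THINK_END = "[/THINK]"

def split_reasoning(text: str) -> tuple[str, str]:
    """Split a response into (reasoning, answer).

    Returns an empty reasoning string if no [THINK]...[/THINK]
    section is found in the text.
    """
    pattern = re.escape(THINK_START) + r"(.*?)" + re.escape(THINK_END)
    match = re.search(pattern, text, flags=re.DOTALL)
    if not match:
        return "", text.strip()
    reasoning = match.group(1).strip()
    # Remove the reasoning section; what remains is the visible answer.
    answer = (text[:match.start()] + text[match.end():]).strip()
    return reasoning, answer

demo = "[THINK]2 + 2 = 4[/THINK]The answer is 4."
reasoning, answer = split_reasoning(demo)
print(reasoning)  # 2 + 2 = 4
print(answer)     # The answer is 4.
```

The `re.DOTALL` flag is needed because reasoning traces typically span multiple lines; the non-greedy `(.*?)` stops at the first closing delimiter.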