# API Changelog
### Set a TTL (in seconds) for models loaded via API requests

See the docs article: Idle TTL and Auto-Evict.

With `lms`:
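A minimal sketch of the `lms` invocation (the `--ttl` flag takes seconds; the model key is illustrative):

```shell
# Load a model with a 1-hour idle TTL; it is auto-evicted
# after 3600 seconds without receiving a request.
lms load qwen2.5-7b-instruct --ttl 3600
```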
### `reasoning_content` in Chat Completion responses

For DeepSeek R1 models, get reasoning content in a separate field. See more here. Turn this on in App Settings > Developer.
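As a sketch, a request to the local OpenAI-compatible endpoint (the default port and model name are assumptions); with the setting enabled, the response message carries `reasoning_content` alongside `content`:

```shell
# Assumes LM Studio's server is running on the default port (1234)
# and a DeepSeek R1 model is loaded. The reply's
# choices[0].message then includes a separate reasoning_content field.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-r1-distill-qwen-7b",
    "messages": [{"role": "user", "content": "What is 2 + 2?"}]
  }'
```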
### Tool Use and Function Calling

Use any LLM that supports tool use and function calling through the OpenAI-like API. Docs: Tool Use and Function Calling.
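As an illustration, a tool-calling request against the OpenAI-like endpoint might look like this (the model and function names are hypothetical; the `tools` schema follows the OpenAI convention):

```shell
# Declare a get_weather tool; a capable model may respond with a
# tool_calls entry instead of plain text.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2.5-7b-instruct",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }]
  }'
```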
### `lms get`: download models from the terminal

You can now download models directly from the terminal using a keyword or a full Hugging Face URL. To filter for MLX models only, add `--mlx` to the command.
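For example (the keyword and repository URL are illustrative):

```shell
# Download by keyword
lms get deepseek-r1

# Download from a full Hugging Face URL (example repo)
lms get https://huggingface.co/lmstudio-community/DeepSeek-R1-Distill-Llama-8B-GGUF

# Restrict results to MLX models
lms get deepseek-r1 --mlx
```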