Introducing:

Connect to remote instances of LM Studio, load your models, and use them as if they were local.

Get started

Run AI models, locally and privately.

Use local LLMs like gpt-oss, Qwen3, Gemma3, DeepSeek,
and many more, on your own hardware.
[Screenshot of the LM Studio local AI application]
Developer Resources
JS SDK
npm install @lmstudio/sdk
Docs: lmstudio-js
Python SDK
pip install lmstudio
Docs: lmstudio-python
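As a quick sketch of what the Python SDK looks like in use (this assumes LM Studio is running locally with its server enabled, and the model name below is an illustrative placeholder, not something prescribed by this page):

```python
import lmstudio as lms

# Attach to a model by its identifier; the name here is an
# illustrative placeholder -- use any model you have downloaded.
model = lms.llm("qwen2.5-7b-instruct")

# Ask for a completion and print the reply.
result = model.respond("What can I use LM Studio for?")
print(result)
```

The JS SDK (`@lmstudio/sdk`) follows the same general shape; see the linked docs for each SDK's full API.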