LM Studio as a Local LLM API Server
You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
LM Studio's APIs can be used through an OpenAI compatibility mode, an enhanced REST API, or a client library such as lmstudio-js.
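For example, the OpenAI compatibility mode lets existing OpenAI client code talk to the local server by changing only the base URL. Below is a minimal sketch, assuming the server is running on the common default address `http://localhost:1234/v1` and that a model identified here as `qwen2.5-7b-instruct` is loaded; both are placeholders, so check the Developer tab for your actual address and model identifiers.

```typescript
import OpenAI from "openai";

// Point the official OpenAI client at the local LM Studio server.
// Base URL and model name are assumptions; adjust them to match your setup.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1",
  apiKey: "lm-studio", // placeholder key; the local server typically does not validate it
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "qwen2.5-7b-instruct", // placeholder: use a model you have downloaded
    messages: [{ role: "user", content: "Say hello from a local LLM." }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```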
Client libraries for loading and serving LLMs from LM Studio (see the sketch below):
- lmstudio-js
- lmstudio-python
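As a sketch of the client-library route, the following uses lmstudio-js. The class and method names (`LMStudioClient`, `llm.model`, `respond`) are assumptions based on the SDK's published examples and may differ between versions, and the model identifier is again a placeholder.

```typescript
import { LMStudioClient } from "@lmstudio/sdk";

async function main() {
  // Connects to the LM Studio instance running on this machine.
  const client = new LMStudioClient();

  // Get a handle to a model; substitute an identifier you have downloaded.
  const model = await client.llm.model("qwen2.5-7b-instruct");

  const result = await model.respond("Say hello from a local LLM.");
  console.log(result.content);
}

main();
```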