LM Studio as a Local LLM API Server
You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
LM Studio's APIs can be used through an OpenAI compatibility mode, an enhanced REST API, or a client library such as lmstudio-js.
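For example, once the server is running, any OpenAI-style client can talk to it over HTTP. The sketch below uses plain `fetch` in TypeScript against LM Studio's default local address (`http://localhost:1234/v1`; adjust the host and port to match your Developer tab settings). The model identifier is a placeholder, not a required value:

```ts
// Minimal chat completion request against LM Studio's
// OpenAI-compatible endpoint. Assumes the server was started
// from the Developer tab on the default port (1234).
const response = await fetch("http://localhost:1234/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama-3.2-1b-instruct", // placeholder: use any model loaded in LM Studio
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "What can a local LLM server do?" },
    ],
    temperature: 0.7,
  }),
});

const data = await response.json();
// The response follows the OpenAI schema: choices[0].message.content
console.log(data.choices[0].message.content);
```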
Client libraries for loading and serving LLMs from LM Studio (see the sketch below):

- lmstudio-js (TypeScript/JavaScript)
- lmstudio-python (Python)
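As a rough sketch of the client-library route, the following assumes the lmstudio-js SDK (`@lmstudio/sdk`) and a model already downloaded in LM Studio; exact method names may differ between SDK versions, so treat this as illustrative rather than definitive:

```ts
import { LMStudioClient } from "@lmstudio/sdk";

// Connects to the LM Studio instance running on this machine.
const client = new LMStudioClient();

async function main() {
  // "llama-3.2-1b-instruct" is a placeholder; substitute any
  // model identifier you have downloaded in LM Studio.
  const model = await client.llm.model("llama-3.2-1b-instruct");

  // Send a prompt and print the completed response.
  const result = await model.respond("Summarize what an API server does.");
  console.log(result.content);
}

main();
```

The client library manages model loading for you, whereas the OpenAI compatibility mode expects a model to already be loaded through the LM Studio interface or server settings.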