LM Studio as a Local LLM API Server
Run an LLM API server on localhost with LM Studio
You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
LM Studio's APIs can be used through an OpenAI compatibility mode, an enhanced REST API, or a client library such as lmstudio-js or lmstudio-python.
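For example, the OpenAI compatibility mode lets existing OpenAI client code talk to a local model by pointing it at LM Studio's server. Below is a minimal sketch using the official openai Python package, assuming the server is running on the default port 1234 and a model is already loaded; the model identifier shown is a placeholder.

```python
from openai import OpenAI

# Point the OpenAI client at LM Studio's local server.
# The API key can be any non-empty string; LM Studio does not validate it.
client = OpenAI(
    base_url="http://localhost:1234/v1",
    api_key="lm-studio",
)

response = client.chat.completions.create(
    model="your-model-identifier",  # placeholder: use a model loaded in LM Studio
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Because only the `base_url` changes, code written against OpenAI's API can typically be switched to a local model without other modifications.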
Load and serve LLMs from LM Studio