LM Studio as a Local LLM API Server
You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
LM Studio's APIs can be used through an OpenAI compatibility mode, an enhanced REST API, or a client library such as lmstudio.js.
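For example, here is a minimal sketch of calling the server through the OpenAI compatibility mode using the official `openai` npm package, assuming the server is running at its default address (`http://localhost:1234/v1`) and that a model with the hypothetical identifier `llama-3.2-1b-instruct` is loaded:

```typescript
import OpenAI from "openai";

// Point the OpenAI client at LM Studio's local server instead of api.openai.com.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1", // LM Studio's default server address
  apiKey: "lm-studio", // the local server does not validate the key; any placeholder works
});

const completion = await client.chat.completions.create({
  model: "llama-3.2-1b-instruct", // hypothetical identifier of a locally loaded model
  messages: [{ role: "user", content: "Say hello from a local LLM." }],
});

console.log(completion.choices[0].message.content);
```

Because this mode mirrors the OpenAI API shape, existing OpenAI-based code can typically be pointed at LM Studio by changing only the `baseURL`.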
lmstudio.js
Load and serve LLMs from LM Studio
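A minimal sketch of the same idea with the lmstudio.js client library, assuming the `@lmstudio/sdk` package and a locally downloaded model identified here by the hypothetical name `llama-3.2-1b-instruct`:

```typescript
import { LMStudioClient } from "@lmstudio/sdk";

// Connect to the local LM Studio instance.
const client = new LMStudioClient();

// Get a handle to the model, loading it if it is not already loaded.
const model = await client.llm.model("llama-3.2-1b-instruct");

// Ask the model for a response and print the generated text.
const result = await model.respond("Say hello from a local LLM.");
console.info(result.content);
```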