Documentation
Core
LM Studio REST API
OpenAI Compatible Endpoints
Anthropic Compatible Endpoints
Build with LM Studio's local APIs and SDKs — TypeScript, Python, REST, and OpenAI- and Anthropic-compatible endpoints.
Install llmster for headless deployments

llmster is LM Studio's core, packaged as a daemon for headless deployment on servers, cloud instances, or CI. The daemon runs standalone and does not depend on the LM Studio GUI.
Mac / Linux
curl -fsSL https://lmstudio.ai/install.sh | bash
Windows
irm https://lmstudio.ai/install.ps1 | iex
Basic usage
lms daemon up       # Start the daemon
lms get <model>     # Download a model
lms server start    # Start the local server
lms chat            # Open an interactive session
Learn more: Headless deployments
Super quick start

TypeScript (lmstudio-js)

npm install @lmstudio/sdk
import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();
const model = await client.llm.model("openai/gpt-oss-20b");
const result = await model.respond("Who are you, and what can you do?");

console.info(result.content);
Full docs: lmstudio-js, Source: GitHub
Python (lmstudio-python)

pip install lmstudio
import lmstudio as lms

with lms.Client() as client:
    model = client.llm.model("openai/gpt-oss-20b")
    result = model.respond("Who are you, and what can you do?")
    print(result)
Full docs: lmstudio-python, Source: GitHub
HTTP (LM Studio REST API)

lms server start --port 1234
curl http://localhost:1234/api/v1/chat \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LM_API_TOKEN" \
  -d '{
    "model": "openai/gpt-oss-20b",
    "input": "Who are you, and what can you do?"
  }'
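The same request can be built from any HTTP client. A minimal sketch using only the Python standard library (the endpoint, headers, and body mirror the curl example above; the actual call is left commented out because it needs a running server):

```python
import json
import os
from urllib import request

# Same JSON body as the curl example.
payload = {
    "model": "openai/gpt-oss-20b",
    "input": "Who are you, and what can you do?",
}

req = request.Request(
    "http://localhost:1234/api/v1/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ.get('LM_API_TOKEN', '')}",
    },
    method="POST",
)

# Requires the server started by `lms server start` to be listening:
# with request.urlopen(req) as resp:
#     print(json.load(resp))

print(req.get_method(), req.full_url)
# → POST http://localhost:1234/api/v1/chat
```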
Full docs: LM Studio REST API
Helpful links

This page's source is available on GitHub.
On this page
Install llmster for headless deployments
Super quick start
TypeScript (lmstudio-js)
Python (lmstudio-python)
HTTP (LM Studio REST API)
Helpful links