LM Studio Developer Docs

Build with LM Studio's local APIs and SDKs — TypeScript, Python, REST, and OpenAI‑compatible endpoints.

Get to know the stack

What you can build

  • Chat and text generation with streaming
  • Structured output (JSON schema)
  • Tool calling and local agents
  • Embeddings and tokenization
  • Model management (JIT load, TTL, auto‑evict)
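Structured output, for example, maps onto the `response_format` field of the OpenAI‑compatible chat completions endpoint. A minimal sketch of a request body that constrains generation to a JSON schema (the schema, model name, and prompt here are illustrative):

```python
import json

# Illustrative payload for POST http://localhost:1234/v1/chat/completions
payload = {
    "model": "openai/gpt-oss-20b",
    "messages": [{"role": "user", "content": "Name a book and its author."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "book",
            "schema": {
                "type": "object",
                "properties": {
                    "title": {"type": "string"},
                    "author": {"type": "string"},
                },
                "required": ["title", "author"],
            },
        },
    },
}

# Serialized request body, ready to send with any HTTP client
body = json.dumps(payload)
```

The server then returns a message whose content parses as JSON matching the schema.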

Super quick start

TypeScript (lmstudio-js)

npm install @lmstudio/sdk

import { LMStudioClient } from "@lmstudio/sdk";

// Connects to the LM Studio instance running on this machine
const client = new LMStudioClient();

// Gets a handle to the model, loading it on demand if needed
const model = await client.llm.model("openai/gpt-oss-20b");
const result = await model.respond("Who are you, and what can you do?");

console.info(result.content);

Full docs: lmstudio-js, Source: GitHub

Python (lmstudio-python)

pip install lmstudio

import lmstudio as lms

with lms.Client() as client:
    # Gets a handle to the model, loading it on demand if needed
    model = client.llm.model("openai/gpt-oss-20b")
    result = model.respond("Who are you, and what can you do?")
    print(result)

Full docs: lmstudio-python, Source: GitHub

Try a minimal HTTP request (OpenAI‑compatible)

lms server start --port 1234

curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-oss-20b",
    "messages": [{"role": "user", "content": "Who are you, and what can you do?"}]
  }'

Full docs: OpenAI‑compatible endpoints
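With `"stream": true` added to the request body, the endpoint returns Server‑Sent Events in the usual OpenAI chunk format. A minimal sketch of assembling the content deltas from those `data:` lines (the sample chunks below are synthetic, shaped like the server's streaming output):

```python
import json

def collect_deltas(sse_lines):
    """Concatenate choices[0].delta.content from OpenAI-style SSE lines."""
    out = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        data = line[len("data: "):]
        if data == "[DONE]":  # end-of-stream sentinel
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:  # first chunk may carry only the role
            out.append(delta["content"])
    return "".join(out)

# Synthetic example of a streamed response
sample = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hello"}}]}',
    'data: {"choices": [{"delta": {"content": ", world"}}]}',
    "data: [DONE]",
]
print(collect_deltas(sample))  # prints "Hello, world"
```

In a real client you would iterate over the response body line by line instead of a list, but the parsing is the same.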

This page's source is available on GitHub