Documentation
Build with LM Studio's local APIs and SDKs — TypeScript, Python, REST, and OpenAI‑compatible endpoints.
lmstudio-js
```shell
npm install @lmstudio/sdk
```

```typescript
import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();
const model = await client.llm.model("openai/gpt-oss-20b");
const result = await model.respond("Who are you, and what can you do?");
console.info(result.content);
```
Full docs: lmstudio-js, Source: GitHub
lmstudio-python
```shell
pip install lmstudio
```

```python
import lmstudio as lms

with lms.Client() as client:
    model = client.llm.model("openai/gpt-oss-20b")
    result = model.respond("Who are you, and what can you do?")
    print(result)
```
Full docs: lmstudio-python, Source: GitHub
Start the local server, then send a request to the OpenAI-compatible endpoint:

```shell
lms server start --port 1234
```

```shell
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-oss-20b",
    "messages": [{"role": "user", "content": "Who are you, and what can you do?"}]
  }'
```
Full docs: OpenAI‑compatible endpoints
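Because the endpoint is OpenAI-compatible, the curl request above can also be sent from plain Python with no extra dependencies. This is a minimal sketch, assuming the server is running locally on the default port 1234; the `chat` helper and `LMSTUDIO_URL` constant are illustrative names, not part of the SDK.

```python
import json
import urllib.request

# Assumed local endpoint; matches `lms server start --port 1234` above.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def chat(prompt: str, model: str = "openai/gpt-oss-20b") -> str:
    """Send one chat-completions request to the local LM Studio server
    and return the assistant's reply text."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# chat("Who are you, and what can you do?")  # requires a running server
```

The same request shape works with any OpenAI-compatible client library by pointing its base URL at `http://localhost:1234/v1`.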
This page's source is available on GitHub