Anthropic Compatible Endpoints
Send Messages requests using the Anthropic-compatible API.
| Endpoint | Method | Docs |
|---|---|---|
| /v1/messages | POST | Messages |
For a full walkthrough, see: Use Claude Code with LM Studio.
```bash
export ANTHROPIC_BASE_URL=http://localhost:1234
export ANTHROPIC_AUTH_TOKEN=lmstudio
claude --model openai/gpt-oss-20b
```
When Require Authentication is enabled, LM Studio accepts both x-api-key and the standard Authorization: Bearer <token> header. To learn more about enabling auth in LM Studio, check out Authentication.
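For example, either header form works against the same endpoint. This is a minimal sketch; the port, model name, and $LM_API_TOKEN value are placeholders for your own setup:

```bash
# Request using the Anthropic-style x-api-key header.
# $LM_API_TOKEN is a placeholder for the token configured in LM Studio.
curl http://localhost:1234/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: $LM_API_TOKEN" \
  -d '{"model": "ibm/granite-4-micro", "max_tokens": 16, "messages": [{"role": "user", "content": "ping"}]}'

# Equivalent request using the standard Authorization: Bearer header.
curl http://localhost:1234/v1/messages \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LM_API_TOKEN" \
  -d '{"model": "ibm/granite-4-micro", "max_tokens": 16, "messages": [{"role": "user", "content": "ping"}]}'
```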
Point your Anthropic client (or any HTTP request) at your local LM Studio server.
Note: The following examples assume the server port is 1234.
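If your server isn't already running on that port, you can start it there from the command line. A quick sketch, assuming the lms CLI is installed and that your build supports the --port flag; adjust to match your setup:

```bash
# Start the local LM Studio server on port 1234 (assumes the lms CLI is installed).
lms server start --port 1234
```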
```diff
- curl https://api.anthropic.com/v1/messages \
+ curl http://localhost:1234/v1/messages \
    -H "Content-Type: application/json" \
+   -H "x-api-key: $LM_API_TOKEN" \
    -d '{
-     "model": "claude-4-5-sonnet",
+     "model": "ibm/granite-4-micro",
      "max_tokens": 256,
      "messages": [
        {"role": "user", "content": "Write a haiku about local LLMs."}
      ]
    }'
```
```python
from anthropic import Anthropic

client = Anthropic(
    base_url="http://localhost:1234",
    api_key="lmstudio",
)

message = client.messages.create(
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Hello from LM Studio",
        }
    ],
    model="ibm/granite-4-micro",
)
print(message.content)
```
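If you want tokens as they are generated, the Anthropic Python SDK also provides a streaming helper. A minimal sketch, reusing the client from the example above and assuming LM Studio's Anthropic-compatible endpoint streams responses:

```python
# Streaming sketch using the Anthropic SDK's messages.stream() helper.
# Assumes the `client` from the previous example and that the local
# endpoint supports streamed responses.
with client.messages.stream(
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku about local LLMs."}],
    model="ibm/granite-4-micro",
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```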
If you have not enabled Require Authentication, the x-api-key header is optional.
For the Python example, you can also omit api_key when authentication is disabled.
If you're running into trouble, hop into our Discord and ask in the developers channel.