# OpenAI-Compatible Endpoint Plugin for LM Studio

A plugin for LM Studio that lets you connect to any OpenAI-compatible API through a configurable Base URL and API key.
## Features
- ✅ Custom model input — use any model name
- ✅ Configurable Base URL (OpenAI, Anthropic, local servers, proxies)
- ✅ Response streaming
- ✅ Tool calls / function calling support
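Streamed responses from an OpenAI-compatible endpoint arrive as server-sent events: one `data:` line per chunk, ending with a `data: [DONE]` sentinel. A minimal parsing sketch (the sample payload below is illustrative, not captured from a real response):

```python
import json

def parse_sse_chunks(lines):
    """Yield content deltas from OpenAI-style streaming response lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Illustrative sample of a streamed response body:
sample = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print("".join(parse_sse_chunks(sample)))  # → Hello
```

Tool-call deltas arrive in the same chunk stream (under `delta.tool_calls` rather than `delta.content`) and can be accumulated the same way.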
## Configuration

### Global Settings (Shared)

| Field | Description |
|---|---|
| API Key | Your API key (protected field) |
| Base URL | API endpoint URL (default: `https://api.openai.com/v1`) |
### Model Settings (Per-conversation)

| Field | Description |
|---|---|
| Model | Model name to use (e.g., `gpt-4o-mini`, `claude-3-5-sonnet-20241022`) |
## Supported Providers

| Provider | Base URL | Example Model |
|---|---|---|
| OpenAI | `https://api.openai.com/v1` | `gpt-4o-mini` |
| Anthropic | `https://api.anthropic.com/v1` | `claude-3-5-sonnet-20241022` |
| LM Studio Local | `http://localhost:1234/v1` | `local-model` |
| Ollama | `http://localhost:11434/v1` | `llama3.2` |
| OpenRouter | `https://openrouter.ai/api/v1` | `openai/gpt-4o-mini` |
| z.ai | `https://api.z.ai/api/coding/paas/v4` | `glm-4.7` |
| Custom proxy | `https://your-proxy.com/v1` | — |
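Whatever the provider, the request shape is the same: the configured Base URL, API key, and model compose into one standard chat-completions call. A sketch of that composition (field names follow the OpenAI API; the `base_url`, `api_key`, and model values below are placeholders taken from the table):

```python
import json

def build_chat_request(base_url, api_key, model, messages):
    """Compose URL, headers, and JSON body for an OpenAI-compatible chat call."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages, "stream": True})
    return url, headers, body

# Switching providers is just a different base_url/model pair:
url, headers, body = build_chat_request(
    "http://localhost:11434/v1",            # Ollama, from the table above
    "unused",                               # many local servers ignore the key
    "llama3.2",
    [{"role": "user", "content": "Hi"}],
)
print(url)  # → http://localhost:11434/v1/chat/completions
```

Pointing the same function at `https://openrouter.ai/api/v1` with an OpenRouter key and `openai/gpt-4o-mini` produces an equally valid request, which is the whole point of a configurable Base URL.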