Forked from tupik/openai-compat-endpoint
Use any OpenAI-compatible API with LM Studio by routing chat requests through a configurable base URL (default: OpenRouter). The plugin streams responses, forwards tool calls, and lets you choose any model ID supported by your provider.
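For context, "OpenAI-compatible" here means the provider accepts the standard `/chat/completions` request shape and can stream server-sent events. The sketch below illustrates that wire format only; it is not this plugin's actual code, and `baseUrl`, `apiKey`, and `model` stand in for the plugin's configuration fields described under Configuration.

```ts
// Minimal sketch of a streaming request to an OpenAI-compatible endpoint.
// Illustrative only: the constants stand in for the plugin's config fields.
const baseUrl = "https://openrouter.ai/api/v1"; // configurable base URL
const apiKey = process.env.OPENAI_COMPAT_API_KEY ?? "";
const model = "allenai/molmo-2-8b:free"; // any model ID your provider supports

async function streamChat(prompt: string): Promise<void> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model,
      stream: true, // ask the provider to stream SSE chunks
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok || !res.body) throw new Error(`HTTP ${res.status}`);

  // Each SSE line looks like `data: {...}`; the stream ends with `data: [DONE]`.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done || !value) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any partial line for the next chunk
    for (const line of lines) {
      if (!line.startsWith("data: ") || line === "data: [DONE]") continue;
      const delta = JSON.parse(line.slice(6)).choices?.[0]?.delta;
      if (delta?.content) process.stdout.write(delta.content);
    }
  }
}

streamChat("Hello!").catch(console.error);
```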
## Requirements

- `lms` CLI available (LM Studio must be run at least once before `lms` works).
- Optional (Windows): bootstrap `lms` into PATH if needed:

```
cmd /c %USERPROFILE%/.lmstudio/bin/lms.exe bootstrap
```
## Development

```
npm install
npm run dev
```

`npm run dev` runs `lms dev`, which starts the plugin dev server, verifies `manifest.json`, installs dependencies if needed, and rebuilds on changes. Note: `lms dev` is part of LM Studio Plugins (beta).
## Configuration

Configuration is done in LM Studio plugin settings.

Global (shared across chats):

- **API Key** (e.g. `sk-or-...` for OpenRouter)
- **Base URL** (default: `https://openrouter.ai/api/v1`)

Per-chat:

- **Model ID** (e.g. `allenai/molmo-2-8b:free`). If the model is missing, the plugin will return an error asking you to set it.
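For reference, LM Studio plugins typically declare such fields with `createConfigSchematics()` from `@lmstudio/sdk`. The sketch below is an assumption about how fields like these could be declared; the field keys are hypothetical and this is not necessarily the repo's actual `src/config.ts`.

```ts
import { createConfigSchematics } from "@lmstudio/sdk";

// Per-chat settings (hypothetical field keys, mirroring the list above).
export const configSchematics = createConfigSchematics()
  .field(
    "modelId",
    "string",
    { displayName: "Model ID", hint: "e.g. allenai/molmo-2-8b:free" },
    "", // empty by default; the plugin errors until you set it
  )
  .build();

// Global settings shared across chats.
export const globalConfigSchematics = createConfigSchematics()
  .field(
    "apiKey",
    "string",
    // The real plugin stores the key in a protected (masked) field;
    // the exact SDK option for that is omitted here.
    { displayName: "API Key", hint: "e.g. sk-or-... for OpenRouter" },
    "",
  )
  .field(
    "baseUrl",
    "string",
    { displayName: "Base URL" },
    "https://openrouter.ai/api/v1",
  )
  .build();
```

LM Studio renders these fields in the plugin settings UI: per-chat fields come from `configSchematics`, shared ones from `globalConfigSchematics`.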
## Project layout

- `src/` — plugin sources; `.dist/` is generated by the TypeScript build.
- Key files: `src/generator.ts`, `src/config.ts`, `manifest.json`.

## Publishing

```
npm run push
```

`npm run push` wraps `lms push` to upload the current folder as a plugin revision. `lms push` prompts for confirmation unless you pass `-y`.
If you have not authenticated with LM Studio Hub yet, run:

```
lms login
```
## Security

Never hardcode API keys. Use the protected global config field instead.
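As a sketch of the intended pattern (accessor names are assumed from the LM Studio plugin SDK docs and may differ; the field keys reuse the hypothetical configuration sketch above):

```ts
// Sketch only: reads credentials from plugin config instead of hardcoding them.
import { configSchematics, globalConfigSchematics } from "./config";

export function resolveSettings(ctl: any) {
  // Global, shared-across-chats fields (API key is stored protected).
  const apiKey = ctl.getGlobalPluginConfig(globalConfigSchematics).get("apiKey");
  const baseUrl = ctl.getGlobalPluginConfig(globalConfigSchematics).get("baseUrl");
  // Per-chat field.
  const modelId = ctl.getPluginConfig(configSchematics).get("modelId");

  // Matches the behavior described above: fail fast when no model is set.
  if (!modelId) {
    throw new Error("Please set a Model ID in the plugin's per-chat settings.");
  }
  return { apiKey, baseUrl, modelId };
}
```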
## License

ISC