Forked from tupik/openai-compat-endpoint
# AGENTS.md
## Project structure

The core plugin code lives in `src/`:

- `index.ts` defines the LM Studio entry point.
- `generator.ts` adapts LM Studio chat streams to OpenAI-compatible APIs.
- `config.ts` holds the plugin config schema and defaults.
- `constants.ts` stores shared model lists and constants.

Repository metadata and build config are in `manifest.json`, `package.json`, and `tsconfig.json`. TypeScript build output is written to `dist/` (generated).

## Commands

- `npm install` installs dependencies.
- `npm run dev` runs `lms dev` for local plugin development in LM Studio.
- `npm run push` runs `lms push` to publish the plugin.
- `npx tsc -p tsconfig.json` compiles TypeScript to `dist/` (not wired up as a script).
- There is no test script configured yet.
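To make the role of `generator.ts` concrete, here is a hedged sketch of adapting an async token stream into OpenAI-style `chat.completion.chunk` objects. All names (`toOpenAiChunks`, `fakeTokens`, the chunk shape shown) are illustrative assumptions, not the plugin's actual code.

```typescript
// Hypothetical sketch: adapting a token stream to OpenAI-compatible
// streaming chunks. Not the actual generator.ts implementation.

interface ChatCompletionChunk {
  object: "chat.completion.chunk";
  choices: {
    index: number;
    delta: { content?: string };
    finish_reason: string | null;
  }[];
}

async function* toOpenAiChunks(
  tokens: AsyncIterable<string>
): AsyncGenerator<ChatCompletionChunk> {
  for await (const token of tokens) {
    // Each token becomes one delta chunk, as in the OpenAI streaming API.
    yield {
      object: "chat.completion.chunk",
      choices: [{ index: 0, delta: { content: token }, finish_reason: null }],
    };
  }
  // A terminal chunk with an empty delta signals completion.
  yield {
    object: "chat.completion.chunk",
    choices: [{ index: 0, delta: {}, finish_reason: "stop" }],
  };
}

// Demo: a fake upstream token stream standing in for an LM Studio chat stream.
async function* fakeTokens(): AsyncGenerator<string> {
  yield "Hello";
  yield ", world";
}

async function collectText(): Promise<string> {
  const parts: string[] = [];
  for await (const chunk of toOpenAiChunks(fakeTokens())) {
    const delta = chunk.choices[0].delta.content;
    if (delta !== undefined) parts.push(delta);
  }
  return parts.join("");
}
```

The real adapter also has to carry model names, ids, and timestamps on each chunk; this sketch only shows the stream-shape translation.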
## Code style

- Match the existing TypeScript settings (see `tsconfig.json`).
- Use camelCase for variables/functions, PascalCase for types, and UPPER_SNAKE for constants (e.g., `MAX_REQUESTS`).
- Keep code in `src/` and preserve the existing module structure.
- No test framework or coverage targets are defined. If you add tests, place them under `src/__tests__/` (or similar) and add a `test` script to `package.json`.
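A minimal illustration of these naming conventions (all identifiers here are invented for the example, not taken from the codebase):

```typescript
// UPPER_SNAKE for constants
const MAX_REQUESTS = 8;

// PascalCase for types
interface RetryPolicy {
  maxRequests: number;
  backoffMs: number;
}

// camelCase for functions and variables
function buildRetryPolicy(maxRequests: number): RetryPolicy {
  const backoffMs = 250;
  return { maxRequests, backoffMs };
}
```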
## Commits

This workspace does not include a `.git` history, so no existing commit convention can be inferred. Use short, imperative subjects (e.g., "Add model selection fallback").
For PRs, include a brief description of the change and notes on how it was verified.
## Security & configuration

Do not hardcode API keys. Use the LM Studio global config fields (`apiKey`, `baseUrl`), which are marked protected. If defaults change, update the placeholders in `config.ts` and call it out in the PR description.
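The rule above can be sketched as follows. This is a hedged, self-contained illustration of keeping only placeholder defaults in code: the `PluginConfig` shape, `CONFIG_DEFAULTS`, `resolveConfig`, and the example base URL are all assumptions for the sketch, not the actual `config.ts` schema or LM Studio's config API.

```typescript
// Hypothetical config shape mirroring the apiKey/baseUrl fields
// mentioned above (illustrative only).
interface PluginConfig {
  apiKey: string; // marked protected in LM Studio's global config
  baseUrl: string;
}

const CONFIG_DEFAULTS: PluginConfig = {
  // Never ship a real key; the user supplies it via global config.
  apiKey: "",
  // Placeholder default; if this changes, update config.ts and note it in the PR.
  baseUrl: "https://api.openai.com/v1",
};

// Merge user-supplied values over the placeholder defaults.
function resolveConfig(overrides: Partial<PluginConfig> = {}): PluginConfig {
  return { ...CONFIG_DEFAULTS, ...overrides };
}
```

The point of the pattern: secrets exist only as empty placeholders in the repo, and every real value arrives at runtime from the protected config fields.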