## Done

- Project structure fixed: `src/` → `dist/`; `.lmstudio/entry.ts` imports from `dist/`.
- Model dropdown works: `models-cache.json` is loaded on startup.
- Two config fields: `Model` → dropdown for selection; `Custom Model ID` → manual input field.
- Model selection logic: priority is `customModel` > model from list > auto; if `customModel` is filled → use it.
- Fixed LM Studio validation error: `generate()` reads config through a schema with type `"string"`; the `options` list is added after schema creation.

## Known issues

**Problem:** when selecting a model from the dropdown, the value is NOT copied to the `Custom Model ID` field.
**Possible solutions:** `onUpdate` hooks on the config fields.

**Status:** not critical, works as-is.
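Because the two fields are not synced, model resolution has to consult both and apply the priority above (`customModel` > model from list > auto). A minimal sketch of that resolver; the `PluginConfig` shape and `resolveModel` name are illustrative, not the plugin's actual code:

```typescript
// Hypothetical config shape: mirrors the two UI fields described above.
interface PluginConfig {
  customModel?: string; // free-text "Custom Model ID" field
  model?: string;       // value chosen in the "Model" dropdown
}

function resolveModel(config: PluginConfig): string {
  // 1. A filled customModel always wins.
  if (config.customModel && config.customModel.trim() !== "") {
    return config.customModel.trim();
  }
  // 2. Otherwise fall back to the dropdown selection.
  if (config.model && config.model !== "auto") {
    return config.model;
  }
  // 3. Finally, let the endpoint pick automatically.
  return "auto";
}

console.log(resolveModel({ customModel: "gpt-4o-mini" })); // "gpt-4o-mini"
console.log(resolveModel({ model: "llama-3.1-8b" }));      // "llama-3.1-8b"
console.log(resolveModel({}));                             // "auto"
```

With this ordering, the missing copy-to-`Custom Model ID` behavior is harmless: whichever field the user actually filled in wins.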
**Problem:** if `models-cache.json` doesn't exist or is outdated, the model list is empty on first launch.

**Possible solutions:**

**Status:** needs work.
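One way to tackle this is a cache-or-fetch strategy with a freshness check: use the cache while it is fresh, otherwise refetch and rewrite it. A sketch under assumptions; `fetchModels` here is a stand-in for the real call in `api.ts`, and the one-day TTL is an invented value:

```typescript
import * as fs from "node:fs";

const CACHE_FILE = "models-cache.json";
const CACHE_TTL_MS = 24 * 60 * 60 * 1000; // assumed TTL: one day

// Placeholder for the real API call in api.ts.
async function fetchModels(): Promise<string[]> {
  return ["model-a", "model-b"];
}

async function loadModels(): Promise<string[]> {
  try {
    const stat = fs.statSync(CACHE_FILE);
    const fresh = Date.now() - stat.mtimeMs < CACHE_TTL_MS;
    if (fresh) {
      // Fresh cache: no network round-trip needed.
      return JSON.parse(fs.readFileSync(CACHE_FILE, "utf8"));
    }
  } catch {
    // Cache missing or unreadable: fall through to a network fetch.
  }
  const models = await fetchModels();
  fs.writeFileSync(CACHE_FILE, JSON.stringify(models));
  return models;
}
```

This keeps first-launch behavior correct (fetch, then cache) while avoiding a fetch on every startup.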
**Problem:** the `onlyFreeModels` flag currently doesn't affect the UI list: only free models are always shown.

**Possible solutions:**

**Status:** needs work.
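Making the flag actually drive the list could be a simple filter step before the dropdown options are built. A minimal sketch, assuming each model entry carries pricing fields (the `ModelEntry` shape and field names are hypothetical):

```typescript
// Hypothetical model entry with per-token pricing.
interface ModelEntry {
  id: string;
  pricing: { prompt: number; completion: number };
}

function filterModels(models: ModelEntry[], onlyFreeModels: boolean): ModelEntry[] {
  if (!onlyFreeModels) return models; // flag off: show everything
  // Flag on: keep only entries with zero cost on both sides.
  return models.filter(m => m.pricing.prompt === 0 && m.pricing.completion === 0);
}
```

With this in place the flag toggles between "free only" and "all models" instead of being ignored.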
**Problem:** there is no explicit user notification when model loading fails.

**Possible solutions:**

**Status:** partially implemented (failures go to the logs only).
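One option is to return the failure message alongside the result instead of only logging it, so the UI layer can surface it (for example as a placeholder entry in the dropdown). All names here are hypothetical, not the plugin's actual API:

```typescript
// Discriminated union: either a model list or a user-visible error message.
type LoadResult =
  | { ok: true; models: string[] }
  | { ok: false; message: string };

async function loadModelsSafe(fetchModels: () => Promise<string[]>): Promise<LoadResult> {
  try {
    return { ok: true, models: await fetchModels() };
  } catch (err) {
    const message = `Failed to load models: ${err instanceof Error ? err.message : String(err)}`;
    console.error(message);          // current behavior: logged only
    return { ok: false, message };   // proposed: also returned for the UI
  }
}
```

The caller then branches on `ok` and shows `message` to the user rather than silently presenting an empty list.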
**Problem:** `generateDisplayName()` may incorrectly format some model names.

**Possible solutions:**

**Status:** works acceptably.
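A sketch of a more defensive formatter that leaves digit-bearing tokens (versions, parameter counts) untouched, so names like `3.1` or `8b` are not mangled; the real `generateDisplayName()` in the plugin may differ:

```typescript
function generateDisplayName(modelId: string): string {
  // Drop a vendor prefix like "meta-llama/" but keep the rest intact.
  const base = modelId.includes("/") ? modelId.split("/").pop()! : modelId;
  // Split on separators, capitalize plain words, leave tokens containing
  // digits as-is so versions and sizes survive unchanged.
  return base
    .split(/[-_]/)
    .map(part => (/\d/.test(part) ? part : part.charAt(0).toUpperCase() + part.slice(1)))
    .join(" ");
}

console.log(generateDisplayName("meta-llama/llama-3.1-8b-instruct")); // "Llama 3.1 8b Instruct"
```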
## LM Studio SDK notes

- The `options` list can be attached after schema creation.
- `onUpdate` hooks for config fields: unclear whether they exist (or the documentation is incomplete).
- `getPluginConfig()`: for type `"select"`, the value must be present in `options`.

## Commands

```bash
# Build TypeScript
npm run build

# Run in development mode (install in LM Studio)
npm run dev

# Push updated version
npm run push
```

```bash
lms dev --install -y
```
## Project structure

```
D:\LMS-src\o!-endpoint\
├── src/                  # TypeScript source
│   ├── api.ts            # API calls (fetchModels)
│   ├── config.ts         # Default config schema
│   ├── file-cache.ts     # Model caching
│   ├── generator.ts      # Response generation (uses string schema)
│   ├── index.ts          # Entry point, cache loading
│   ├── models.ts         # Model loading logic
│   └── schema.ts         # Schema creation with options (select)
├── dist/                 # Compiled .js files
├── .lmstudio/
│   ├── entry.ts          # Entry point for LM Studio
│   └── dev.js            # Bundle from lms dev
├── models-cache.json     # Model cache (created on startup)
└── package.json
```
## Stack

- `@lmstudio/sdk` v1.4.0
- Forked from tupik/openai-compat-endpoint