openai-compat-endpoint-ionos

Forked from lmstudio/openai-compat-endpoint

Includes the IONOS AI Model Hub

IONOS AI Model Hub Adapter for LM Studio

BETA

Expose the OpenAI-compatible openai/gpt-oss-120b and meta-llama/Llama-3.3-70B-Instruct models hosted on the IONOS AI Model Hub inside LM Studio. The plugin sends chat-completion requests through the LM Studio runtime and never stores your API key in source control.

Currently, openai/gpt-oss-120b is in beta (see https://docs.ionos.com/cloud/ai/ai-model-hub/models/llms/openai-gpt-oss-120b).

The plugin also runs with meta-llama/Llama-3.3-70B-Instruct (see https://docs.ionos.com/cloud/ai/ai-model-hub/models/llms/meta-llama-3-3-70b).
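
For reference, here is a minimal sketch of the kind of OpenAI-compatible chat-completion call the plugin forwards. The base URL and the `IONOS_API_KEY` environment variable name are assumptions for illustration only (check the IONOS docs linked above for the current region-specific endpoint); reading the key from the environment is what keeps it out of source control.

```ts
import OpenAI from "openai";

// Illustrative only: the base URL below is an assumption — look up the
// OpenAI-compatible endpoint for your region in the IONOS AI Model Hub docs.
// The API key comes from the environment, so it never lands in source control.
const client = new OpenAI({
  baseURL: "https://openai.inference.de-txl.ionos.com/v1",
  apiKey: process.env.IONOS_API_KEY,
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "meta-llama/Llama-3.3-70B-Instruct", // or "openai/gpt-oss-120b" (beta)
    messages: [{ role: "user", content: "Hello from the IONOS AI Model Hub!" }],
  });
  console.log(completion.choices[0]?.message.content);
}

main();
```

Swapping the `model` field to `openai/gpt-oss-120b` targets the beta model instead.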