
LM Studio as a Local LLM API Server

Run an LLM API server on localhost with LM Studio

You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
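Once the server is running, you can verify it is reachable from code. Below is a minimal sketch in TypeScript, assuming the server is listening on the default address http://localhost:1234; adjust the host and port if you changed them or bound the server to a network interface:

```typescript
// Query the server's model list to confirm it is reachable.
const response = await fetch("http://localhost:1234/v1/models");
const { data } = await response.json();

for (const model of data) {
  console.log(model.id); // identifiers of the models the server exposes
}
```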

LM Studio's APIs can be used through an OpenAI compatibility mode, an enhanced REST API, or a client library such as lmstudio-js.
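For example, the OpenAI compatibility mode lets existing OpenAI client code talk to the local server by changing only the base URL. A minimal sketch using the openai package for TypeScript; the model identifier is a placeholder, so substitute one of the models loaded in LM Studio:

```typescript
import OpenAI from "openai";

// Point the standard OpenAI client at the local LM Studio server.
// LM Studio does not check the API key, but the client requires one to be set.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1",
  apiKey: "lm-studio",
});

const completion = await client.chat.completions.create({
  model: "your-loaded-model", // placeholder; use a model identifier from LM Studio
  messages: [{ role: "user", content: "Say hello from a local LLM." }],
});

console.log(completion.choices[0].message.content);
```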

API options

- OpenAI compatibility endpoints
- Enhanced REST API
- Client libraries such as lmstudio-js
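As an illustration of the client-library route, here is a minimal sketch with lmstudio-js. It assumes the @lmstudio/sdk package is installed and that the model identifier matches one you have downloaded in LM Studio:

```typescript
import { LMStudioClient } from "@lmstudio/sdk";

// Connect to the local LM Studio instance.
const client = new LMStudioClient();

// Get a handle to a model, loading it on demand if needed.
// "qwen2.5-7b-instruct" is a placeholder; use a model you have locally.
const model = await client.llm.model("qwen2.5-7b-instruct");

const result = await model.respond("Summarize what an LLM API server does.");
console.log(result.content);
```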

Load and serve LLMs from LM Studio
