
LM Studio as a Local LLM API Server

Run an LLM API server on localhost with LM Studio

You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
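To check that the server is reachable, you can query it over plain HTTP. Below is a minimal sketch, assuming the server was started from the Developer tab (or with the `lms server start` CLI command) and is listening on LM Studio's default port, 1234:

```python
import json
import urllib.request

# List the models available on the local LM Studio server.
# Assumes the default address; adjust host/port if you changed them.
with urllib.request.urlopen("http://localhost:1234/v1/models") as resp:
    models = json.load(resp)

for model in models.get("data", []):
    print(model["id"])
```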

LM Studio's APIs can be used through its REST API, through client libraries such as lmstudio-js and lmstudio-python, and through OpenAI compatibility endpoints.

API options

- OpenAI compatibility endpoints
- LM Studio REST API
- Client libraries: lmstudio-js (TypeScript) and lmstudio-python (Python)
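
For example, because the server exposes OpenAI compatibility endpoints, the official `openai` Python package can talk to it by overriding the base URL. A minimal sketch; the model identifier below is a placeholder for whichever model you have loaded:

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # point the client at the local server
    api_key="lm-studio",  # LM Studio typically ignores the key; any string works
)

response = client.chat.completions.create(
    model="your-model-identifier",  # placeholder; use an id from /v1/models
    messages=[{"role": "user", "content": "Say hello from a local LLM."}],
)
print(response.choices[0].message.content)
```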

Load and serve LLMs from LM Studio
