LM Studio as a Local LLM API Server

You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.

LM Studio's APIs can be used through an OpenAI compatibility mode, an enhanced REST API, or through a client library like lmstudio.js.
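For example, the OpenAI compatibility mode lets you point the official `openai` client at the local server instead of OpenAI's endpoint. Below is a minimal TypeScript sketch; it assumes the server is running on its default port (1234) and that `your-model-identifier` is replaced with the identifier of a model you have loaded:

```typescript
import OpenAI from "openai";

// Point the official OpenAI client at LM Studio's local server.
// Port 1234 is the default; adjust it if you changed the port in the Developer tab.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1",
  apiKey: "lm-studio", // the local server does not check the key, but the SDK requires one
});

const response = await client.chat.completions.create({
  // Placeholder: use the identifier of a model loaded in LM Studio
  model: "your-model-identifier",
  messages: [{ role: "user", content: "Tell me a joke." }],
});

console.log(response.choices[0].message.content);
```

Because only the `baseURL` changes, existing code written against the OpenAI API can typically be reused as-is.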

API options

Load and serve LLMs from LM Studio
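Alternatively, the lmstudio.js client library can load and query models directly, without going through the OpenAI-style HTTP endpoints. The following is a rough sketch based on the `@lmstudio/sdk` package; the model key `llama-3.2-1b-instruct` is a placeholder for any model you have downloaded:

```typescript
import { LMStudioClient } from "@lmstudio/sdk";

// Connects to the LM Studio instance running on this machine.
const client = new LMStudioClient();

// Get a handle to the model, loading it into memory if it isn't loaded yet.
// The model key is a placeholder; substitute one of your downloaded models.
const model = await client.llm.model("llama-3.2-1b-instruct");

const result = await model.respond("What can you do for me?");
console.log(result.content);
```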