List Models
List available models via the OpenAI-compatible endpoint.
GET /v1/models

curl http://localhost:1234/v1/models
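The same endpoint can be queried from any OpenAI-compatible client. Below is a minimal sketch using the official openai Python package; the base_url and placeholder api_key are assumptions (the local server typically does not validate the key), and the exact response fields may differ from this sketch.

# Minimal sketch: list models from the local LM Studio server using the
# OpenAI Python client. base_url and api_key values here are assumptions.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# GET /v1/models returns an OpenAI-style list object; each entry carries
# an "id" identifying a model known to the server.
for model in client.models.list():
    print(model.id)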