Local LLM Server
You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
The server can be used either in OpenAI compatibility mode or as a server for lmstudio.js.
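In OpenAI compatibility mode, the server accepts OpenAI-style requests on `http://localhost:1234` by default (the port is configurable in the Developer tab). A minimal sketch of such a request; the model name is a placeholder, and the actual network call is commented out so the sketch runs without a live server:

```typescript
// Default base URL for LM Studio's local server (1234 is the default port).
const baseUrl = "http://localhost:1234/v1";

// An OpenAI-style chat completion request body. The model name is a
// placeholder: LM Studio serves whichever model you have loaded.
const payload = {
  model: "placeholder-model",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is 2 + 2?" },
  ],
  temperature: 0.7,
};

// Sending the request (commented out so the sketch runs offline):
// const res = await fetch(`${baseUrl}/chat/completions`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(payload),
// });
// console.log((await res.json()).choices[0].message.content);

console.log(`POST ${baseUrl}/chat/completions`);
```

Because the endpoint mirrors OpenAI's API shape, existing OpenAI client libraries can be pointed at the local server by overriding their base URL.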
lmstudio.js
Load and serve LLMs from LM Studio
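The lmstudio.js SDK (published as `@lmstudio/sdk`) talks to this same local server from code. A rough sketch, assuming the SDK's `LMStudioClient` with `llm.load` and `respond` calls; names may differ across SDK versions, and the model identifier is a placeholder:

```typescript
// Hedged sketch of driving the server from lmstudio.js (@lmstudio/sdk).
// The exact API surface may differ between SDK versions.
async function demo() {
  // Dynamic import so the sketch loads even without the SDK installed.
  const sdkName = "@lmstudio/sdk";
  const { LMStudioClient } = await import(sdkName);

  const client = new LMStudioClient(); // connects to the local server

  // Load a model you have downloaded in LM Studio (placeholder identifier).
  const model = await client.llm.load("placeholder-model-identifier");

  // Ask for a completion and print the final text.
  const result = await model.respond([{ role: "user", content: "Hello!" }]);
  console.log(result.content);
}

// demo();  // uncomment with LM Studio running and a model downloaded
console.log(typeof demo);
```

Unlike OpenAI compatibility mode, the SDK can also load and unload models programmatically, not just query one that is already running.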