Documentation
You can serve local LLMs from LM Studio's Developer tab, either on localhost or on the network.
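Once the server is running, you can confirm it is reachable from another process. The sketch below is a minimal connectivity check, assuming the server's default localhost port 1234 and the `openai` Python package; the API key is an arbitrary placeholder, since the local server does not validate it.

```python
# Minimal connectivity check for a local LM Studio server.
# Assumes the default localhost:1234; change base_url if you serve
# on another host/port or over the network.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

# List the models the local server currently exposes.
for model in client.models.list():
    print(model.id)
```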
LM Studio's APIs can be used through the REST API, through client libraries such as lmstudio-js and lmstudio-python, or through OpenAI compatibility endpoints (see the Python sketch after the list below):
- lmstudio-js: LM Studio's SDK for TypeScript/JavaScript
- lmstudio-python: LM Studio's SDK for Python
- REST API: load and serve LLMs from LM Studio
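As a sketch of the client-library route, the snippet below uses lmstudio-python's convenience API. The model key is an illustrative assumption; substitute any model you have downloaded in LM Studio.

```python
# Hedged sketch using lmstudio-python (pip install lmstudio).
# The model key "qwen2.5-7b-instruct" is an illustrative assumption;
# use any model available in your LM Studio install.
import lmstudio as lms

model = lms.llm("qwen2.5-7b-instruct")  # get a handle, loading the model if needed
result = model.respond("What can the LM Studio REST API do?")
print(result)
```

The same request could be made through the OpenAI compatibility endpoints instead; the SDK route simply adds model management (such as loading on demand) on top of plain completions.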