lmstudio-python (Python SDK)
lmstudio-python provides you with a set of APIs to interact with LLMs, embedding models, and agentic flows.
lmstudio-python is available as a PyPI package. You can install it using pip:
pip install lmstudio
For the source code and open-source contributions, visit lmstudio-python on GitHub.
import lmstudio as lms
model = lms.llm("llama-3.2-1b-instruct")
result = model.respond("What is the meaning of life?")
print(result)
The above code requires the Llama 3.2 1B model. If you don't have the model, run the following command in the terminal to download it.
lms get llama-3.2-1b-instruct
Read more about lms get in LM Studio's CLI documentation here.
There are two distinct approaches for working with the LM Studio Python SDK (the example above uses the first).
The first is the interactive convenience API (listed as "Python (convenience API)" in examples), which focuses on the use of a default LM Studio client instance for convenient interactions at a Python prompt, or when using Jupyter notebooks.
The second is a scoped resource API (listed as "Python (scoped resource API)" in examples), which uses context managers to ensure that allocated resources (such as network connections) are released deterministically, rather than potentially remaining open until the entire process is terminated.
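For comparison, the quickstart above could be written with the scoped resource API roughly as follows. This is a minimal sketch that assumes the client is created with lms.Client() and the model is obtained via client.llm.model(); check the SDK reference for the exact client interface.

import lmstudio as lms

# The with block ensures the client's allocated resources (such as its
# network connection) are released deterministically when the block exits.
with lms.Client() as client:
    model = client.llm.model("llama-3.2-1b-instruct")
    result = model.respond("What is the meaning of life?")
    print(result)

The generated output is the same as in the convenience API example; the difference is that the connection to LM Studio is scoped to the with block rather than living for the remainder of the process.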