Get Model Info
You can access general information and metadata about a model from a loaded instance of that model.
In the examples below, the LLM reference can be replaced with an embedding model reference without requiring any other changes.
import lmstudio as lms

# Get a handle to the currently loaded LLM, then print its metadata
model = lms.llm()
print(model.get_info())
LlmInstanceInfo.from_dict({
  "architecture": "qwen2",
  "contextLength": 4096,
  "displayName": "Qwen2.5 7B Instruct 1M",
  "format": "gguf",
  "identifier": "qwen2.5-7b-instruct",
  "instanceReference": "lpFZPBQjhSZPrFevGyY6Leq8",
  "maxContextLength": 1010000,
  "modelKey": "qwen2.5-7b-instruct-1m",
  "paramsString": "7B",
  "path": "lmstudio-community/Qwen2.5-7B-Instruct-1M-GGUF/Qwen2.5-7B-Instruct-1M-Q4_K_M.gguf",
  "sizeBytes": 4683073888,
  "trainedForToolUse": true,
  "type": "llm",
  "vision": false
})