lms log stream lets you inspect the exact strings LM Studio sends to and receives from models, and (new in 0.3.26) stream server logs. This is useful for debugging prompt templates, model IO, and server operations.
Flags:

-s, --source (optional) : string
  Source of logs: model or server (default: model)

--stats (optional) : flag
  Print prediction stats when available

--filter (optional) : string
  Filter model-source logs: input, output, or both (pass input,output)

--json (optional) : flag
  Output logs as newline-delimited JSON
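These flags compose. For example, the following (using the documented short form -s) streams only the formatted user input as newline-delimited JSON:

lms log stream -s model --filter input --json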
Stream model IO (default):
lms log stream
Stream server logs:
lms log stream --source server
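Server logs are plain text, so they can be narrowed with standard shell tools. As a sketch, assuming (hypothetically) that error lines contain the word "error":

lms log stream --source server | grep -i error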
Filter model IO:

# Only the formatted user input
lms log stream --source model --filter input

# Only the model output (emitted once the message completes)
lms log stream --source model --filter output

# Both directions
lms log stream --source model --filter input,output
Emit JSON:
lms log stream --source model --filter input,output --json
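Because each entry arrives as JSON on its own line, the stream pipes cleanly into line-oriented JSON tools such as jq. A minimal sketch that pretty-prints every entry, whatever fields it carries:

lms log stream --source model --filter input,output --json | jq .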
Include prediction stats:
lms log stream --source model --filter output --stats
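To watch the output and stats live while also keeping a copy on disk, a standard tee pipeline works (assuming the stream is written to stdout; session.log is an arbitrary filename):

lms log stream --source model --filter output --stats | tee session.log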