lms log stream
Stream logs from LM Studio. Useful for debugging prompts sent to the model.
lms log stream lets you inspect the exact strings LM Studio sends to and receives from models and, as of 0.3.26, stream server logs as well. This is useful for debugging prompt templates, model input/output, and server operations.
Flags
-s, --source (optional) : string
Source of logs: model or server (default: model)
--stats (optional) : flag
Print prediction stats when available
--filter (optional) : string
Filter for model source: input, output, or both
--json (optional) : flag
Output logs as JSON (newline separated)
Quick start
Stream model IO (default):

```shell
lms log stream
```

Stream server logs:

```shell
lms log stream --source server
```

Filter model logs

```shell
# Only the formatted user input
lms log stream --source model --filter input

# Only the model output (emitted once the message completes)
lms log stream --source model --filter output

# Both directions
lms log stream --source model --filter input,output
```

JSON output and stats

Emit JSON:

```shell
lms log stream --source model --filter input,output --json
```

Include prediction stats:

```shell
lms log stream --source model --filter output --stats
```
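Because `--json` emits one JSON object per line, the stream is easy to consume programmatically. Below is a minimal sketch of a consumer you could pipe the output into; the helper name `iter_log_events` is ours, and the exact keys inside each event are not specified here, so treat any field names you rely on as assumptions to verify against a few raw lines first.

```python
import json
import sys


def iter_log_events(stream):
    """Yield one parsed event per newline-delimited JSON log line.

    Skips blank lines. The schema of each event is an assumption:
    inspect real output from `lms log stream --json` before relying
    on specific keys.
    """
    for line in stream:
        line = line.strip()
        if not line:
            continue
        yield json.loads(line)


if __name__ == "__main__":
    # Usage sketch:
    #   lms log stream --source model --filter input,output --json | python consume_logs.py
    for event in iter_log_events(sys.stdin):
        # Print the keys of each event to discover the actual schema.
        print(sorted(event.keys()))
```

Piping through a script like this is handy for counting events, extracting only prompts, or forwarding logs elsewhere.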