lms log stream

Stream logs from LM Studio. Useful for debugging prompts sent to the model.

lms log stream lets you inspect the exact strings LM Studio sends to and receives from models, and (new in 0.3.26) stream server logs. This is useful for debugging prompt templates, model I/O, and server operations.

Flags

-s, --source (optional) : string

Source of logs: model or server (default: model)

--stats (optional) : flag

Print prediction stats when available

--filter (optional) : string

Filter for model logs: input, output, or both (comma-separated, e.g. input,output)

--json (optional) : flag

Output logs as JSON (newline separated)

Quick start

Stream model I/O (the default source):

lms log stream

Stream server logs:

lms log stream --source server
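
Logs are written to stdout, so they compose with standard shell tools. As a minimal sketch (assuming a POSIX shell; the filename is illustrative), you can save server logs to a file while still watching them live:

# Persist server logs while following them in the terminal
lms log stream --source server | tee server.log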

Filter model logs

# Only the formatted user input
lms log stream --source model --filter input

# Only the model output (emitted once the message completes)
lms log stream --source model --filter output

# Both directions
lms log stream --source model --filter input,output
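
Because the stream is plain text on stdout, ordinary shell filtering layers on top of it. For example, to confirm that a template variable actually lands in the formatted prompt (the search pattern is illustrative):

# Watch formatted inputs for a specific substring
lms log stream --source model --filter input | grep "my-template-marker"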

JSON output and stats

Emit JSON:

lms log stream --source model --filter input,output --json
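
Since each record is a standalone JSON document on its own line (JSONL), tools like jq can process the stream record by record. A minimal sketch (jq '.' pretty-prints each record as it arrives; the fields inside each record depend on your LM Studio version):

# Pretty-print each streamed JSON record
lms log stream --source model --filter input,output --json | jq .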

Include prediction stats:

lms log stream --source model --filter output --stats
