LM Studio 0.3.26

2025-09-15

LM Studio 0.3.26 is now available as a stable release. Update in-app or download the latest version.

CLI log upgrades

We've added new features to lms log stream. Previously, lms log stream output only the formatted user messages. Starting with LM Studio 0.3.26, lms log stream gains a few new options:

  • --source: choose the log source (e.g. server, model)
  • --filter: filter logs by type (e.g. input, output, or input,output)
  • --json: output logs in JSON format
  • --stats: output tok/sec and other stats. Works with --source model
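As a sketch of how the --json output could be consumed programmatically, here is a minimal Python filter. The field names used below (type, modelIdentifier, output) are assumptions inferred from the example log output later in this post, not a documented JSON schema; adjust them to whatever the actual records contain.

```python
import json

def keep(record: dict, kinds: set) -> bool:
    """Keep a log record whose type ends with one of the given kinds,
    e.g. 'llm.prediction.input' matches {'input'}."""
    return record.get("type", "").rsplit(".", 1)[-1] in kinds

# Illustrative records mirroring the example log fields shown in this post;
# in practice you would read lines piped from `lms log stream --source model --json`.
lines = [
    '{"type": "llm.prediction.input", "modelIdentifier": "gpt-oss-20b-mlx", "input": "hello"}',
    '{"type": "llm.prediction.output", "modelIdentifier": "gpt-oss-20b-mlx", "output": "Hi!"}',
]

for line in lines:
    rec = json.loads(line)
    if keep(rec, {"output"}):
        print(rec["modelIdentifier"], rec["output"])
```

You could pipe the real stream into a script like this (e.g. `lms log stream --source model --json | python filter.py`) to post-process model traffic.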

Server logs

Use lms log stream --source server to stream logs from the HTTP API server.

Terminal
$ lms log stream --source server
Streaming logs from LM Studio

[2025-09-15 15:07:55][INFO][LM STUDIO SERVER] Success! HTTP server listening on port 1234
[2025-09-15 15:07:55][INFO]
[2025-09-15 15:07:55][INFO][LM STUDIO SERVER] Supported endpoints:
[2025-09-15 15:07:55][INFO][LM STUDIO SERVER] → GET  http://localhost:1234/v1/models
[2025-09-15 15:07:55][INFO][LM STUDIO SERVER] → POST http://localhost:1234/v1/chat/completions
[2025-09-15 15:07:55][INFO][LM STUDIO SERVER] → POST http://localhost:1234/v1/completions
[2025-09-15 15:07:55][INFO][LM STUDIO SERVER] → POST http://localhost:1234/v1/embeddings
[2025-09-15 15:07:55][INFO]
[2025-09-15 15:07:55][INFO][LM STUDIO SERVER] Logs are saved into /Users/yb/.lmstudio/server-logs
[2025-09-15 15:07:55][INFO] Server started.
[2025-09-15 15:07:55][INFO] Just-in-time model loading active.

Model log streaming

You can now stream model output, as well as user input.

Log formatted user messages

lms log stream --source model --filter input

Log model output

Note that the model message is queued until it is complete, and only then printed.

lms log stream --source model --filter output
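The queuing behavior can be pictured with a tiny sketch. This is illustrative only, not LM Studio's actual implementation: streamed tokens accumulate in a buffer and the message is emitted once, when complete.

```python
# Illustrative sketch of the buffering described above: tokens queue up
# while the model streams, and the full message prints only at the end.
buffer = []
for token in ["Hello", "!", " How", " can", " I", " help", "?"]:
    buffer.append(token)       # each streamed token is queued, not printed
print("".join(buffer))         # the complete message is printed once
```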

Log both input and output

Use lms log stream --source model --filter input,output to stream logs from both the user and the model.

Example output

Terminal
$ lms log stream --source model --filter input,output
Streaming logs from LM Studio

timestamp: 9/15/2025, 3:16:39 PM
type: llm.prediction.input
modelIdentifier: gpt-oss-20b-mlx
modelPath: lmstudio-community/gpt-oss-20b-mlx-8bit
input: <|start|>system<|message|>You are ChatGPT, a large language model trained by OpenAI.
Knowledge cutoff: 2024-06
Current date: 2025-09-15
Reasoning: medium
# Valid channels: analysis, commentary, final. Channel must be included for every message.<|end|><|start|>user<|message|>hello<|end|><|start|>assistant

timestamp: 9/15/2025, 3:16:40 PM
type: llm.prediction.output
modelIdentifier: gpt-oss-20b-mlx
output: <|channel|>analysis<|message|>User says "hello". We should respond politely. Provide greeting. Possibly ask how can help. That is straightforward.<|end|><|start|>assistant<|channel|>final<|message|>Hello! 👋 How can I assist you today?

Desktop app improvements

  • Use native context menus across the app for a consistent feel
  • Add an "Enclose in Folder" bulk action when selecting multiple chats
  • Extra mechanisms to ensure child processes are cleaned up when LM Studio receives SIGKILL

Linux fixes

  • Fix rag-v1 on Linux. In 0.3.25, the built-in embedding model was not included, causing it to fail.

Full Changelog

Build 6

  • The LM Studio CLI (lms) now supports streaming server logs, as well as model output.
    • Use lms log stream --source server for server logs
    • Use lms log stream --source model --filter input,output for both model input and output logs
    • Append --json to get JSON-formatted logs
    • Learn more: https://lmstudio.ai/docs/cli/log-stream
  • Fixed a bug where clicking Eject in the developer page would sometimes open the configuration panel as well

Build 5

  • [Windows] Fixed a bug where Mission Control buttons were hard to click / behaved like a dragging surface

Build 4

  • Fixed a bug where it was sometimes not possible to drag the app action bar when there was an image under it
  • New context menu 'Enclose in Folder' options when selecting multiple chats

Build 3

  • [Linux] Fixed bug where rag-v1 did not work due to a missing embedding model
  • [UI] Switch to native context menus

Build 2

  • Restored lms ls --detailed for backwards compatibility. Use lms ls or lms ls --json instead
  • Fixed bug where child processes were not cleaned up after LM Studio received SIGKILL

Build 1

  • [MLX] Add badge for MXFP4 quantization type

Resources