LM Studio 0.3.16
LM Studio 0.3.16 is available now as a stable release. This version includes the public preview of community presets, automatic deletion of least recently used Runtime Extension Packs, and a way to use LLMs as text embedding models. It also introduces the lms chat command to the CLI, allowing you to chat with a model in the terminal.
📣 An earlier build of 0.3.16 (build 6) was missing engines within the app bundle on Windows. If this was the first version you ever installed, please head to Ctrl + Shift + R to install the engines manually, or download and reinstall 0.3.16 build 7 from https://lmstudio.ai/download.
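This release also lets you use LLMs as text embedding models. Once a model is loaded, embeddings can be requested through the server's OpenAI-compatible `/v1/embeddings` endpoint. Below is a minimal sketch, assuming the server is running on its default port (1234); the model name `my-embedding-model` is a placeholder:

```python
import json
import urllib.request

# Assumption: LM Studio's local server is running at its default address.
EMBEDDINGS_URL = "http://localhost:1234/v1/embeddings"


def build_embedding_request(model: str, texts: list[str]) -> dict:
    """Build an OpenAI-style /v1/embeddings JSON payload."""
    return {"model": model, "input": texts}


def embed(model: str, texts: list[str]) -> list[list[float]]:
    """POST the payload and return one embedding vector per input text."""
    payload = json.dumps(build_embedding_request(model, texts)).encode()
    req = urllib.request.Request(
        EMBEDDINGS_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-style responses carry one vector per input under "data".
    return [item["embedding"] for item in body["data"]]
```

The exact response shape follows the OpenAI embeddings format; if your server runs on a non-default port, adjust `EMBEDDINGS_URL` accordingly.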
What's New:
- Ability to publish your presets to the LM Studio community hub and share them with others
- Staff Picks polish: introduce new aggregate format that combines formats (e.g. GGUF and MLX) into a single entry
- Custom Settings for models
- Add "Offload KV Cache to GPU Memory" option to model load options and GPU settings
- Add `lms chat` command to the CLI to chat with a model in the terminal (Thanks @mayfer)
- LM Studio REST API (/api/v0): return model capabilities in GET /models response, e.g. `"capabilities": ["tool_use"]`
- Auto-deletion of least recently used Runtime Extension Packs
- Show System Prompt button in chat top bar when sidebar is collapsed
- Use Cmd / Ctrl + Shift + D to create a duplicate of the current chat
- Use Cmd / Ctrl + E to open the System Prompt editor
- Use Cmd / Ctrl + W to close the System Prompt editor tab when it is open
- App Settings (⌘/Ctrl + ,): sections now have their own tabs for easier navigation
- Add button to access downloads panel in "User" UI mode
- Added a dropdown in the model editor (⚙️ in My Models) to allow overriding the domain type of a model
- Add "Reveal in Finder" context menu option on the chat sidebar body
- [MLX] Register chat_template.jinja as a source for chat templates
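The REST API change above means clients can discover what each model supports before using it. A minimal sketch of consuming GET /api/v0/models, assuming the server runs on its default port (1234) and that the response wraps models in a "data" list with "id" and "capabilities" fields, as suggested by the `"capabilities": ["tool_use"]` example:

```python
import json
import urllib.request


def fetch_models(base_url: str = "http://localhost:1234") -> list[dict]:
    """GET /api/v0/models from a running LM Studio server.

    Assumption: the response is an OpenAI-style list object with the
    models under a "data" key.
    """
    with urllib.request.urlopen(f"{base_url}/api/v0/models") as resp:
        return json.load(resp)["data"]


def models_with_capability(models: list[dict], capability: str) -> list[str]:
    """Return the ids of models whose "capabilities" list includes `capability`."""
    return [m["id"] for m in models if capability in m.get("capabilities", [])]
```

For example, `models_with_capability(fetch_models(), "tool_use")` would list the loadable models that advertise tool use.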
Bug Fixes:
- Fixed `"OpenSquareBracket !== CloseStatement"` error
- Fixed `[object Object]` error when using RAG
- Fixed `ModuleNotFoundError: No module named 'mlx_engine'`