Introducing lms: LM Studio's CLI
2024-05-02
Today, alongside LM Studio 0.2.22, we're releasing the first version of lms, LM Studio's companion CLI tool. With lms you can load/unload models, start/stop the API server, and inspect raw LLM input (not just output). It's developed on GitHub, and we welcome issues and PRs from the community.
lms ships with LM Studio and lives in LM Studio's working directory, under ~/.cache/lm-studio/bin/. When you update LM Studio, your lms version is updated along with it. If you're a developer, you can also build lms from source.
lms on your system

You need to run LM Studio at least once before you can use lms. Afterwards, open your terminal and run one of these commands, depending on your operating system:
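As a sketch based on the install path mentioned above, the bootstrap step looks roughly like this; check the lms README for the exact, current commands for your platform:

```shell
# macOS / Linux: run the bundled binary's bootstrap command
# (path assumes LM Studio's default working directory noted above)
~/.cache/lm-studio/bin/lms bootstrap

# Windows (from cmd or PowerShell):
cmd /c %USERPROFILE%/.cache/lm-studio/bin/lms.exe bootstrap
```

Bootstrapping adds lms to your PATH so you can invoke it from anywhere.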
Afterwards, open a new terminal window and run lms to see the list of available subcommands.
lms is MIT licensed and developed in this repository on GitHub: https://github.com/lmstudio-ai/lms
Use lms to automate and debug your workflows

lms ls lists the models you have downloaded locally. The listing reflects the current LM Studio models directory, which you set in the 📂 My Models tab in the app.
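As a quick sketch of the listing commands (lms ps, for showing loaded models, is an assumption not spelled out in this post):

```shell
# List all models downloaded into the LM Studio models directory
lms ls

# List only the models currently loaded into memory
lms ps
```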
When loading a model with lms load, passing --gpu=1.0 means 'attempt to offload 100% of the computation to the GPU'. You can also give the loaded model a custom name with --identifier. This is useful if you want to keep the model identifier consistent, for example when referring to it from API requests.
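Put together, a load command might look like this; the model path and identifier value here are illustrative, not from the post:

```shell
# Load a model with 100% GPU offload and a stable, user-chosen identifier
lms load TheBloke/phi-2-GGUF --gpu=1.0 --identifier="my-model"
```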
lms log stream

lms log stream allows you to inspect the exact input string that goes to the model. This is particularly useful for debugging prompt template issues and other unexpected LLM behaviors.
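For example, a debugging session might pair the API server (started with lms server start, the server command alluded to earlier) with the log stream:

```shell
# Start the local API server in one terminal...
lms server start

# ...then, in another terminal, watch the exact prompt text
# each request sends to the loaded model
lms log stream
```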
lms uses lmstudio.js to interact with LM Studio. You can build your own programs that do what lms does, and much more. lmstudio.js is in pre-release public alpha. Follow along on GitHub: https://github.com/lmstudio-ai/lmstudio.js.
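As a rough sketch of what such a program might look like: the client class and method names below are assumptions based on the alpha SDK and may change, so consult the lmstudio.js repository for the actual API. The model path is illustrative.

```typescript
// Hypothetical sketch against the alpha lmstudio.js SDK; names may differ.
import { LMStudioClient } from "@lmstudio/sdk";

async function main() {
  // Connects to a running LM Studio instance on this machine
  const client = new LMStudioClient();

  // Load a model programmatically, mirroring `lms load`
  const model = await client.llm.load("TheBloke/phi-2-GGUF");

  // Stream a completion token by token
  const prediction = model.complete("The capital of France is");
  for await (const text of prediction) {
    process.stdout.write(text);
  }
}

main();
```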
Discuss all things lms and lmstudio.js in the new #dev-chat channel on the LM Studio Discord Server.
Download LM Studio for Mac / Windows / Linux from https://lmstudio.ai.
LM Studio 0.2.22 AMD ROCm - Technology Preview is available at https://lmstudio.ai/rocm
LM Studio on Twitter: https://twitter.com/lmstudio