lms chat
Use lms chat to talk to a local model directly in the terminal. This is handy for quick experiments or scripting.
Flags
[model] (optional) : string
Identifier of the model to use. If omitted, you will be prompted to pick one.
-p, --prompt (optional) : string
Send a one-off prompt and print the response to stdout before exiting.
-s, --system-prompt (optional) : string
Custom system prompt for the chat session.
--stats (optional) : flag
Show detailed prediction statistics after each response.
--ttl (optional) : number
Seconds to keep the model loaded after the chat ends (default: 3600).
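These flags can be combined in a single invocation. As a sketch, assuming the flags compose as documented above (my-model is a placeholder; substitute a model you have locally):
lms chat my-model -s "You are a code reviewer." -p "Review the following function" --stats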
Start an interactive chat
lms chat
You will be prompted to pick a model if one is not provided.
Chat with a specific model
lms chat my-model
Send a single prompt and exit
Use -p to print the response to stdout and exit instead of staying interactive:
lms chat my-model -p "Summarize this release note"
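Because -p exits after printing the response, lms chat slots into shell scripts. A minimal sketch, assuming my-model is available locally (the variable name and follow-up command are illustrative):
reply=$(lms chat my-model -p "Summarize this release note")
printf '%s\n' "$reply"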
Set a system prompt
Use -s to give the model a custom system prompt for the session:
lms chat my-model -s "You are a terse assistant. Reply in two sentences."
Keep the model loaded after chatting
Use --ttl to set how many seconds the model stays loaded after the chat ends:
lms chat my-model --ttl 600
Pipe input from another command
lms chat reads from stdin, so you can pipe content directly into a prompt:
cat my_file.txt | lms chat -p "Summarize this, please"
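Any command that writes to stdout can supply the input. For example, a sketch that summarizes recent git history (assuming you run it inside a git repository):
git log --oneline -n 10 | lms chat -p "Summarize these commits in one paragraph"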