lms chat
Start a chat session with a local model from the command line.
Use lms chat to talk to a local model directly in the terminal. This is handy for quick experiments or scripting.
Parameters
[model] (optional) : string
Identifier of the model to use. If omitted, you will be prompted to pick one.
-p, --prompt (optional) : string
Send a one-off prompt and print the response to stdout before exiting
-s, --system-prompt (optional) : string
Custom system prompt for the chat
--stats (optional) : flag
Show detailed prediction statistics after each response
--ttl (optional) : number
Seconds to keep the model loaded after the chat ends (default: 3600)
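These options can be combined in a single invocation. A sketch, assuming a locally installed model named `my-model` (a placeholder):

```shell
# Hypothetical model name; substitute one installed locally.
# -s sets the system prompt, -p sends one prompt non-interactively,
# --stats prints prediction statistics after the response, and
# --ttl keeps the model loaded for 10 minutes after the chat ends.
lms chat my-model \
  -s "You are a terse assistant." \
  -p "Explain what a TTL is in one sentence." \
  --stats \
  --ttl 600
```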
Start an interactive chat
```shell
lms chat
```
You will be prompted to pick a model if one is not provided.
Chat with a specific model
```shell
lms chat my-model
```
Send a single prompt and exit
Use -p to print the response to stdout and exit instead of staying interactive:
```shell
lms chat my-model -p "Summarize this release note"
```
Set a system prompt
```shell
lms chat my-model -s "You are a terse assistant. Reply in two sentences."
```
Keep the model loaded after chatting
```shell
lms chat my-model --ttl 600
```
Pipe input from another command
lms chat reads from stdin, so you can pipe content directly into a prompt:
```shell
cat my_file.txt | lms chat -p "Summarize this, please"
```
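Because -p exits after printing the response, this pattern also works inside scripts. A sketch, assuming a placeholder model name and a directory of text files:

```shell
# Summarize every .txt file in the current directory, writing each
# summary next to the original. "my-model" is a placeholder; input
# is fed to lms chat via stdin redirection, equivalent to the pipe above.
for f in *.txt; do
  lms chat my-model -p "Summarize this, please" < "$f" > "${f%.txt}.summary.txt"
done
```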