Prompt Template
By default, LM Studio will automatically configure the prompt template based on the model file's metadata.
However, you can customize the prompt template for any model.
Head over to the My Models tab and click on the gear ⚙️ icon to edit the model's default parameters.
You can open the My Models tab by pressing ⌘ + 3 on Mac, or Ctrl + 3 on Windows / Linux.

When a model doesn't come with prompt template information, LM Studio will surface the Prompt Template config box in the 🧪 Advanced Configuration sidebar.
The Prompt Template config box in the chat sidebar
You can make this config box always show up by right-clicking the sidebar and selecting Always Show Prompt Template.
You can express the prompt template in Jinja.
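For illustration, here is a minimal ChatML-style Jinja template of the kind many chat models ship with; the exact markers and structure vary by model, so treat this as a sketch rather than a template for any specific model:

```jinja
{#- Loop over the chat messages and wrap each in role markers -#}
{%- for message in messages -%}
<|im_start|>{{ message['role'] }}
{{ message['content'] }}<|im_end|>
{%- endfor -%}
{#- Cue the model to generate the assistant's reply -#}
<|im_start|>assistant
```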
You can also express the prompt template manually by specifying message role prefixes and suffixes.
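The manual option amounts to concatenating each message between its role's prefix and suffix. The sketch below shows the idea in Python; the marker strings and the function name are hypothetical examples, not LM Studio defaults:

```python
# Sketch of manual prompt templating: each message is wrapped in a
# role-specific prefix and suffix, then the assistant prefix is appended
# so the model knows to respond. Marker strings here are illustrative.
def apply_manual_template(messages,
                          user_prefix="<|user|>\n", user_suffix="\n",
                          assistant_prefix="<|assistant|>\n",
                          assistant_suffix="\n"):
    parts = []
    for m in messages:
        if m["role"] == "user":
            parts.append(user_prefix + m["content"] + user_suffix)
        else:
            parts.append(assistant_prefix + m["content"] + assistant_suffix)
    parts.append(assistant_prefix)  # cue the model's turn
    return "".join(parts)

prompt = apply_manual_template([{"role": "user", "content": "Hello!"}])
print(prompt)
```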