# Working with Chats
SDK methods such as `model.respond()`, `model.applyPromptTemplate()`, and `model.act()` take a chat as input. There are several ways to represent a chat in the SDK.
You can use an array of messages to represent a chat. Here is an example with the `.respond()` method.
```typescript
const prediction = model.respond([
  { role: "system", content: "You are a resident AI philosopher." },
  { role: "user", content: "What is the meaning of life?" },
]);
```
If your chat has only a single user message, you can use a plain string to represent the chat. Here is an example with the `.respond()` method.
```typescript
const prediction = model.respond("What is the meaning of life?");
```
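The string form is shorthand for a chat containing a single user message. A minimal sketch of that equivalence, where `ChatMessage` and `normalizeChat` are illustrative names introduced here and not part of the SDK:

```typescript
// Illustrative only: ChatMessage and normalizeChat are not SDK exports.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// A bare string behaves like a one-element array holding a user message.
function normalizeChat(input: string | ChatMessage[]): ChatMessage[] {
  return typeof input === "string" ? [{ role: "user", content: input }] : input;
}

console.log(normalizeChat("What is the meaning of life?"));
// → [{ role: "user", content: "What is the meaning of life?" }]
```

Either form can be passed wherever a chat is expected; the array form is needed as soon as you want a system prompt or more than one turn.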
## Chat Helper Class

For more complex tasks, it is recommended to use the `Chat` helper class. It provides various commonly used methods to manage the chat. Here is an example with the `Chat` class.
```typescript
import { Chat } from "@lmstudio/sdk";

const chat = Chat.empty();
chat.append("system", "You are a resident AI philosopher.");
chat.append("user", "What is the meaning of life?");
const prediction = model.respond(chat);
```
You can also quickly construct a `Chat` object using the `Chat.from` method.
```typescript
const chat = Chat.from([
  { role: "system", content: "You are a resident AI philosopher." },
  { role: "user", content: "What is the meaning of life?" },
]);
```