Embedding
Generate embeddings for input text. Embeddings are vector representations of text that capture semantic meaning; they are a building block for RAG (Retrieval-Augmented Generation) and other similarity-based tasks.
Prerequisite: Get an Embedding Model
If you don't yet have an embedding model, you can download one, such as nomic-ai/nomic-embed-text-v1.5, using the following command:
lms get nomic-ai/nomic-embed-text-v1.5
Create Embeddings
To convert a string to a vector representation, pass it to the embed method on the corresponding embedding model handle.
import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();

// Get a handle to the embedding model
const model = await client.embedding.model("nomic-embed-text-v1.5");

// embedding is the vector representation of the input text
const { embedding } = await model.embed("Hello, world!");
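As a quick illustration of a similarity-based task, the sketch below embeds two strings and compares them with cosine similarity. The cosineSimilarity helper is not part of the SDK; it is a hypothetical utility written here for illustration, and it assumes the returned embedding is a plain array of numbers produced by the same model for both inputs.

import { LMStudioClient } from "@lmstudio/sdk";

// Hypothetical helper (not part of the SDK): cosine similarity of two vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

const client = new LMStudioClient();
const model = await client.embedding.model("nomic-embed-text-v1.5");

const { embedding: first } = await model.embed("The weather is nice today.");
const { embedding: second } = await model.embed("It is sunny outside.");

// Higher values indicate more semantically similar texts.
console.log("Similarity:", cosineSimilarity(first, second));

The same comparison is the core step in retrieval: embed your documents ahead of time, embed the query at request time, and rank documents by their similarity to the query.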