Description
Embedding model from Google, built from Gemma 3 (with T5Gemma initialization) and the same research and technology used to create Gemini models
Updated on September 26

README
Open embedding model from Google, built from Gemma 3 (with T5Gemma initialization) and the same research and technology used to create Gemini models. At 300M parameters, it is state-of-the-art for its size.
Well-suited for search and retrieval tasks, as well as classification, clustering, and semantic similarity search.
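As an illustrative sketch of the semantic similarity use case: documents can be ranked against a query by the cosine similarity of their embedding vectors. The tiny vectors below are stand-ins; in practice each would be an embedding returned by the model.

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: dot product over norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_by_similarity(query_vec, doc_vecs):
    """Return document indices sorted from most to least similar to the query."""
    scores = [cosine_similarity(query_vec, v) for v in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)
```

The same scoring works for clustering and deduplication; only the way the scores are consumed changes.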
Trained with data in 100+ spoken languages. Supports a context length of 2048 tokens.
Use it via LM Studio's embedding APIs in Python or TypeScript, or through the OpenAI-compatible /v1/embeddings endpoint.
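A minimal sketch of calling the OpenAI-compatible /v1/embeddings endpoint with only the standard library. The base URL (localhost:1234 is LM Studio's default local-server port) and the model identifier string are assumptions; substitute the values shown in your own LM Studio instance.

```python
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1/embeddings"  # assumed local server address
MODEL = "text-embedding-embeddinggemma-300m"      # assumed model identifier

def build_payload(texts, model=MODEL):
    # The endpoint accepts a single string or a list of strings as "input".
    return {"model": model, "input": list(texts)}

def embed(texts):
    req = urllib.request.Request(
        BASE_URL,
        data=json.dumps(build_payload(texts)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The response follows OpenAI's schema: one embedding object per input string.
    return [item["embedding"] for item in body["data"]]
```

The official Python and TypeScript OpenAI clients work the same way if pointed at the local base URL.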
For more technical details, refer to the paper: EmbeddingGemma: Powerful and Lightweight Text Representations.