Open embedding model from Google, built from Gemma 3 (with T5Gemma initialization) and the same research and technology used to create Gemini models. 300M parameters, state-of-the-art for its size.
Well-suited for search and retrieval, as well as classification, clustering, and semantic similarity tasks.
Trained with data in 100+ spoken languages. Supports a context length of 2048 tokens.
Use it via LM Studio's embedding APIs in Python or TypeScript, or through the OpenAI-compatible /v1/embeddings endpoint.
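
As a minimal sketch of the /v1/embeddings route, the example below points the openai Python client at a local LM Studio server and then computes a cosine similarity between two embeddings. The port (1234 is LM Studio's default), the placeholder API key, and the model identifier are assumptions; check the exact identifier shown for this model in your local LM Studio instance.

```python
import math
from openai import OpenAI

# Local LM Studio server exposing the OpenAI-compatible API.
# Any non-empty string works as the API key for a local server.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.embeddings.create(
    model="embeddinggemma-300m",  # hypothetical identifier; verify locally
    input=[
        "What is the capital of France?",
        "Paris is the capital of France.",
    ],
)

vectors = [item.embedding for item in response.data]
print(len(vectors), "embeddings of dimension", len(vectors[0]))

# Cosine similarity between the two embeddings, illustrating the
# semantic similarity use case mentioned above.
dot = sum(a * b for a, b in zip(vectors[0], vectors[1]))
norm = math.sqrt(sum(a * a for a in vectors[0])) * math.sqrt(sum(b * b for b in vectors[1]))
print("cosine similarity:", dot / norm)
```

The same request shape works from any OpenAI-compatible client: POST /v1/embeddings with a model name and an input string or list of strings, and read the vectors from the returned data array.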
When you download this model, LM Studio picks the source that best suits your machine (you can override this).