Open embedding model from Google, built from Gemma 3 (with T5Gemma initialization) and the same research and technology used to create Gemini models. At 300M parameters, it delivers state-of-the-art performance for its size.
Well-suited for a variety of embedding tasks, including search and retrieval, classification, clustering, and semantic similarity.
Trained with data in 100+ spoken languages. Supports a context length of 2048 tokens.
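A minimal sketch of the semantic similarity search the model enables: rank documents by cosine similarity between a query embedding and document embeddings. Toy 4-dimensional vectors stand in for real model output here, and the commented-out model ID is an assumption, not confirmed by this card.

```python
import numpy as np

# In practice, embeddings would come from the model, e.g. via sentence-transformers:
# from sentence_transformers import SentenceTransformer
# model = SentenceTransformer("google/embeddinggemma-300m")  # hypothetical model ID
# doc_vecs = model.encode(list_of_documents)

def cosine_sim(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy embeddings: semantically close texts get nearby vectors.
doc_vecs = {
    "a cat sleeping on a couch": np.array([0.95, 0.05, 0.0, 0.0]),
    "quarterly sales report":    np.array([0.0, 1.0, 0.1, 0.0]),
    "kitten playing with yarn":  np.array([0.9, 0.1, 0.05, 0.0]),
}
query_vec = np.array([1.0, 0.0, 0.0, 0.0])  # e.g. embedding of "feline photos"

# Rank documents by similarity to the query, highest first.
ranked = sorted(doc_vecs, key=lambda d: cosine_sim(query_vec, doc_vecs[d]), reverse=True)
print(ranked[0])  # the cat-related documents rank above the sales report
```

The same ranking primitive underlies retrieval, clustering (group vectors by pairwise similarity), and classification (compare against class-prototype embeddings).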