2.9K Downloads
The mid-sized option in the Gemma 2 model family. Built by Google, using the same research and technology used to create the Gemini models.
Trained for tool use
Last Updated 24 days ago
Gemma 2 keeps the same extremely large vocabulary as release 1.1, which tends to help with multilingual and coding proficiency.
Gemma 2 9B was trained on a broad dataset of 8 trillion tokens, about 30% more than Gemma 1.1, using a similar mix of data sources.
For more details, check out the blog post here: https://huggingface.co/blog/gemma2
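To make the large vocabulary and basic usage concrete, here is a minimal sketch of loading the model programmatically with the Hugging Face transformers library. The google/gemma-2-9b-it checkpoint, a transformers version with Gemma 2 support (4.42 or newer), the accelerate package, and enough memory for a 9B model are assumptions, not part of this listing.

```python
# Minimal sketch (not part of this listing): loading Gemma 2 9B Instruct with
# Hugging Face transformers. Assumes transformers >= 4.42, accelerate, and
# access to the gated google/gemma-2-9b-it repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2-9b-it"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The very large vocabulary mentioned above is visible directly on the tokenizer.
print("vocabulary size:", len(tokenizer))

# Chat-style generation via the tokenizer's built-in chat template.
messages = [{"role": "user", "content": "Summarize what Gemma 2 is in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Note that none of this is needed inside LM Studio itself, which downloads the model files and handles chat for you; the snippet is only relevant if you want to use the model outside the app.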
When you download this model, LM Studio picks the source that will best suit your machine (you can override this)