Description
The mid-sized option of the Gemma 2 model family. Built by Google using the same research and technology behind the Gemini models.
Updated on May 24

README
Gemma 2 retains the extremely large vocabulary of release 1.1, which tends to help with multilingual and coding proficiency.
Gemma 2 9B was trained on a broad dataset of 8 trillion tokens, about 30% more than Gemma 1.1, drawn from similar data sources.
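The "about 30% more" figure can be sanity-checked with quick arithmetic. This sketch assumes (not stated in this card) that Gemma 1.1 was trained on roughly 6 trillion tokens, per the original Gemma report:

```python
# Training-corpus comparison behind the "~30% larger" claim.
# Assumption: Gemma 1.1 corpus ~6 trillion tokens (original Gemma report).
gemma_1_1_tokens = 6e12
gemma_2_9b_tokens = 8e12  # stated above

increase = gemma_2_9b_tokens / gemma_1_1_tokens - 1
print(f"Gemma 2 9B corpus is ~{increase:.0%} larger")  # ~33%, consistent with the ~30% figure
```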
For more details, see the Hugging Face blog post: https://huggingface.co/blog/gemma2