The large option in the Gemma 2 model family, built by Google from the same research and technology used to create the Gemini models.

14.7K Downloads

3 stars

1 fork

Minimum system memory

16GB

Tags

27B
gemma2

Last updated

Updated on May 24 by lmmy

README

Gemma 2 27B Instruct by Google

Gemma 2 carries over the same extremely large vocabulary (roughly 256K tokens) from release 1.1, which tends to help with multilingual and coding proficiency.
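
To see that vocabulary for yourself, here is a minimal sketch using the Hugging Face transformers tokenizer. The google/gemma-2-27b-it checkpoint is an assumption about which upload this build is based on, and the repo is gated, so it requires accepting Google's license and authenticating first.

```python
from transformers import AutoTokenizer

# Assumed checkpoint; the repo is gated behind Google's license.
tok = AutoTokenizer.from_pretrained("google/gemma-2-27b-it")

# The "extremely large vocabulary" referenced above.
print(f"Vocabulary size: {tok.vocab_size}")

# A large vocabulary tends to split code and non-English text into
# fewer tokens; compare the token counts of two short snippets.
for text in ["def fibonacci(n):", "こんにちは、世界"]:
    print(repr(text), "->", len(tok(text)["input_ids"]), "tokens")
```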

Gemma 2 27B was trained on a broad dataset of 13 trillion tokens, more than twice as many as Gemma 1.1 and roughly 60% more than the 9B model, using similar data sources including:

  • Web Documents: A diverse collection of web text exposes the model to a broad range of linguistic styles, topics, and vocabulary; the content is primarily in English.
  • Code: Exposure to code helps the model learn the syntax and patterns of programming languages, which improves its ability to generate code and answer code-related questions (see the usage sketch after this list).
  • Mathematics: Training on mathematical text helps the model learn logical reasoning and symbolic representation and to address mathematical queries.
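
To try those coding and math abilities directly, here is a hedged usage sketch rather than the exact setup behind this build: it assumes the gated google/gemma-2-27b-it weights, the transformers and accelerate packages, and hardware with roughly 54 GB of memory for bf16 (the 16 GB minimum above most likely refers to a quantized variant).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-2-27b-it"  # assumed checkpoint; gated behind Google's license

tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~54 GB in bf16; use a quantized build on smaller machines
    device_map="auto",           # requires the accelerate package
)

# The instruct ("it") variant ships with a chat template; Gemma 2 uses
# only user/model turns, so the prompt goes in a single user message.
messages = [
    {"role": "user", "content": "Write a Python function that returns the nth Fibonacci number."}
]
input_ids = tok.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
# Print only the newly generated tokens, not the echoed prompt.
print(tok.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```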

For more details, check out the Gemma 2 blog post: https://huggingface.co/blog/gemma2

Sources

The underlying model files used by this model: