Description
A slightly larger 12B-parameter model from Mistral AI, NeMo offers a long 128k-token context window, advanced world knowledge, and function calling for developers.
Stats
48.2K Downloads
11 stars
Capabilities
Minimum system memory
Tags
Last updated
README
Mistral NeMo was trained on context lengths up to 128k tokens; longer contexts are supported, but output quality may degrade beyond that window.
The model performs strongly across a range of benchmarks, including multilingual evaluations.
For more details, check the blog post here: https://mistral.ai/news/mistral-nemo/
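The function calling mentioned in the description is typically exercised through an OpenAI-compatible chat API when serving the model locally. The sketch below shows only the shape of a tool-calling request payload; the model identifier and the `get_weather` tool are illustrative assumptions, not part of this model card.

```python
import json

def build_tool_call_request(user_message: str) -> dict:
    """Build a chat-completion request exposing one tool to the model.

    The payload follows the OpenAI-compatible tools schema; the
    "mistral-nemo" identifier and get_weather tool are hypothetical.
    """
    return {
        "model": "mistral-nemo",  # assumed local model identifier
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_weather",  # hypothetical example tool
                    "description": "Look up current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }

request = build_tool_call_request("What's the weather in Paris?")
print(json.dumps(request, indent=2))
```

If the model decides to use the tool, the response contains a `tool_calls` entry with the function name and JSON-encoded arguments, which the caller executes and feeds back as a `tool`-role message.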
Sources
The underlying model files used by this model
Based on
GGUF