LFM2

LFM2 is a new generation of hybrid models developed by Liquid AI, specifically designed for edge AI and on-device deployment. It sets a new standard in terms of quality, speed, and memory efficiency.

Models
  • LFM2-350M (220.00 MB)
  • LFM2-700M (450.00 MB)
  • LFM2-1.2B (700.00 MB)

Memory Requirements

To run the smallest LFM2, you need at least 220 MB of RAM. The largest one may require up to 700 MB.

Capabilities

LFM2 models are available in GGUF and MLX formats.
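
The GGUF builds can also be used outside LM Studio with any llama.cpp-based runtime. Below is a minimal sketch using the llama-cpp-python bindings; the file name is a placeholder, so point it at whichever LFM2 GGUF you have downloaded.

```python
# Minimal sketch: load an LFM2 GGUF with llama-cpp-python and run one chat turn.
# The model_path below is a hypothetical local file name, not a fixed artifact.
from llama_cpp import Llama

llm = Llama(
    model_path="./LFM2-700M-Q4_K_M.gguf",  # placeholder; use your downloaded file
    n_ctx=4096,                            # well under the 32,768-token maximum
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what an edge-deployed model is."}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])
```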

About LFM2

LFM2 is released as a family of post-trained checkpoints with 350M, 700M, and 1.2B parameters, all available in LM Studio. They provide the following key features for building AI-powered edge applications:

  • Fast training & inference – LFM2 achieves 3x faster training compared to its previous generation. It also benefits from 2x faster decode and prefill speed on CPU compared to Qwen3.
  • Best performance – LFM2 outperforms similarly-sized models across multiple benchmark categories, including knowledge, mathematics, instruction following, and multilingual capabilities.
  • New architecture – LFM2 is a new hybrid Liquid model with multiplicative gates and short convolutions (a minimal illustrative sketch follows this list).
  • Flexible deployment – LFM2 runs efficiently on CPU, GPU, and NPU hardware for flexible deployment on smartphones, laptops, or vehicles.
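
The architecture bullet above is high-level, so here is a purely illustrative sketch of what a gated short-convolution block can look like. This is not the actual LFM2 implementation; the layer sizes, kernel length, and wiring are assumptions chosen only to show the idea of a short depthwise convolution combined with a multiplicative gate.

```python
# Illustrative sketch only: a gated short-convolution block in the spirit of
# "multiplicative gates and short convolutions". NOT the real LFM2 layer;
# dimensions, kernel size, and wiring are assumptions for demonstration.
import torch
import torch.nn as nn

class GatedShortConvBlock(nn.Module):
    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        self.in_proj = nn.Linear(d_model, 2 * d_model)   # produces value + gate
        # depthwise 1-D convolution with a short kernel along the sequence axis
        self.conv = nn.Conv1d(
            d_model, d_model, kernel_size,
            padding=kernel_size - 1, groups=d_model,
        )
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        value, gate = self.in_proj(x).chunk(2, dim=-1)
        # convolve along the sequence dimension; trim the extra right-side
        # outputs so each position only sees current and past tokens
        v = self.conv(value.transpose(1, 2))[..., : x.size(1)].transpose(1, 2)
        return self.out_proj(v * torch.sigmoid(gate))    # multiplicative gate

if __name__ == "__main__":
    block = GatedShortConvBlock(d_model=64)
    out = block(torch.randn(2, 16, 64))
    print(out.shape)  # torch.Size([2, 16, 64])
```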

Model details

Due to their small size, Liquid recommends fine-tuning LFM2 models on narrow use cases to maximize performance. They are particularly well suited to agentic tasks, data extraction, RAG, creative writing, and multi-turn conversations. However, Liquid does not recommend using them for knowledge-intensive tasks or tasks that require programming skills.
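
For multi-turn use in particular, a checkpoint loaded in LM Studio can be driven through its OpenAI-compatible local server. The sketch below assumes the server is running on its default port (1234) and uses a hypothetical model identifier; substitute the name shown in your local model list.

```python
# Hedged sketch: a short multi-turn chat with an LFM2 checkpoint through
# LM Studio's OpenAI-compatible local server. Port and model name are
# assumptions; adjust them to your local setup.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

history = [{"role": "system", "content": "You extract key facts from user messages."}]

for user_turn in [
    "Our meeting moved to Thursday at 10:00 in room B.",
    "Which room did I say, and when is the meeting?",
]:
    history.append({"role": "user", "content": user_turn})
    reply = client.chat.completions.create(
        model="lfm2-1.2b",          # hypothetical identifier; use your local name
        messages=history,
        temperature=0.2,
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print(answer)
```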

Property          LFM2-350M              LFM2-700M              LFM2-1.2B
Parameters        354,483,968            742,489,344            1,170,340,608
Layers            16 (10 conv + 6 attn)  16 (10 conv + 6 attn)  16 (10 conv + 6 attn)
Context length    32,768 tokens          32,768 tokens          32,768 tokens
Vocabulary size   65,536                 65,536                 65,536
Precision         bfloat16               bfloat16               bfloat16
Training budget   10 trillion tokens     10 trillion tokens     10 trillion tokens
License           LFM Open License v1.0  LFM Open License v1.0  LFM Open License v1.0
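
As a rough cross-check of the download sizes listed above, the parameter counts in the table can be turned into back-of-the-envelope weight-storage estimates. The byte widths per dtype below are standard; the quantized GGUF files offered for download are naturally smaller than the bfloat16 figures.

```python
# Back-of-the-envelope sketch: estimate raw weight storage for each checkpoint
# from the parameter counts in the table above. Quantized GGUF downloads
# (see the Models list) are smaller than the bfloat16 figures.
PARAM_COUNTS = {
    "LFM2-350M": 354_483_968,
    "LFM2-700M": 742_489_344,
    "LFM2-1.2B": 1_170_340_608,
}
BYTES_PER_PARAM = {"bfloat16": 2.0, "int8": 1.0, "4-bit": 0.5}

for name, n_params in PARAM_COUNTS.items():
    sizes_mb = {
        dtype: n_params * width / 1e6  # megabytes
        for dtype, width in BYTES_PER_PARAM.items()
    }
    print(name, {dtype: f"{mb:.0f} MB" for dtype, mb in sizes_mb.items()})
```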

Supported languages: English, Arabic, Chinese, French, German, Japanese, Korean, and Spanish.

License

LFM2 models are released under a custom license, the LFM Open License v1.0 (lfm1.0).